A Unified Framework for Analyzing and Designing for Stationary Arterial Networks
DOT National Transportation Integrated Search
2017-05-17
This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...
Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S.; Breen, Lauren J.; Witt, Regina R.; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin
2016-01-01
Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, links them to the nurse, and traces their impact on personal resilience and workplace outcomes; its use has the potential to increase staff retention and quality of patient care. PMID:27242567
Control of Distributed Parameter Systems
1990-08-01
variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a… A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of…
Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence
ERIC Educational Resources Information Center
Phan, Huy Phuong
2008-01-01
The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…
A unified theoretical framework for mapping models for the multi-state Hamiltonian.
Liu, Jian
2016-11-28
We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an (F+1)-dimensional space, creation and annihilation operators are defined such that the (F+1)-dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
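To make the mapping idea concrete, here is a minimal sketch of the classical Meyer-Miller mapping Hamiltonian, the best-known member of the family the abstract mentions; the two-state matrix and phase-space values below are illustrative and not taken from the paper.

    import numpy as np

    def meyer_miller_energy(H, x, p, gamma=0.5):
        """Classical Meyer-Miller mapping energy for an F-state Hamiltonian
        matrix H: each discrete state n becomes a harmonic oscillator pair
        (x_n, p_n); gamma is the zero-point-energy parameter (1/2 originally)."""
        H = np.asarray(H, dtype=float)
        F = H.shape[0]
        E = 0.0
        for n in range(F):
            for m in range(F):
                zpe = gamma if n == m else 0.0
                E += H[n, m] * (0.5 * (x[n] * x[m] + p[n] * p[m]) - zpe)
        return E

    # Hypothetical two-state example; the population of state n is
    # (x_n**2 + p_n**2)/2 - gamma.
    H2 = [[0.0, 0.1], [0.1, 0.5]]
    x = np.array([1.0, 0.0])
    p = np.array([1.0, 0.0])   # state 0 occupied, state 1 empty
    print(meyer_miller_energy(H2, x, p))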
A theoretical formulation of wave-vortex interactions
NASA Technical Reports Server (NTRS)
Wu, J. Z.; Wu, J. M.
1989-01-01
A unified theoretical formulation for wave-vortex interaction, designated the '(omega, Pi) framework,' is presented. Based on the orthogonal decomposition of fluid dynamic interactions, the formulation can be used to study a variety of problems, including the interaction of a longitudinal (acoustic) wave and/or transverse (vortical) wave with a main vortex flow. Moreover, the formulation permits a unified treatment of wave-vortex interaction at various approximate levels, where the normal 'piston' process and tangential 'rubbing' process can be approximated differently.
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
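Since the two-parameter Sharma-Mittal family is the technical core here, a small sketch may help; the form below (order q, degree r) follows the standard definition, with Shannon, Rényi, and Tsallis entropies recovered as limits, and the probability vector is made up for illustration.

    import numpy as np

    def sharma_mittal(p, q, r, tol=1e-9):
        """Sharma-Mittal entropy S_{q,r}(p) in nats.
        Special cases: q,r -> 1 Shannon; r -> 1 Renyi; r = q Tsallis."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        shannon = -np.sum(p * np.log(p))
        if abs(q - 1.0) < tol and abs(r - 1.0) < tol:
            return shannon
        if abs(q - 1.0) < tol:                       # Gaussian-entropy limit
            return (np.exp((1.0 - r) * shannon) - 1.0) / (1.0 - r)
        log_sum = np.log(np.sum(p ** q))
        if abs(r - 1.0) < tol:                       # Renyi limit
            return log_sum / (1.0 - q)
        return (np.exp(((1.0 - r) / (1.0 - q)) * log_sum) - 1.0) / (1.0 - r)

    p = [0.5, 0.25, 0.25]
    print(sharma_mittal(p, 1, 1))   # Shannon entropy
    print(sharma_mittal(p, 2, 1))   # Renyi entropy of order 2
    print(sharma_mittal(p, 2, 2))   # Tsallis entropy of order 2 (= 1 - sum p^2)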
Family Systems Theory: A Unifying Framework for Codependence.
ERIC Educational Resources Information Center
Prest, Layne A.; Protinsky, Howard
1993-01-01
Considers addictions and construct of codependence. Offers critical review and synthesis of codependency literature, along with an intergenerational family systems framework for conceptualizing the relationship of the dysfunctional family to the construct of codependence. Presents theoretical basis for systemic clinical work and research in this…
A unified framework for approximation in inverse problems for distributed parameter systems
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
A theoretical framework is presented that can be used to treat approximation techniques for very general classes of parameter estimation problems involving distributed systems that are either first or second order in time. Using the approach developed, one can obtain both convergence and stability (continuous dependence of parameter estimates with respect to the observations) under very weak regularity and compactness assumptions on the set of admissible parameters. This unified theory can be used for many problems found in the recent literature and in many cases offers significant improvements to existing results.
Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J
2017-08-04
There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into the unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes.
A Unified Framework for Monetary Theory and Policy Analysis.
ERIC Educational Resources Information Center
Lagos, Ricardo; Wright, Randall
2005-01-01
Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…
A general modeling framework for describing spatially structured population dynamics
Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan
2017-01-01
Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles.
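A minimal toy version of the node-and-edge bookkeeping the abstract describes (illustrative only; the authors provide their own general code): nodes carry abundances, edges carry movement proportions, and each discrete time step applies local growth followed by movement.

    import numpy as np

    def step(n, growth, move):
        """One discrete time step.
        n     : abundance per node, shape (k,)
        growth: per-node multiplicative growth, shape (k,)
        move  : row-stochastic matrix, move[i, j] = fraction of node i's
                post-growth abundance that moves to node j."""
        return (n * growth) @ move

    n = np.array([100.0, 20.0, 0.0])          # three habitat nodes
    growth = np.array([1.1, 0.9, 1.0])
    move = np.array([[0.7, 0.3, 0.0],         # e.g. one seasonal migration leg
                     [0.0, 0.8, 0.2],
                     [0.1, 0.0, 0.9]])
    for t in range(5):
        n = step(n, growth, move)
    print(n)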
A Unified Theoretical Framework for Cognitive Sequencing
Savalia, Tejas; Shukla, Anuj; Bapi, Raju S.
2016-01-01
The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops (basal ganglia-frontal cortex and hippocampus-frontal cortex) mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks. PMID:27917146
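A hedged sketch of the two computational principles the abstract contrasts; all states, rewards, and parameters below are hypothetical, not the authors' implementation.

    import numpy as np

    def q_update(Q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
        """Model-free: incremental temporal-difference update of action values."""
        Q[s, a] += alpha * (reward + gamma * Q[s_next].max() - Q[s, a])

    def q_from_model(T, R, gamma=0.9, iters=100):
        """Model-based: action values recomputed from an explicit transition
        model T[s, a, s'] and reward table R[s, a] by value iteration."""
        Q = np.zeros_like(R, dtype=float)
        for _ in range(iters):
            V = Q.max(axis=1)
            Q = R + gamma * np.einsum('san,n->sa', T, V)
        return Q

    T = np.zeros((2, 2, 2))
    T[0, 0, 1] = T[0, 1, 0] = T[1, :, 1] = 1.0
    R = np.array([[0.0, 0.0], [1.0, 1.0]])
    print(q_from_model(T, R))          # values planned from the model
    Q = np.zeros((2, 2))
    q_update(Q, 0, 0, reward=1.0, s_next=1)
    print(Q)                           # one experience-driven increment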
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine… alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters… systematic fashion under a unifying theoretical and algorithmic framework. Keywords: optimization, complex networks, social network analysis, computational…
General System Theory: Toward a Conceptual Framework for Science and Technology Education for All.
ERIC Educational Resources Information Center
Chen, David; Stroup, Walter
1993-01-01
Suggests using general system theory as a unifying theoretical framework for science and technology education for all. Five reasons are articulated: the multidisciplinary nature of systems theory, the ability to engage complexity, the capacity to describe system dynamics, the ability to represent the relationship between microlevel and…
Making Learning Personally Meaningful: A New Framework for Relevance Research
ERIC Educational Resources Information Center
Priniski, Stacy J.; Hecht, Cameron A.; Harackiewicz, Judith M.
2018-01-01
Personal relevance goes by many names in the motivation literature, stemming from a number of theoretical frameworks. Currently these lines of research are being conducted in parallel with little synthesis across them, perhaps because there is no unifying definition of the relevance construct within which this research can be situated. In this…
Brainerd, C J; Reyna, V F; Howe, M L
2009-10-01
One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.
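A generic scaled forward algorithm for a hidden Markov model, the kind of measurement machinery the abstract describes; the three latent states and all probabilities below are hypothetical placeholders, not the authors' fitted recollection/reconstruction/familiarity parameters.

    import numpy as np

    def forward_loglik(pi, A, B, obs):
        """Log-likelihood of an observation sequence under an HMM.
        pi: initial state probabilities (k,); A: transitions (k, k);
        B: emission probabilities (k, m); obs: observed symbol indices."""
        alpha = pi * B[:, obs[0]]
        loglik = np.log(alpha.sum())
        alpha = alpha / alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            s = alpha.sum()
            loglik += np.log(s)
            alpha = alpha / s
        return loglik

    pi = np.array([0.5, 0.3, 0.2])
    A = np.array([[0.8, 0.1, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7]])
    B = np.array([[0.9, 0.1],        # columns: item recalled / not recalled
                  [0.6, 0.4],
                  [0.3, 0.7]])
    print(forward_loglik(pi, A, B, [0, 0, 1, 0]))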
Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Ocampo, Cesar; Senent, Juan S.; Williams, Jacob
2010-01-01
The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, is discussed.
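A minimal sketch of one building block such a system rests on: Keplerian two-body equations of motion under fixed-step RK4 propagation. Copernicus itself handles far more general force and maneuver models, and the orbit below is illustrative.

    import numpy as np

    MU_EARTH = 398600.4418  # km^3/s^2

    def two_body(state, mu=MU_EARTH):
        """state = [rx, ry, rz, vx, vy, vz]; returns its time derivative."""
        r, v = state[:3], state[3:]
        a = -mu * r / np.linalg.norm(r) ** 3
        return np.concatenate([v, a])

    def rk4_step(f, y, dt):
        k1 = f(y)
        k2 = f(y + 0.5 * dt * k1)
        k3 = f(y + 0.5 * dt * k2)
        k4 = f(y + dt * k3)
        return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Circular low-Earth orbit (v = sqrt(mu/r)), propagated for one minute.
    y = np.array([7000.0, 0.0, 0.0, 0.0, 7.546, 0.0])
    for _ in range(60):
        y = rk4_step(two_body, y, 1.0)
    print(y[:3])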
A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research
ERIC Educational Resources Information Center
Rohlfing, Ingo; Schneider, Carsten Q.
2018-01-01
The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elective affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…
Self-Efficacy: Toward a Unifying Theory of Behavioral Change
ERIC Educational Resources Information Center
Bandura, Albert
1977-01-01
This research presents an integrative theoretical framework to explain and to predict psychological changes achieved by different modes of treatment. This theory states that psychological procedures, whatever their form, alter the level and strength of "self-efficacy". (Editor/RK)
A quasi-likelihood approach to non-negative matrix factorization
Devarajan, Karthik; Cheung, Vincent C.K.
2017-01-01
A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
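For context, the classic multiplicative updates for NMF under the KL (Poisson) objective are one member of the family that the quasi-likelihood view generalizes; the sketch below is that standard baseline, not the authors' signal-dependent-noise algorithms, and the random matrix is illustrative.

    import numpy as np

    def nmf_kl(V, k, iters=200, seed=0, eps=1e-10):
        """Factor V (non-negative, n x m) as W @ H with the KL multiplicative
        updates; the KL objective corresponds to a Poisson noise model."""
        rng = np.random.default_rng(seed)
        n, m = V.shape
        W = rng.random((n, k)) + eps
        H = rng.random((k, m)) + eps
        for _ in range(iters):
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
        return W, H

    V = np.abs(np.random.default_rng(1).random((20, 12)))
    W, H = nmf_kl(V, k=3)
    print(np.linalg.norm(V - W @ H))   # reconstruction error after fitting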
The thermodynamics of dense granular flow and jamming
NASA Astrophysics Data System (ADS)
Lu, Shih Yu
The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework can apply to grains, foams and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation-of-state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy paves a crucial road map for a unifying theoretical framework in condensed matter, for example, ranging from sand to fire retardants to toothpaste.
Koay, Cheng Guan; Chang, Lin-Ching; Carew, John D; Pierpaoli, Carlo; Basser, Peter J
2006-09-01
A unifying theoretical and algorithmic framework for diffusion tensor estimation is presented. Theoretical connections among the least squares (LS) methods (linear least squares (LLS), weighted linear least squares (WLLS), nonlinear least squares (NLS) and their constrained counterparts) are established through their respective objective functions and the higher order derivatives of these objective functions, i.e., Hessian matrices. These theoretical connections provide new insights into designing efficient algorithms for NLS and constrained NLS (CNLS) estimation. Here, we propose novel algorithms of full Newton-type for the NLS and CNLS estimations, which are evaluated with Monte Carlo simulations and compared with the commonly used Levenberg-Marquardt method. The proposed methods have a lower percent of relative error in estimating the trace and lower reduced χ2 value than those of the Levenberg-Marquardt method. These results also demonstrate that the accuracy of an estimate, particularly in a nonlinear estimation problem, is greatly affected by the Hessian matrix. In other words, the accuracy of a nonlinear estimation is algorithm-dependent. Further, this study shows that the noise variance in diffusion weighted signals is orientation dependent when signal-to-noise ratio (SNR) is low…
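The LLS fit is the simplest member of the estimator family the abstract connects; a sketch follows, with the gradient scheme, b-value, and ground-truth tensor made up for illustration.

    import numpy as np

    def design_matrix(bvecs, bvals):
        """Rows map (Dxx, Dyy, Dzz, Dxy, Dxz, Dyz, ln S0) to the log-signal
        via ln S = ln S0 - b * g^T D g."""
        g = np.asarray(bvecs, float)
        b = np.asarray(bvals, float)
        quad = np.column_stack([
            g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
            2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2]])
        return np.hstack([-b[:, None] * quad, np.ones((len(b), 1))])

    def lls_tensor(signals, bvecs, bvals):
        """Linear least squares estimate of the diffusion tensor and S0."""
        X = design_matrix(bvecs, bvals)
        beta, *_ = np.linalg.lstsq(X, np.log(signals), rcond=None)
        Dxx, Dyy, Dzz, Dxy, Dxz, Dyz = beta[:6]
        D = np.array([[Dxx, Dxy, Dxz], [Dxy, Dyy, Dyz], [Dxz, Dyz, Dzz]])
        return D, np.exp(beta[6])

    D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])          # mm^2/s
    g = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 0, 0]], float)
    g /= np.linalg.norm(g, axis=1, keepdims=True)
    b = np.array([1000.0] * 6 + [0.0])                  # six DWIs plus one b=0
    S = 100.0 * np.exp(-b * np.einsum('ij,jk,ik->i', g, D_true, g))
    D_hat, S0 = lls_tensor(S, g, b)
    print(np.trace(D_hat), S0)                          # ~2.3e-3, ~100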
Unified framework for information integration based on information geometry
Oizumi, Masafumi; Amari, Shun-ichi
2016-01-01
Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
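The simplest instance of the construction the abstract describes: mutual information is the KL divergence between a joint distribution and its closest fully "disconnected" model, the product of the marginals. A toy two-variable sketch, with a made-up joint table:

    import numpy as np

    def mutual_information(P):
        """P: joint probability table for two discrete variables, shape (a, b).
        Returns KL(P || Px * Py) in nats."""
        P = np.asarray(P, float)
        px = P.sum(axis=1, keepdims=True)
        py = P.sum(axis=0, keepdims=True)
        Q = px @ py                  # disconnected (independent) approximation
        nz = P > 0
        return np.sum(P[nz] * np.log(P[nz] / Q[nz]))

    P = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
    print(mutual_information(P))     # ~0.19 nats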
Interprofessional Care and Collaborative Practice.
ERIC Educational Resources Information Center
Casto, R. Michael; And Others
This book provides materials for those learning about the dynamics, techniques, and potential of interprofessional collaboration in health care and human services professions. Eight case studies thread their way through most chapters to unify and illustrate the text. Part 1 addresses the theoretical framework that forms the basis for…
Spending on School Infrastructure: Does Money Matter?
ERIC Educational Resources Information Center
Crampton, Faith E.
2009-01-01
Purpose: The purpose of this study is to further develop an emerging thread of quantitative research that grounds investment in school infrastructure in a unified theoretical framework of investment in human, social, and physical capital. Design/methodology/approach: To answer the research question, what is the impact of investment in human,…
Food-web based unified model of macro- and microevolution.
Chowdhury, Debashish; Stauffer, Dietrich
2003-10-01
We incorporate the generic hierarchical architecture of food webs into a "unified" model that describes both micro- and macroevolution within a single theoretical framework. This model describes the microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinctions. The distribution of lifetimes of species follows an approximate power law only over a limited regime.
A unified data representation theory for network visualization, ordering and coarse-graining
Kovács, István A.; Mizsei, Réka; Csermely, Péter
2015-01-01
Representation of large data sets became a key question of many scientific disciplines in the last decade. Several approaches for network visualization, data ordering and coarse-graining accomplished this goal. However, there was no underlying theoretical framework linking these problems. Here we show an elegant, information-theoretic data representation approach as a unified solution of network visualization, data ordering and coarse-graining. The optimal representation is the hardest to distinguish from the original data matrix, as measured by the relative entropy. The representation of network nodes as probability distributions provides an efficient visualization method and, in one dimension, an ordering of network nodes and edges. Coarse-grained representations of the input network enable both efficient data compression and hierarchical visualization to achieve high quality representations of larger data sets. Our unified data representation theory will help the analysis of extensive data sets, by revealing the large-scale structure of complex networks in a comprehensible form. PMID:26348923
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
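A single-neuron sketch of the first of the three model types, Chen and Aihara's chaotic simulated annealing with decaying self-coupling; the parameters are illustrative textbook-style values, not the paper's N-queen settings. The self-coupling z decays each step, carrying the dynamics from a chaotic transient toward a stable fixed point.

    import numpy as np

    def csa_trace(steps=400, k=0.9, I0=0.65, z0=0.08, beta=0.01, eps=1/250):
        y, z, xs = 0.1, z0, []
        for _ in range(steps):
            x = 1.0 / (1.0 + np.exp(-y / eps))  # sigmoid neuron output
            y = k * y - z * (x - I0)            # internal state (no input term)
            z *= 1.0 - beta                     # annealing of the self-coupling
            xs.append(x)
        return xs

    xs = csa_trace()
    print(xs[:3])    # early outputs wander during the chaotic transient
    print(xs[-1])    # settles once z has decayed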
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
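A minimal sketch of the common orthogonal-pattern case: with Sylvester-Hadamard patterns the dual frame is simply the (scaled) transpose, which is the intuition frame theory generalizes to noisy or non-orthogonal pattern sets. The 4x4 "scene" is illustrative.

    import numpy as np

    def hadamard(n):
        """Sylvester construction: n must be a power of two."""
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.kron(H, np.array([[1, 1], [1, -1]]))
        return H

    n = 16                                  # 4 x 4 image, flattened
    x = np.arange(n, dtype=float)           # the scene
    A = hadamard(n)                         # one illumination pattern per row
    y = A @ x                               # bucket-detector measurements
    x_rec = (A.T @ y) / n                   # dual frame; exact since A.T A = n I
    print(np.allclose(x_rec, x))            # True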
ERIC Educational Resources Information Center
Dannhauser, Walter
1980-01-01
Described is an experiment designed to provide an experimental basis for a unifying point of view (utilizing a theoretical framework and chemistry laboratory experiments) for physical chemistry students. Three experiments are described: phase equilibrium, chemical equilibrium, and a test of the third law of thermodynamics. (Author/DS)
Conceptualizing the Suicide-Alcohol Relationship.
ERIC Educational Resources Information Center
Rogers, James R.
Despite the strong empirical evidence linking alcohol use across varying levels to suicidal behavior, the field lacks a unifying theoretical framework in this area. The concept of alcohol-induced myopia has been proposed to explain the varied effects of alcohol on the behaviors of individuals who drink. The term "alcohol myopia" refers to its…
Evolution of spatially structured host-parasite interactions.
Lion, S; Gandon, S
2015-01-01
Spatial structure has dramatic effects on the demography and the evolution of species. A large variety of theoretical models have attempted to understand how local dispersal may shape the coevolution of interacting species such as host-parasite interactions. The lack of a unifying framework is a serious impediment for anyone willing to understand current theory. Here, we review previous theoretical studies in the light of a single epidemiological model that allows us to explore the effects of both host and parasite migration rates on the evolution and coevolution of various life-history traits. We discuss the impact of local dispersal on parasite virulence, various host defence strategies and local adaptation. Our analysis shows that evolutionary and coevolutionary outcomes crucially depend on the details of the host-parasite life cycle and on which life-history trait is involved in the interaction. We also discuss experimental studies that support the effects of spatial structure on the evolution of host-parasite interactions. This review highlights major similarities between some theoretical results, but it also reveals an important gap between evolutionary and coevolutionary models. We discuss possible ways to bridge this gap within a more unified framework that would reconcile spatial epidemiology, evolution and coevolution.
Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A.
2018-01-01
The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Generated artificial EEGs with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. It is also revealed that realistic volume conductor models improve the performance of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the real 89 resting-state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance. PMID:29780302
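The abstract's simplest special case, the average reference, is just a centering projection applied to the channels-by-time data matrix; a sketch follows (the rREST estimator additionally needs a head-model lead field and the noise-to-signal regularization parameter, which are not modeled here).

    import numpy as np

    def average_reference(V):
        """V: channels x time. Re-reference by removing the instantaneous
        mean across channels, i.e. apply C = I - (1/n) * ones."""
        n_ch = V.shape[0]
        C = np.eye(n_ch) - np.ones((n_ch, n_ch)) / n_ch
        return C @ V

    rng = np.random.default_rng(0)
    V = rng.standard_normal((32, 1000))    # 32 channels, arbitrary reference
    V_ar = average_reference(V)
    print(np.abs(V_ar.sum(axis=0)).max())  # ~0: channels now sum to zero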
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
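A hedged sketch of the core construction in the abstract's terms (notation mine, not verbatim from the paper): writing \gamma for the state, S for the entropy functional, C_i for the conserved functionals, and \hat{G} for the metric that generalizes the Onsager resistivity, the SEA evolution law can be stated as

    \[
      \frac{\mathrm{d}\gamma}{\mathrm{d}t}
        = \frac{1}{\tau}\,\hat{G}^{-1}\!\left(\nabla S - \sum_i \lambda_i\,\nabla C_i\right),
    \]

with the multipliers \lambda_i fixed by the conservation conditions \mathrm{d}C_i/\mathrm{d}t = 0. The entropy rate then reduces to \mathrm{d}S/\mathrm{d}t = \tau^{-1}(\nabla S - \sum_i \lambda_i \nabla C_i)^{\mathsf{T}} \hat{G}^{-1} (\nabla S - \sum_i \lambda_i \nabla C_i) \ge 0 for positive-definite \hat{G}, which is the built-in second-law consistency the abstract emphasizes.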
Assessing the formability of metallic sheets by means of localized and diffuse necking models
NASA Astrophysics Data System (ADS)
Comşa, Dan-Sorin; Lăzărescu, Lucian; Banabic, Dorel
2016-10-01
The main objective of the paper is to elaborate a unified framework that allows the theoretical assessment of sheet metal formability. Hill's localized necking model and the Extended Maximum Force Criterion proposed by Mattiasson, Sigvant, and Larsson have been selected for this purpose. Both models are thoroughly described together with their solution procedures. A comparison of the theoretical predictions with experimental data referring to the formability of a DP600 steel sheet is also presented by the authors.
Cosmology and unified gauge theory
NASA Astrophysics Data System (ADS)
Oraifeartaigh, L.
1981-09-01
Theoretical points in common between cosmology and unified gauge theory (UGT) are reviewed, with attention given to areas of each which have proven useful for the other. The underlying principles for both theoretical frameworks are described, noting the differences in scale, i.e., 10 to the 25th cm in cosmology and 10 to the -15th cm for UGT. Cosmology has produced bounds on the number of existing neutrino species, and also on the mass of neutrinos, two factors of interest in particle physics. Electrons, protons, and neutrinos, having been spawned from the same massive leptons, each composed of three quarks, have been predicted to be present in equal numbers in the Universe by UGT, in line with the necessities of cosmology. The Grand UGT also suggests specific time scales for proton decay, thus accounting for the observed baryon asymmetry.
The Administrator Training Program. A Model of Educational Leadership.
ERIC Educational Resources Information Center
Funderburg, Jean; And Others
This paper describes the Administrator Training Program (ATP), a joint venture between San Jose Unified School District and Stanford University. A discussion of the ATP's theoretical framework is followed by an outline of the structure and content of the program and a review of the ATP outcomes. Then the generic elements of the ATP model are…
Impact of Beads and Drops on a Repellent Solid Surface: A Unified Description
NASA Astrophysics Data System (ADS)
Arora, S.; Fromental, J.-M.; Mora, S.; Phou, Ty; Ramos, L.; Ligoure, C.
2018-04-01
We investigate freely expanding sheets formed by ultrasoft gel beads, and liquid and viscoelastic drops, produced by the impact of the bead or drop on a silicon wafer covered with a thin layer of liquid nitrogen that suppresses viscous dissipation thanks to an inverse Leidenfrost effect. Our experiments show a unified behavior for the impact dynamics that holds for solids, liquids, and viscoelastic fluids and that we rationalize by properly taking into account elastocapillary effects. In this framework, the classical impact dynamics of solids and liquids, as far as viscous dissipation is negligible, appears as the asymptotic limits of a universal theoretical description. A novel material-dependent characteristic velocity that includes both capillary and bulk elasticity emerges from this unified description of the physics of impact.
Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.
2016-01-01
Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947
Theory of the Origin, Evolution, and Nature of Life
Andrulis, Erik D.
2011-01-01
Life is an inordinately complex unsolved puzzle. Despite significant theoretical progress, experimental anomalies, paradoxes, and enigmas have revealed paradigmatic limitations. Thus, the advancement of scientific understanding requires new models that resolve fundamental problems. Here, I present a theoretical framework that economically fits evidence accumulated from examinations of life. This theory is based upon a straightforward and non-mathematical core model and proposes unique yet empirically consistent explanations for major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations. The theoretical framework unifies the macrocosmic and microcosmic realms, validates predicted laws of nature, and solves the puzzle of the origin and evolution of cellular life in the universe. PMID:25382118
Parrott, Dominic J.
2008-01-01
Theory and research on antigay aggression has identified different motives that facilitate aggression based on sexual orientation. However, the individual and situational determinants of antigay aggression associated with these motivations have yet to be organized within a single theoretical framework. This limits researchers’ ability to organize existing knowledge, link that knowledge with related aggression theory, and guide the application of new findings. To address these limitations, this article argues for the use of an existing conceptual framework to guide thinking and generate new research in this area of study. Contemporary theories of antigay aggression, and empirical support for these theories, are reviewed and interpreted within the unifying framework of the general aggression model [Anderson, C.A. & Bushman, B.J. (2002). Human aggression. Annual Review of Psychology, 53, 27–51.]. It is concluded that this conceptual framework will facilitate investigation of individual and situational risk factors that may contribute to antigay aggression and guide development of individual-level intervention. PMID:18355952
Olbert, Charles M; Gala, Gary J; Tupler, Larry A
2014-05-01
Heterogeneity within psychiatric disorders is both theoretically and practically problematic: For many disorders, it is possible for 2 individuals to share very few or even no symptoms in common yet share the same diagnosis. Polythetic diagnostic criteria have long been recognized to contribute to this heterogeneity, yet no unified theoretical understanding of the coherence of symptom criteria sets currently exists. A general framework for analyzing the logical and mathematical structure, coherence, and diversity of Diagnostic and Statistical Manual diagnostic categories (DSM-5 and DSM-IV-TR) is proposed, drawing from combinatorial mathematics, set theory, and information theory. Theoretical application of this framework to 18 diagnostic categories indicates that in most categories, 2 individuals with the same diagnosis may share no symptoms in common, and that any 2 theoretically possible symptom combinations will share on average less than half their symptoms. Application of this framework to 2 large empirical datasets indicates that patients who meet symptom criteria for major depressive disorder and posttraumatic stress disorder tend to share approximately three-fifths of symptoms in common. For both disorders in each of the datasets, pairs of individuals who shared no common symptoms were observed. Any 2 individuals with either diagnosis were unlikely to exhibit identical symptomatology. The theoretical and empirical results stemming from this approach have substantive implications for etiological research into, and measurement of, psychiatric disorders.
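The counting exercise behind the combinatorial analysis can be sketched directly, simplified here to a purely polythetic "at least m of n symptoms" rule (real DSM criteria add required core symptoms, duration, and impairment clauses, so these numbers are illustrative).

    import math

    def ways_to_qualify(n, m):
        """Number of distinct symptom profiles satisfying an m-of-n rule."""
        return sum(math.comb(n, k) for k in range(m, n + 1))

    def zero_overlap_possible(n, m):
        """Two diagnosed individuals can share no symptoms iff 2*m <= n."""
        return 2 * m <= n

    print(ways_to_qualify(9, 5))         # 256 qualifying profiles for 5-of-9
    print(zero_overlap_possible(9, 5))   # False: any two such profiles overlap
    print(zero_overlap_possible(19, 6))  # True for a wider, disaggregated set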
A Unified Account of Perceptual Layering and Surface Appearance in Terms of Gamut Relativity
Vladusich, Tony; McDonnell, Mark D.
2014-01-01
When we look at the world—or a graphical depiction of the world—we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance—based on a broader theoretical framework called gamut relativity—that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications. PMID:25402466
Kim, Sung-Cheol; Wunsch, Benjamin H; Hu, Huan; Smith, Joshua T; Austin, Robert H; Stolovitzky, Gustavo
2017-06-27
Deterministic lateral displacement (DLD) is a technique for size fractionation of particles in continuous flow that has shown great potential for biological applications. Several theoretical models have been proposed, but experimental evidence has demonstrated that a rich class of intermediate migration behavior exists, which is not predicted. We present a unified theoretical framework to infer the path of particles in the whole array on the basis of trajectories in a unit cell. This framework explains many of the unexpected particle trajectories reported and can be used to design arrays for even nanoscale particle fractionation. We performed experiments that verify these predictions and used our model to develop a condenser array that achieves full particle separation with a single fluidic input.
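For contrast with the unit-cell trajectory model the abstract proposes, the widely used earlier-generation design heuristic is Davis's empirical critical-diameter formula; the gap and row-shift values below are illustrative.

    # Davis's empirical rule for DLD arrays: particles above Dc displace
    # ("bump") along the post array, smaller ones zigzag with the flow.

    def dld_critical_diameter(gap_um, row_shift_fraction):
        """Dc = 1.4 * g * eps**0.48, with g the post gap and eps the row
        shift fraction (1/N for an array of periodicity N)."""
        return 1.4 * gap_um * row_shift_fraction ** 0.48

    print(dld_critical_diameter(gap_um=10.0, row_shift_fraction=0.1))  # ~4.6 um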
ERIC Educational Resources Information Center
Perla, Rocco J.; Carifio, James
2011-01-01
Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporate and formalize both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…
ERIC Educational Resources Information Center
Jagodzinski, Wolfgang
2010-01-01
This paper investigates the influence of the economic, social, and cultural variables on life satisfaction in Asia and Europe. The second section sets a unifying theoretical framework for all three domains by defining life satisfaction as a function of aspirations and expectations which in turn are affected by micro- and macro-level variables. On…
Temporal cognition: Connecting subjective time to perception, attention, and memory.
Matthews, William J; Meck, Warren H
2016-08-01
Time is a universal psychological dimension, but time perception has often been studied and discussed in relative isolation. Increasingly, researchers are searching for unifying principles and integrated models that link time perception to other domains. In this review, we survey the links between temporal cognition and other psychological processes. Specifically, we describe how subjective duration is affected by nontemporal stimulus properties (perception), the allocation of processing resources (attention), and past experience with the stimulus (memory). We show that many of these connections instantiate a "processing principle," according to which perceived time is positively related to perceptual vividity and the ease of extracting information from the stimulus. This empirical generalization generates testable predictions and provides a starting-point for integrated theoretical frameworks. By outlining some of the links between temporal cognition and other domains, and by providing a unifying principle for understanding these effects, we hope to encourage time-perception researchers to situate their work within broader theoretical frameworks, and that researchers from other fields will be inspired to apply their insights, techniques, and theorizing to improve our understanding of the representation and judgment of time. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rusek, Marian; Orlowski, Arkadiusz
2005-04-01
The dynamics of small (≤55 atoms) argon clusters ionized by an intense femtosecond laser pulse is studied using a time-dependent Thomas-Fermi model. The resulting Bloch-like hydrodynamic equations are solved numerically using the smooth particle hydrodynamics method without the necessity of grid simulations. As follows from recent experiments, absorption of radiation and subsequent ionization of clusters observed in the short-wavelength laser frequency regime (98 nm) differs considerably from that in the optical spectral range (800 nm). Our theoretical approach provides a unified framework for treating these very different frequency regimes and allows for a deeper understanding of the underlying cluster explosion mechanisms. The results of our analysis following from extensive numerical simulations presented in this paper are compared both with experimental findings and with predictions of other theoretical models.
NASA Astrophysics Data System (ADS)
Fukuda, Jun'ichi; Johnson, Kaj M.
2010-06-01
We present a unified theoretical framework and solution method for probabilistic, Bayesian inversions of crustal deformation data. The inversions involve multiple data sets with unknown relative weights, model parameters that are related linearly or non-linearly through theoretic models to observations, prior information on model parameters and regularization priors to stabilize underdetermined problems. To efficiently handle non-linear inversions in which some of the model parameters are linearly related to the observations, this method combines both analytical least-squares solutions and a Monte Carlo sampling technique. In this method, model parameters that are linearly and non-linearly related to observations, relative weights of multiple data sets and relative weights of prior information and regularization priors are determined in a unified Bayesian framework. In this paper, we define the mixed linear-non-linear inverse problem, outline the theoretical basis for the method, provide a step-by-step algorithm for the inversion, validate the inversion method using synthetic data and apply the method to two real data sets. We apply the method to inversions of multiple geodetic data sets with unknown relative data weights for interseismic fault slip and locking depth. We also apply the method to the problem of estimating the spatial distribution of coseismic slip on faults with unknown fault geometry, relative data weights and smoothing regularization weight.
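The central computational trick, pairing analytic least squares for the linearly entering parameters with Monte Carlo sampling for the nonlinear ones, is easy to illustrate. The sketch below collapses the paper's other ingredients (multiple data sets with unknown weights, regularization priors) to that core idea; the model, parameter names, and noise level are assumptions, not the geodetic models of the paper.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: d = G(theta) m + noise, with one nonlinear parameter theta
# and linear parameters m.
x = np.linspace(0.0, 1.0, 50)
def design(theta):
    return np.column_stack([np.exp(-theta * x), np.ones_like(x)])

theta_true, m_true, sigma = 3.0, np.array([2.0, 0.5]), 0.05
d = design(theta_true) @ m_true + sigma * rng.normal(size=x.size)

def profile_loglike(theta):
    # Solve the linear sub-problem analytically by least squares given theta.
    G = design(theta)
    m_hat, *_ = np.linalg.lstsq(G, d, rcond=None)
    r = d - G @ m_hat
    return -0.5 * np.dot(r, r) / sigma**2

# Random-walk Metropolis over the nonlinear parameter only (flat prior, theta > 0).
theta, ll = 1.0, profile_loglike(1.0)
chain = []
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    ll_prop = profile_loglike(prop) if prop > 0 else -np.inf
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    chain.append(theta)
print("posterior mean of theta:", np.mean(chain[1000:]))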
Discrete shearlet transform: faithful digitization concept and its applications
NASA Astrophysics Data System (ADS)
Lim, Wang-Q.
2011-09-01
Over the past years, various representation systems which sparsely approximate functions governed by anisotropic features such as edges in images have been proposed. Alongside the theoretical development of these systems, algorithmic realizations of the associated transforms were provided. However, one of the most common shortcomings of these frameworks is the lack of a unified treatment of the continuum and digital worlds, i.e., allowing a digital theory to be a natural digitization of the continuum theory. Shearlets were introduced as means to sparsely encode anisotropic singularities of multivariate data while providing a unified treatment of the continuous and digital realm. In this paper, we introduce a discrete framework which allows a faithful digitization of the continuum domain shearlet transform based on compactly supported shearlets. Finally, we show numerical experiments demonstrating the potential of the discrete shearlet transform in several image processing applications.
Some characteristics of supernetworks based on unified hybrid network theory framework
NASA Astrophysics Data System (ADS)
Liu, Qiang; Fang, Jin-Qing; Li, Yong
Compared with single complex networks, supernetworks are closer to the real world in some ways, and they have recently become a major research focus in network science. Some progress has been made in the study of supernetworks, but the theoretical methods and complex-network characteristics of supernetwork models still need further exploration. In this paper, we propose three kinds of supernetwork models with three layers based on the unified hybrid network theory framework (UHNTF), and introduce preferential and random linking, respectively, between the upper and lower layers. We then compare the topological characteristics of single networks with those of the supernetwork models. In order to analyze the influence of the interlayer edges on network characteristics, the cross-degree is defined as a new important parameter. Several interesting new phenomena are found, and the results imply that these supernetwork models have reference value and application potential.
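The cross-degree defined in the abstract (the number of interlayer edges attached to a node) is simple to compute. The sketch below contrasts preferential versus random interlayer linking on a toy two-layer system; the UHNTF generative details are not reproduced here, and the graph sizes and link counts are assumptions.

import random
import networkx as nx

random.seed(1)
n = 100
upper = nx.barabasi_albert_graph(n, 2, seed=1)   # upper layer (scale-free)
# lower layer represented simply by node indices 0..n-1

def interlayer_edges(mode, k=150):
    # Draw k interlayer links, either preferentially by upper-layer degree
    # or uniformly at random.
    nodes, degs = zip(*upper.degree())
    edges = []
    for _ in range(k):
        if mode == "preferential":
            u = random.choices(nodes, weights=degs)[0]
        else:
            u = random.choice(nodes)
        edges.append((u, random.randrange(n)))
    return edges

for mode in ("preferential", "random"):
    cross_degree = dict.fromkeys(range(n), 0)    # interlayer edges per upper node
    for u, _ in interlayer_edges(mode):
        cross_degree[u] += 1
    print(mode, "max cross-degree:", max(cross_degree.values()))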
Single atom emission in an optical resonator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childs, J.J.; An, K.; Dasari, R.R.
A single atom coupled to a single mode of a radiation field is a fundamental system for studying the interaction of radiation with matter. The study of such systems has come to be called cavity quantum electrodynamics (QED). Atoms coupled to a single mode of a resonator have been studied experimentally and theoretically in several interesting regimes since this basic system was first considered theoretically by Jaynes and Cummings. The objective of the present chapter is to provide a theoretical framework and present a unifying picture of the various phenomena which can occur in such a system. 35 refs., 11 figs.
Konik, Robert M.; Sfeir, Matthew Y.; Misewich, James A.
2015-02-17
We demonstrate that a non-perturbative framework for the treatment of the excitations of single walled carbon nanotubes based upon a field theoretic reduction is able to accurately describe experimental observations of the absolute values of excitonic energies. This theoretical framework yields a simple scaling function from which the excitonic energies can be read off. This scaling function is primarily determined by a single parameter, the charge Luttinger parameter of the tube, which is in turn a function of the tube chirality, dielectric environment, and the tube's dimensions, thus expressing disparate influences on the excitonic energies in a unified fashion. As a result, we test this theory explicitly on the data reported in [NanoLetters 5, 2314 (2005)] and [Phys. Rev. B 82, 195424 (2010)] and so demonstrate the method works over a wide range of reported excitonic spectra.
Research directions in large scale systems and decentralized control
NASA Technical Reports Server (NTRS)
Tenney, R. R.
1980-01-01
Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.
NASA Astrophysics Data System (ADS)
Herrmann, K.
2009-11-01
Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, two very similar approaches have evolved during the last years, one in so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework. The latter allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models - with similar flexibility in terms of skewness and kurtosis - lead to very similar results.
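A minimal sketch of the GARCH(1,1) backbone that both approaches generalize, under Gaussian innovations. The entropy-based variants replace the innovation density (e.g., with a maximum-entropy density such as Kapur's), which this illustration does not implement; all parameter values are assumptions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate a Gaussian GARCH(1,1) series: r_t = sqrt(h_t) z_t,
# h_t = omega + alpha r_{t-1}^2 + beta h_{t-1}.
omega, alpha, beta = 0.1, 0.1, 0.8
n = 2000
r, h = np.zeros(n), np.zeros(n)
h[0] = omega / (1 - alpha - beta)            # unconditional variance
r[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, n):
    h[t] = omega + alpha * r[t-1]**2 + beta * h[t-1]
    r[t] = np.sqrt(h[t]) * rng.normal()

def neg_loglike(params):
    w, a, b = params
    ht = np.empty(n)
    ht[0] = r.var()
    for t in range(1, n):
        ht[t] = w + a * r[t-1]**2 + b * ht[t-1]
    return 0.5 * np.sum(np.log(2 * np.pi * ht) + r**2 / ht)

res = minimize(neg_loglike, x0=[0.05, 0.05, 0.85],
               bounds=[(1e-6, None), (0.0, 0.999), (0.0, 0.999)])
print("estimated (omega, alpha, beta):", res.x)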
Fingelkurts, Andrew A; Fingelkurts, Alexander A; Neves, Carlos F H
2012-01-05
Instead of using low-level neurophysiology mimicking and exploratory programming methods commonly used in the machine consciousness field, the hierarchical operational architectonics (OA) framework of brain and mind functioning proposes an alternative conceptual-theoretical framework as a new direction in the area of model-driven machine (robot) consciousness engineering. The unified brain-mind theoretical OA model explicitly captures (though in an informal way) the basic essence of brain functional architecture, which indeed constitutes a theory of consciousness. The OA describes the neurophysiological basis of the phenomenal level of brain organization. In this context the problem of producing man-made "machine" consciousness and "artificial" thought is a matter of duplicating all levels of the operational architectonics hierarchy (with its inherent rules and mechanisms) found in the brain electromagnetic field. We hope that the conceptual-theoretical framework described in this paper will stimulate the interest of mathematicians and/or computer scientists to abstract and formalize principles of hierarchy of brain operations which are the building blocks for phenomenal consciousness and thought. Copyright © 2010 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Singh, Gurmak; Hardaker, Glenn
2017-01-01
Using Giddens' theory of structuration as a theoretical framework, this paper outlines how five prominent United Kingdom universities aimed to integrate top-down and bottom-up approaches to the adoption and diffusion of e-learning. The aim of this paper is to examine the major challenges that arise from the convergence of bottom-up perspectives…
The formal Darwinism project: a mid-term report.
Grafen, A
2007-07-01
For 8 years I have been pursuing in print an ambitious and at times highly technical programme of work, the 'Formal Darwinism Project', whose essence is to underpin and formalize the fitness optimization ideas used by behavioural ecologists, using a new kind of argument linking the mathematics of motion and the mathematics of optimization. The value of the project is to give stronger support to current practices while at the same time sharpening theoretical ideas and suggesting principled resolutions of some untidy areas, for example, how to define fitness. The aim is also to unify existing free-standing theoretical structures, such as inclusive fitness theory, Evolutionary Stable Strategy (ESS) theory and bet-hedging theory. The 40-year-old misunderstanding over the meaning of fitness optimization between mathematicians and biologists is explained. Most of the elements required for a general theory have now been implemented, but not together in the same framework, and 'general time' remains to be developed and integrated with the other elements to produce a final unified theory of neo-Darwinian natural selection.
Leaving behind our preparadigmatic past: Professional psychology as a unified clinical science.
Melchert, Timothy P
2016-09-01
The behavioral and neurosciences have made remarkable progress recently in advancing the scientific understanding of human psychology. Though research in many areas is still in its early stages, knowledge of many psychological processes is now firmly grounded in experimental tests of falsifiable theories and supports a unified, paradigmatic understanding of human psychology that is thoroughly consistent with the rest of the natural sciences. This new body of knowledge poses critical questions for professional psychology, which still often relies on the traditional theoretical orientations and other preparadigmatic practices for guiding important aspects of clinical education and practice. This article argues that professional psychology needs to systematically transition to theoretical frameworks and a curriculum that are based on an integrated scientific understanding of human psychology. Doing so would be of historic importance for the field and would result in major changes to professional psychology education and practice. It would also allow the field to emerge as a true clinical science. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Connecting single cell to collective cell behavior in a unified theoretical framework
NASA Astrophysics Data System (ADS)
George, Mishel; Bullo, Francesco; Campàs, Otger
Collective cell behavior is an essential part of tissue and organ morphogenesis during embryonic development, as well as of various disease processes, such as cancer. In contrast to many in vitro studies of collective cell migration, most cases of in vivo collective cell migration involve rather small groups of cells, with large sheets of migrating cells being less common. The vast majority of theoretical descriptions of collective cell behavior focus on large numbers of cells, but fail to accurately capture the dynamics of small groups of cells. Here we introduce a low-dimensional theoretical description that successfully captures single cell migration, cell collisions, collective dynamics in small groups of cells, and force propagation during sheet expansion, all within a common theoretical framework. Our description is derived from first principles and also includes key phenomenological aspects of cell migration that control the dynamics of traction forces. Among other results, we explain the counter-intuitive observations that pairs of cells repel each other upon collision while they behave in a coordinated manner within larger clusters.
Statistical mechanics framework for static granular matter.
Henkes, Silke; Chakraborty, Bulbul
2009-06-01
The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, where we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, Point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.
Buckingham, C D; Adams, A
2000-10-01
This is the second of two linked papers exploring decision making in nursing. The first paper, 'Classifying clinical decision making: a unifying approach' investigated difficulties with applying a range of decision-making theories to nursing practice. This is due to the diversity of terminology and theoretical concepts used, which militate against nurses being able to compare the outcomes of decisions analysed within different frameworks. It is therefore problematic for nurses to assess how good their decisions are, and where improvements can be made. However, despite the range of nomenclature, it was argued that there are underlying similarities between all theories of decision processes and that these should be exposed through integration within a single explanatory framework. A proposed solution was to use a general model of psychological classification to clarify and compare terms, concepts and processes identified across the different theories. The unifying framework of classification was described and this paper operationalizes it to demonstrate how different approaches to clinical decision making can be re-interpreted as classification behaviour. Particular attention is focused on classification in nursing, and on re-evaluating heuristic reasoning, which has been particularly prone to theoretical and terminological confusion. Demonstrating similarities in how different disciplines make decisions should promote improved multidisciplinary collaboration and a weakening of clinical elitism, thereby enhancing organizational effectiveness in health care and nurses' professional status. This is particularly important as nurses' roles continue to expand to embrace elements of managerial, medical and therapeutic work. Analysing nurses' decisions as classification behaviour will also enhance clinical effectiveness, and assist in making nurses' expertise more visible. In addition, the classification framework explodes the myth that intuition, traditionally associated with nurses' decision making, is less rational and scientific than other approaches.
Electric and Magnetic Interactions
NASA Astrophysics Data System (ADS)
Chabay, Ruth W.; Sherwood, Bruce A.
1994-08-01
The curriculum has been restructured so that students will have the necessary fundamental understanding of charges and fields before going on to more complex issues. Qualitative reasoning and quantitative analysis are discussed equally in order to provide a meaningful conceptual framework within which the quantitative work makes more sense. Atomic-level analysis is stressed and electrostatics and circuits are unified. Desktop experiments can be conducted at home or in the classroom and are tightly integrated with the theoretical treatment.
Ghosh, Avijit; Scott, Dennis O; Maurer, Tristan S
2014-02-14
In this work, we provide a unified theoretical framework describing how drug molecules can permeate across membranes in neutral and ionized forms for unstirred in vitro systems. The analysis provides a self-consistent basis for the origin of the unstirred water layer (UWL) within the Nernst-Planck framework in the fully unstirred limit and further provides an accounting mechanism based simply on the bulk aqueous solvent diffusion constant of the drug molecule. Our framework makes no new assumptions about the underlying physics of molecular permeation. We hold simply that Nernst-Planck is a reasonable approximation at low concentrations and all physical systems must conserve mass. The applicability of the derived framework has been examined both with respect to the effect of stirring and externally applied voltages to measured permeability. The analysis contains data for 9 compounds extracted from the literature representing a range of permeabilities and aqueous diffusion coefficients. Applicability with respect to ionized permeation is examined using literature data for the permanently charged cation, crystal violet, providing a basis for the underlying mechanism for ionized drug permeation for this molecule as being due to mobile counter-current flow. Copyright © 2013 Elsevier B.V. All rights reserved.
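The textbook series-resistance picture behind the unstirred water layer (UWL) is a useful reference point here: the apparent permeability composes the membrane step and the aqueous diffusion step as 1/P_app = 1/P_m + h/D_aq. The sketch below uses that standard relation with illustrative values; the paper's Nernst-Planck treatment derives the UWL contribution (and ionized permeation) in the fully unstirred limit rather than assuming a fixed layer thickness.

def apparent_permeability(P_m, D_aq, h_uwl):
    # Series-resistance composition of the membrane and UWL steps:
    # 1/P_app = 1/P_m + h_uwl / D_aq
    return 1.0 / (1.0 / P_m + h_uwl / D_aq)

# Assumed illustrative values: P_m in cm/s, D_aq in cm^2/s, h in cm.
P_m, D_aq = 5e-4, 8e-6
for h in (1e-2, 1e-3, 1e-4):     # stirring thins the UWL
    print(f"h = {h:.0e} cm -> P_app = {apparent_permeability(P_m, D_aq, h):.2e} cm/s")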
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
Towards a unified theory of health-disease: II. Holopathogenesis
Almeida-Filho, Naomar
2014-01-01
This article presents a systematic framework for modeling several classes of illness-sickness-disease named as Holopathogenesis. Holopathogenesis is defined as processes of over-determination of diseases and related conditions taken as a whole, comprising selected facets of the complex object Health. First, a conceptual background of Holopathogenesis is presented as a series of significant interfaces (biomolecular-immunological, physiopathological-clinical, epidemiological-ecosocial). Second, propositions derived from Holopathogenesis are introduced in order to allow drawing the disease-illness-sickness complex as a hierarchical network of networks. Third, a formalization of intra- and inter-level correspondences, over-determination processes, effects and links of Holopathogenesis models is proposed. Finally, the Holopathogenesis frame is evaluated as a comprehensive theoretical pathology taken as a preliminary step towards a unified theory of health-disease. PMID:24897040
2015-03-01
a hotel and a hospital. 2. Event handler for emergency policies (item 2 above): this has been implemented in two UG projects, one project developed a...Workshop on Logical and Semantic Frameworks, with Applications, Brasilia, Brazil, September 2014. Electronic Notes in Theoretical Computer Science (to...Brasilia, Brazil, September 2014, 2015. [3] S. Barker. The next 700 access control models or a unifying meta-model? In SACMAT 2009, 14th ACM Symposium on
Wang, Guoli; Ebrahimi, Nader
2014-01-01
Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
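A minimal sketch of the KL/Poisson special case of the multiplicative updates (the classical Lee-Seung rules) that the paper's Renyi-divergence algorithm generalizes with an extra order parameter; the dimensions and data below are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
V = rng.poisson(5.0, size=(40, 30)).astype(float)   # nonnegative count data
k, eps = 4, 1e-12
W = rng.random((40, k))
H = rng.random((k, 30))

for _ in range(200):
    # Multiplicative updates for the generalized KL (Poisson) divergence.
    WH = W @ H + eps
    H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
    WH = W @ H + eps
    W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)

WH = W @ H
kl = np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH)
print("KL/Poisson divergence after 200 updates:", kl)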
Equivalent formulations of “the equation of life”
NASA Astrophysics Data System (ADS)
Ao, Ping
2014-07-01
Motivated by progress in theoretical biology a recent proposal on a general and quantitative dynamical framework for nonequilibrium processes and dynamics of complex systems is briefly reviewed. It is nothing but the evolutionary process discovered by Charles Darwin and Alfred Wallace. Such general and structured dynamics may be tentatively named “the equation of life”. Three equivalent formulations are discussed, and it is also pointed out that such a quantitative dynamical framework leads naturally to the powerful Boltzmann-Gibbs distribution and the second law in physics. In this way, the equation of life provides a logically consistent foundation for thermodynamics. This view clarifies a particular outstanding problem and further suggests a unifying principle for physics and biology.
Wall, Matthew A; Harmsen, Stefan; Pal, Soumik; Zhang, Lihua; Arianna, Gianluca; Lombardi, John R; Drain, Charles Michael; Kircher, Moritz F
2017-06-01
Gold nanoparticles have unique properties that are highly dependent on their shape and size. Synthetic methods that enable precise control over nanoparticle morphology currently require shape-directing agents such as surfactants or polymers that force growth in a particular direction by adsorbing to specific crystal facets. These auxiliary reagents passivate the nanoparticles' surface, and thus decrease their performance in applications like catalysis and surface-enhanced Raman scattering. Here, a surfactant- and polymer-free approach to achieving high-performance gold nanoparticles is reported. A theoretical framework to elucidate the growth mechanism of nanoparticles in surfactant-free media is developed and it is applied to identify strategies for shape-controlled syntheses. Using the results of the analyses, a simple, green-chemistry synthesis of the four most commonly used morphologies: nanostars, nanospheres, nanorods, and nanoplates is designed. The nanoparticles synthesized by this method outperform analogous particles with surfactant and polymer coatings in both catalysis and surface-enhanced Raman scattering. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nesi, Jacqueline; Choukas-Bradley, Sophia; Prinstein, Mitchell J
2018-04-07
Investigators have long recognized that adolescents' peer experiences provide a crucial context for the acquisition of developmental competencies, as well as potential risks for a range of adjustment difficulties. However, recent years have seen an exponential increase in adolescents' adoption of social media tools, fundamentally reshaping the landscape of adolescent peer interactions. Although research has begun to examine social media use among adolescents, researchers have lacked a unifying framework for understanding the impact of social media on adolescents' peer experiences. This paper represents Part 1 of a two-part theoretical review, in which we offer a transformation framework to integrate interdisciplinary social media scholarship and guide future work on social media use and peer relations from a theory-driven perspective. We draw on prior conceptualizations of social media as a distinct interpersonal context and apply this understanding to adolescents' peer experiences, outlining features of social media with particular relevance to adolescent peer relations. We argue that social media transforms adolescent peer relationships in five key ways: by changing the frequency or immediacy of experiences, amplifying experiences and demands, altering the qualitative nature of interactions, facilitating new opportunities for compensatory behaviors, and creating entirely novel behaviors. We offer an illustration of the transformation framework applied to adolescents' dyadic friendship processes (i.e., experiences typically occurring between two individuals), reviewing existing evidence and offering theoretical implications. Overall, the transformation framework represents a departure from the prevailing approaches of prior peer relations work and a new model for understanding peer relations in the social media context.
Theoretical uncertainties in the calculation of supersymmetric dark matter observables
NASA Astrophysics Data System (ADS)
Bergeron, Paul; Sandick, Pearl; Sinha, Kuver
2018-05-01
We estimate the current theoretical uncertainty in supersymmetric dark matter predictions by comparing several state-of-the-art calculations within the minimal supersymmetric standard model (MSSM). We consider standard neutralino dark matter scenarios — coannihilation, well-tempering, pseudoscalar resonance — and benchmark models both in the pMSSM framework and in frameworks with Grand Unified Theory (GUT)-scale unification of supersymmetric mass parameters. The pipelines we consider are constructed from the publicly available software packages SOFTSUSY, SPheno, FeynHiggs, SusyHD, micrOMEGAs, and DarkSUSY. We find that the theoretical uncertainty in the relic density as calculated by different pipelines, in general, far exceeds the statistical errors reported by the Planck collaboration. In GUT models, in particular, the relative discrepancies in the results reported by different pipelines can be as much as a few orders of magnitude. We find that these discrepancies are especially pronounced for cases where the dark matter physics relies critically on calculations related to electroweak symmetry breaking, which we investigate in detail, and for coannihilation models, where there is heightened sensitivity to the sparticle spectrum. The dark matter annihilation cross section today and the scattering cross section with nuclei also suffer appreciable theoretical uncertainties, which, as experiments reach the relevant sensitivities, could lead to uncertainty in conclusions regarding the viability or exclusion of particular models.
A movement ecology paradigm for unifying organismal movement research
Nathan, Ran; Getz, Wayne M.; Revilla, Eloy; Holyoak, Marcel; Kadmon, Ronen; Saltz, David; Smouse, Peter E.
2008-01-01
Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. “Now we must consider in general the common reason for moving with any movement whatever.” (Aristotle, De Motu Animalium, 4th century B.C.) PMID:19060196
NASA Astrophysics Data System (ADS)
Ganapathy, Vinay; Ramachandran, Ramesh
2017-10-01
The response of a quadrupolar nucleus (nuclear spin with I > 1/2) to an oscillating radio-frequency pulse/field is delicately dependent on the ratio of the quadrupolar coupling constant to the amplitude of the pulse in addition to its duration and oscillating frequency. Consequently, analytic description of the excitation process in the density operator formalism has remained less transparent within existing theoretical frameworks. As an alternative, the utility of the "concept of effective Floquet Hamiltonians" is explored in the present study to explicate the nuances of the excitation process in multilevel systems. Employing spin I = 3/2 as a case study, a unified theoretical framework for describing the excitation of multiple-quantum transitions in static isotropic and anisotropic solids is proposed within the framework of perturbation theory. The challenges resulting from the anisotropic nature of the quadrupolar interactions are addressed within the effective Hamiltonian framework. The possible role of the various interaction frames on the convergence of the perturbation corrections is discussed along with a proposal for a "hybrid method" for describing the excitation process in anisotropic solids. Employing suitable model systems, the validity of the proposed hybrid method is substantiated through a rigorous comparison between simulations emerging from exact numerical and analytic methods.
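To make the dependence on the ratio of quadrupolar coupling to RF amplitude concrete, here is a small numerical sketch for spin I = 3/2: a pulse propagator computed by matrix exponentiation under one common (sign- and convention-dependent) choice of the secular first-order quadrupolar Hamiltonian. This is a bare density-operator simulation, not the effective Floquet treatment developed in the paper.

import numpy as np
from scipy.linalg import expm

# Spin-3/2 operators in the |3/2>, |1/2>, |-1/2>, |-3/2> basis (hbar = 1).
Iz = np.diag([1.5, 0.5, -0.5, -1.5])
Ip = np.diag([np.sqrt(3.0), 2.0, np.sqrt(3.0)], k=1)   # raising operator
Ix = 0.5 * (Ip + Ip.T)
Iy = (Ip - Ip.T) / 2j

def transverse_signal(w1, wQ, t):
    # RF along x plus first-order quadrupolar term; I(I+1) = 15/4 for I = 3/2.
    H = w1 * Ix + (wQ / 6.0) * (3.0 * Iz @ Iz - 3.75 * np.eye(4))
    U = expm(-1j * H * t)
    rho = U @ Iz @ U.conj().T           # start from thermal (Zeeman) order
    return np.real(np.trace(rho @ Iy))  # detectable transverse magnetization

for ratio in (0.1, 1.0, 10.0):          # wQ/w1 spans soft- to hard-pulse regimes
    print(f"wQ/w1 = {ratio:5.1f}:", transverse_signal(1.0, ratio, t=np.pi / 2))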
Big behavioral data: psychology, ethology and the foundations of neuroscience.
Gomez-Marin, Alex; Paton, Joseph J; Kampff, Adam R; Costa, Rui M; Mainen, Zachary F
2014-11-01
Behavior is a unifying organismal process where genes, neural function, anatomy and environment converge and interrelate. Here we review the current state and discuss the future effect of accelerating advances in technology for behavioral studies, focusing on rodents as an example. We frame our perspective in three dimensions: the degree of experimental constraint, dimensionality of data and level of description. We argue that 'big behavioral data' presents challenges proportionate to its promise and describe how these challenges might be met through opportunities afforded by the two rival conceptual legacies of twentieth century behavioral science, ethology and psychology. We conclude that, although 'more is not necessarily better', copious, quantitative and open behavioral data has the potential to transform and unify these two disciplines and to solidify the foundations of others, including neuroscience, but only if the development of new theoretical frameworks and improved experimental designs matches the technological progress.
Standard representation and unified stability analysis for dynamic artificial neural network models.
Kim, Kwang-Ki K; Patrón, Ernesto Ríos; Braatz, Richard D
2018-02-01
An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, which include rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take additional information into account such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example shows reduced conservatism obtained by the conditions. Copyright © 2017. Published by Elsevier Ltd.
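As a point of reference for the LMI-based conditions, the sketch below solves the most basic discrete-time Lyapunov LMI for a linear system x_{t+1} = A x_t with cvxpy: find P > 0 such that A'PA - P < 0. The paper's DANN conditions add structure (sector bounds, slope restrictions, odd nonlinearities) that is not reproduced here; an SDP-capable solver such as SCS is assumed installed.

import numpy as np
import cvxpy as cp

A = np.array([[0.5, 0.2],
              [-0.1, 0.7]])            # assumed stable example system
n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                       # P positive definite
               A.T @ P @ A - P << -eps * np.eye(n)]        # Lyapunov decrease
prob = cp.Problem(cp.Minimize(0), constraints)             # pure feasibility
prob.solve()
print(prob.status)
print("P =\n", P.value)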
The Price Equation, Gradient Dynamics, and Continuous Trait Game Theory.
Lehtonen, Jussi
2018-01-01
A recent article convincingly nominated the Price equation as the fundamental theorem of evolution and used it as a foundation to derive several other theorems. A major section of evolutionary theory that was not addressed is that of game theory and gradient dynamics of continuous traits with frequency-dependent fitness. Deriving fundamental results in these fields under the unifying framework of the Price equation illuminates similarities and differences between approaches and allows a simple, unified view of game-theoretical and dynamic concepts. Using Taylor polynomials and the Price equation, I derive a dynamic measure of evolutionary change, a condition for singular points, the convergence stability criterion, and an alternative interpretation of evolutionary stability. Furthermore, by applying the Price equation to a multivariable Taylor polynomial, the direct fitness approach to kin selection emerges. Finally, I compare these results to the mean gradient equation of quantitative genetics and the canonical equation of adaptive dynamics.
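The gradient-dynamics quantities derived in the article can be computed mechanically for any smooth invasion-fitness function. Below is a toy symbolic example (the fitness function w is an assumption for illustration) showing the selection gradient, the singular point, the ESS curvature, and the convergence-stability condition.

import sympy as sp

x, y, b = sp.symbols('x y b', real=True)    # x: resident trait, y: mutant trait
w = y * (1 - y) + b * y * x                 # assumed toy invasion fitness w(y, x)

grad = sp.diff(w, y).subs(y, x)             # selection gradient D(x)
x_star = sp.solve(sp.Eq(grad, 0), x)[0]     # singular point
ess = sp.diff(w, y, 2).subs(y, x)           # ESS if < 0 at the singular point
conv = sp.diff(grad, x)                     # convergence stable if < 0 at x*

print("singular point:", x_star)            # 1/(2 - b)
print("ESS curvature:", sp.simplify(ess))   # -2, so always an ESS here
print("convergence:", sp.simplify(conv))    # b - 2, convergent for b < 2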
Phase noise suppression for coherent optical block transmission systems: a unified framework.
Yang, Chuanchuan; Yang, Feng; Wang, Ziyu
2011-08-29
A unified framework for phase noise suppression is proposed in this paper, which can be applied in any coherent optical block transmission system, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM), coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE), etc. Based on adaptive modeling of phase noise, unified observation equations for different coherent optical block transmission systems are constructed, which lead to unified phase noise estimation and suppression. Numerical results demonstrate that the proposal is powerful in mitigating laser phase noise.
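A minimal sketch of the zeroth-order ingredient common to such schemes: estimating and removing the common phase error of one received block from known pilots. Pilot placement, modulation, and noise level are assumptions; the paper's framework goes further by modeling phase-noise variation within the block adaptively.

import numpy as np

rng = np.random.default_rng(0)
n_sc = 64                                       # subcarriers in one block
pilot_idx = np.arange(0, n_sc, 8)               # assumed pilot grid
tx = (2 * rng.integers(0, 2, n_sc) - 1) + 1j * (2 * rng.integers(0, 2, n_sc) - 1)  # QPSK

cpe_true = 0.3                                  # common phase error (rad)
noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
rx = tx * np.exp(1j * cpe_true) + noise

# ML estimate of the common phase rotation from the pilots, then de-rotate.
cpe_hat = np.angle(np.sum(rx[pilot_idx] * np.conj(tx[pilot_idx])))
rx_corrected = rx * np.exp(-1j * cpe_hat)
print(f"true CPE {cpe_true:.3f} rad, estimated {cpe_hat:.3f} rad")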
Emotion and the prefrontal cortex: An integrative review.
Dixon, Matthew L; Thiruchselvam, Ravi; Todd, Rebecca; Christoff, Kalina
2017-10-01
The prefrontal cortex (PFC) plays a critical role in the generation and regulation of emotion. However, we lack an integrative framework for understanding how different emotion-related functions are organized across the entire expanse of the PFC, as prior reviews have generally focused on specific emotional processes (e.g., decision making) or specific anatomical regions (e.g., orbitofrontal cortex). Additionally, psychological theories and neuroscientific investigations have proceeded largely independently because of the lack of a common framework. Here, we provide a comprehensive review of functional neuroimaging, electrophysiological, lesion, and structural connectivity studies on the emotion-related functions of 8 subregions spanning the entire PFC. We introduce the appraisal-by-content model, which provides a new framework for integrating the diverse range of empirical findings. Within this framework, appraisal serves as a unifying principle for understanding the PFC's role in emotion, while relative content-specialization serves as a differentiating principle for understanding the role of each subregion. A synthesis of data from affective, social, and cognitive neuroscience studies suggests that different PFC subregions are preferentially involved in assigning value to specific types of inputs: exteroceptive sensations, episodic memories and imagined future events, viscero-sensory signals, viscero-motor signals, actions, others' mental states (e.g., intentions), self-related information, and ongoing emotions. We discuss the implications of this integrative framework for understanding emotion regulation, value-based decision making, emotional salience, and refining theoretical models of emotion. This framework provides a unified understanding of how emotional processes are organized across PFC subregions and generates new hypotheses about the mechanisms underlying adaptive and maladaptive emotional functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Where do spontaneous first impressions of faces come from?
Over, Harriet; Cook, Richard
2018-01-01
Humans spontaneously attribute a wide range of traits to strangers based solely on their facial features. These first impressions are known to exert striking effects on our choices and behaviours. In this paper, we provide a theoretical account of the origins of these spontaneous trait inferences. We describe a novel framework ('Trait Inference Mapping') in which trait inferences are products of mappings between locations in 'face space' and 'trait space'. These mappings are acquired during ontogeny and allow excitation of face representations to propagate automatically to associated trait representations. This conceptualization provides a framework within which the relative contribution of ontogenetic experience and genetic inheritance can be considered. Contrary to many existing ideas about the origins of trait inferences, we propose only a limited role for innate mechanisms and natural selection. Instead, our model explains inter-observer consistency by appealing to cultural learning and physiological responses that facilitate or 'canalise' particular face-trait mappings. Our TIM framework has both theoretical and substantive implications, and can be extended to trait inferences from non-facial cues to provide a unified account of first impressions. Copyright © 2017 Elsevier B.V. All rights reserved.
Huang, Zhen
2017-01-01
This paper uses experimental investigation and theoretical derivation to study the unified failure mechanism and ultimate capacity model of reinforced concrete (RC) members under combined axial, bending, shear and torsion loading. Fifteen RC members are tested under different combinations of compressive axial force, bending, shear and torsion using experimental equipment designed by the authors. The failure mechanism and ultimate strength data for the four groups of tested RC members under different combined loading conditions are investigated and discussed in detail. The experimental research seeks to determine how the ultimate strength of RC members changes with changing combined loads. According to the experimental research, a unified theoretical model is established by determining the shape of the warped failure surface, assuming an appropriate stress distribution on the failure surface, and considering the equilibrium conditions. This unified failure model can be reasonably and systematically reduced to well-known failure theories for concrete members under single or combined loading. The unified calculation model could be easily used in design applications with some assumptions and simplifications. Finally, the accuracy of this theoretical unified model is verified by comparisons with experimental results. PMID:28414777
Reid, Stephen J
2011-04-01
As the body of literature on rural health has grown, the need to develop a unifying theoretical framework has become more apparent. There are many different ways of seeing the same phenomenon, depending on the assumptions we make and the perspective we choose. A conceptual and theoretical basis for the education of health professionals in rural health has not yet been described. This paper examines a number of theoretical frameworks that have been used in the rural health discourse and aims to identify relevant theory that originates from an educational paradigm. The experience of students in rural health is described phenomenologically in terms of two complementary perspectives, using a geographic basis on the one hand, and a developmental viewpoint on the other. The educational features and implications of these perspectives are drawn out. The concept of a 'pedagogy of place' recognizes the importance of the context of learning and allows the uniqueness of a local community to integrate learning at all levels. The theory of critical pedagogy is also found relevant to education for rural health, which would ideally produce 'transformative' graduates who understand the privilege of their position, and who are capable of and committed to engaging in the struggles for equity and justice, both within their practices as well as in the wider society. It is proposed that a 'critical pedagogy of place,' which gives due acknowledgement to local peculiarities and strengths, while situating this within a wider framework of the political, social and economic disparities that impact on the health of rural people, is an appropriate theoretical basis for a distinct rural pedagogy in the health sciences.
NASA Astrophysics Data System (ADS)
Amaral, Barbara; Cabello, Adán; Cunha, Marcelo Terra; Aolita, Leandro
2018-03-01
Contextuality is a fundamental feature of quantum theory necessary for certain models of quantum computation and communication. Serious steps have therefore been taken towards a formal framework for contextuality as an operational resource. However, the main ingredient of a resource theory—a concrete, explicit form of free operations of contextuality—was still missing. Here we provide such a component by introducing noncontextual wirings: a class of contextuality-free operations with a clear operational interpretation and a friendly parametrization. We characterize them completely for general black-box measurement devices with arbitrarily many inputs and outputs. As applications, we show that the relative entropy of contextuality is a contextuality monotone and that maximally contextual boxes that serve as contextuality bits exist for a broad class of scenarios. Our results complete a unified resource-theoretic framework for contextuality and Bell nonlocality.
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base to developers. This paper contributes a unified code navigation theory in light of the optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developer's behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
Unifying practice schedules in the timescales of motor learning and performance.
Verhoeven, F Martijn; Newell, Karl M
2018-06-01
In this article, we elaborate from a multiple time scales model of motor learning to examine the independent and integrated effects of massed and distributed practice schedules within- and between-sessions on the persistent (learning) and transient (warm-up, fatigue) processes of performance change. The timescales framework reveals the influence of practice distribution on four learning-related processes: the persistent processes of learning and forgetting, and the transient processes of warm-up decrement and fatigue. The superposition of the different processes of practice leads to a unified set of effects for massed and distributed practice within- and between-sessions in learning motor tasks. This analysis of the interaction between the duration of the interval of practice trials or sessions and parameters of the introduced time scale model captures the unified influence of the between trial and session scheduling of practice on learning and performance. It provides a starting point for new theoretically based hypotheses, and the scheduling of practice that minimizes the negative effects of warm-up decrement, fatigue and forgetting while exploiting the positive effects of learning and retention. Copyright © 2018 Elsevier B.V. All rights reserved.
Unifying Suspension and Granular flows near Jamming
NASA Astrophysics Data System (ADS)
DeGiuli, Eric; Wyart, Matthieu
2017-06-01
Rheological properties of dense flows of hard particles are singular as one approaches the jamming threshold where flow ceases, both for granular flows dominated by inertia, and for over-damped suspensions. Concomitantly, the lengthscale characterizing velocity correlations appears to diverge at jamming. Here we review a theoretical framework that gives a scaling description of stationary flows of frictionless particles. Our analysis applies both to suspensions and inertial flows of hard particles. We report numerical results in support of the theory, and show the phase diagram that results when friction is added, delineating the regime of validity of the frictionless theory.
Towards a Grand Unified Theory of sports performance.
Glazier, Paul S
2017-12-01
Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has predominantly been monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance-and, by elaboration, sports science-based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase explanatory power of applied research work; provide stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.
Unifying theoretical framework for deciphering the oxygen reduction reaction on platinum.
Huang, Jun; Zhang, Jianbo; Eikerling, Michael
2018-05-07
Rapid conversion of oxygen into water is crucial to the operation of polymer electrolyte fuel cells and other emerging electrochemical energy technologies. Chemisorbed oxygen species play double-edged roles in this reaction, acting as vital intermediates on one hand and site-blockers on the other. Any attempt to decipher the oxygen reduction reaction (ORR) must first relate the formation of oxygen intermediates to basic electronic and electrostatic properties of the catalytic surface, and then link it to parameters of catalyst activity. An approach that accomplishes this feat will be of great utility for catalyst materials development and predictive model formulation of electrode operation. Here, we present a theoretical framework for the multiple interrelated surface phenomena and processes involved, in particular by incorporating double-layer effects. It sheds light on the roles of oxygen intermediates and yields the Tafel slope and exchange current density as continuous functions of electrode potential. Moreover, it develops the concept of a rate-determining term, which should replace the concept of a rate-determining step for multielectron reactions, and offers a new perspective on the volcano relation of the ORR.
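The following toy microkinetic sketch illustrates the double-edged coverage effect and the potential-dependent Tafel slope described above. The Langmuir-type isotherm, the Butler-Volmer form, and every numerical parameter are assumptions chosen for illustration; this is not the authors' double-layer model.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.15
f = F / (R * T)

# Minimal sketch (hypothetical parameters, not the authors' model): adsorbed
# oxygen coverage theta rises with potential, blocking sites, while the ORR
# rate falls with potential through Butler-Volmer kinetics.
def coverage(E, E_ads=0.8):
    return 1.0 / (1.0 + np.exp(-f * (E - E_ads)))   # Langmuir-type isotherm

def current(E, k=1e-3, alpha=0.5, E_eq=1.23):
    theta = coverage(E)
    return k * (1.0 - theta) * np.exp(-alpha * f * (E - E_eq))  # cathodic current

E = np.linspace(0.6, 1.0, 200)
j = current(E)
# Tafel slope as a continuous function of potential: dE/dlog10(j), in mV/decade
tafel = np.gradient(E, np.log10(j)) * 1e3
for Ei, ti in zip(E[::50], tafel[::50]):
    print(f"E = {Ei:.2f} V  Tafel slope = {abs(ti):.0f} mV/dec")
```

Even this crude model moves continuously between the classic ~120 mV/decade regime at low coverage and a steeper kinetic regime near ~40 mV/decade once site-blocking sets in, rather than switching between two discrete rate-determining steps.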
Lappi, Otto; Mole, Callum
2018-06-11
The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks. It is based on recent empirical research, in particular on driver eye movements, that poses challenges for existing accounts of how we visually steer a course. They first analyze how the ideas of feedback and feedforward processes and internal models are treated in control theoretical steering models within vision science and engineering, which share an underlying architecture but have historically developed in very separate ways. The authors then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience, to better understand the skill and gaze strategies involved. They then propose a conceptual model that (a) gives a unified account of the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of these (a-c) has been considered before (also in the context of driving), integrating them into a single framework and the authors' multiple waypoint identification hypothesis within that framework are novel. The proposed hypothesis is relevant to all forms of visually guided locomotion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
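A compact sketch of the SDT core named here, using a hypothetical beta-binomial example: the optimal point estimate is whatever minimizes posterior expected loss, so changing the loss function changes the decision (posterior mean under squared-error loss, posterior median under absolute loss).

```python
import numpy as np
from scipy import stats

# Sketch of the SDT core: an optimal point estimate minimizes posterior expected
# loss. With a Beta(1, 1) prior and y successes in n trials (hypothetical data),
# the posterior for the probability p is Beta(1 + y, 1 + n - y).
y, n = 7, 20
post = stats.beta(1 + y, 1 + n - y)

theta = np.linspace(0.001, 0.999, 999)           # candidate point estimates
def expected_loss(loss):
    # numerically average loss(theta, p) over the posterior for each theta
    p = np.linspace(0.001, 0.999, 2000)
    w = post.pdf(p)
    w /= w.sum()
    return np.array([(loss(t, p) * w).sum() for t in theta])

best_sq = theta[np.argmin(expected_loss(lambda t, p: (t - p) ** 2))]
best_abs = theta[np.argmin(expected_loss(lambda t, p: np.abs(t - p)))]
print(f"squared-error loss -> posterior mean   {best_sq:.3f} (exact {post.mean():.3f})")
print(f"absolute loss      -> posterior median {best_abs:.3f} (exact {post.median():.3f})")
```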
Information spreading in Delay Tolerant Networks based on nodes' behaviors
NASA Astrophysics Data System (ADS)
Wu, Yahui; Deng, Su; Huang, Hongbin
2014-07-01
Information spreading in DTNs (Delay Tolerant Networks) adopts a store-carry-forward method, in which nodes receive messages directly from the nodes they encounter. However, it is hard to judge whether the information is safe in this communication mode. In this case, a node may observe other nodes' behaviors. At present, there is no theoretical model describing how nodes' trust levels evolve. In addition, due to the uncertain connectivity of DTNs, it is hard for a node to obtain the global state of the network. Therefore, a rational model of a node's trust level should be a function of the node's own observations. For example, if a node finds k nodes carrying a message, it may trust the information with probability p(k). This paper does not explore the real distribution of p(k), but instead presents a unifying theoretical framework to evaluate the performance of information spreading in the above setting. This framework is an extension of the traditional SI (susceptible-infected) model, and applies when p(k) conforms to any distribution. Simulations based on both synthetic and real motion traces show the accuracy of the framework. Finally, we explore the impact of nodes' behaviors under certain special distributions through numerical results.
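A minimal Monte Carlo sketch of the mechanism the abstract describes, under the added assumptions of uniform random pairwise contacts and an illustrative linear p(k): a node that has observed k distinct carriers accepts the message with probability p(k) at its next contact with a carrier.

```python
import random

# Minimal sketch of the described mechanism (hypothetical parameters): under
# store-carry-forward spreading, a node that has so far observed k distinct
# carriers accepts (trusts) the message with probability p(k).
def simulate(n=200, contacts=20000, p=lambda k: min(1.0, 0.2 * k), seed=1):
    rng = random.Random(seed)
    carrying = {0}                       # node 0 is the source
    seen = [set() for _ in range(n)]     # carriers each node has observed
    trace = []
    for _ in range(contacts):
        a, b = rng.sample(range(n), 2)   # a random pairwise meeting
        for u, v in ((a, b), (b, a)):
            if v in carrying and u not in carrying:
                seen[u].add(v)
                if rng.random() < p(len(seen[u])):
                    carrying.add(u)
        trace.append(len(carrying))
    return trace

trace = simulate()
print("carriers after 5k/10k/20k contacts:", trace[4999], trace[9999], trace[-1])
```

With this illustrative p(k), a node typically needs several independent sightings before accepting, so early spreading is much slower than in the plain SI model, which is exactly the regime the framework is built to quantify.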
De Ridder, Dirk; Vanneste, Sven; Weisz, Nathan; Londero, Alain; Schlee, Winnie; Elgoyhen, Ana Belen; Langguth, Berthold
2014-07-01
Tinnitus is considered to be an auditory phantom phenomenon, a persistent conscious percept of a salient memory trace, externally attributed, in the absence of a sound source. It is perceived as a phenomenologically unified coherent percept, binding multiple separable clinical characteristics, such as its loudness, sidedness, type (pure tone, noise), associated distress, and so on. A theoretical pathophysiological framework capable of explaining all these aspects in one model is much needed. The model must incorporate both the deafferentation-based neurophysiological models and the dysfunctional noise-canceling model, and propose a 'tinnitus core' subnetwork. The tinnitus core can be defined as the minimal set of brain areas that needs to be jointly activated (=subnetwork) for tinnitus to be consciously perceived, devoid of its affective components. The brain areas involved in the other separable characteristics of tinnitus can be retrieved by studies of spontaneous resting-state magnetic and electrical activity in people with tinnitus, evaluated for the specific aspect investigated and controlled for other factors. By combining these functional imaging studies with neuromodulation techniques, some of the correlations are turned into causal relationships. From this, a heuristic pathophysiological framework is constructed, integrating the tinnitus perceptual core with the other tinnitus-related aspects. This phenomenologically unified percept of tinnitus can be considered an emergent property of multiple, parallel, dynamically changing and partially overlapping subnetworks, each with a specific spontaneous oscillatory pattern and functional connectivity signature. Communication between these different subnetworks is proposed to occur at hubs, brain areas that are involved in multiple subnetworks simultaneously. These hubs can take part in each separable subnetwork at different frequencies. Communication between the subnetworks is proposed to occur at discrete oscillatory frequencies. As such, the brain uses multiple nonspecific networks in parallel, each with its own oscillatory signature, that adapt to the context to construct a unified percept, possibly by synchronized activation integrated at hubs at discrete oscillatory frequencies. Copyright © 2013 Elsevier Ltd. All rights reserved.
Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum
ERIC Educational Resources Information Center
Rubenstein, Lisa DaVia; Ridgley, Lisa M.
2017-01-01
A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…
A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.
Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao
2017-06-16
This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability of addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data source personnel for implementing and managing the integration.
Smolensky, Paul; Goldrick, Matthew; Mathis, Donald
2014-08-01
Mental representations have continuous as well as discrete, combinatorial properties. For example, while predominantly discrete, phonological representations also vary continuously; this is reflected by gradient effects in instrumental studies of speech production. Can an integrated theoretical framework address both aspects of structure? The framework we introduce here, Gradient Symbol Processing, characterizes the emergence of grammatical macrostructure from the Parallel Distributed Processing microstructure (McClelland, Rumelhart, & The PDP Research Group, 1986) of language processing. The mental representations that emerge, Distributed Symbol Systems, have both combinatorial and gradient structure. They are processed through Subsymbolic Optimization-Quantization, in which an optimization process favoring representations that satisfy well-formedness constraints operates in parallel with a distributed quantization process favoring discrete symbolic structures. We apply a particular instantiation of this framework, λ-Diffusion Theory, to phonological production. Simulations of the resulting model suggest that Gradient Symbol Processing offers a way to unify accounts of grammatical competence with both discrete and continuous patterns in language performance. Copyright © 2013 Cognitive Science Society, Inc.
Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model
ERIC Educational Resources Information Center
Helie, Sebastien; Sun, Ron
2010-01-01
This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…
Efficiency limits for photoelectrochemical water-splitting
Fountaine, Katherine T.; Lewerenz, Hans Joachim; Atwater, Harry A.
2016-12-02
Theoretical limiting efficiencies have a critical role in determining technological viability and expectations for device prototypes, as evidenced by the photovoltaics community’s focus on detailed balance. However, due to their multicomponent nature, photoelectrochemical devices do not have an equivalent analogue to detailed balance, and reported theoretical efficiency limits vary depending on the assumptions made. Here we introduce a unified framework for photoelectrochemical device performance through which all previous limiting efficiencies can be understood and contextualized. Ideal and experimentally realistic limiting efficiencies are presented, and then generalized using five representative parameters—semiconductor absorption fraction, external radiative efficiency, series resistance, shunt resistance and catalytic exchange current density—to account for imperfect light absorption, charge transport and catalysis. Finally, we discuss the origin of deviations between the limits discussed herein and reported water-splitting efficiencies. This analysis provides insight into the primary factors that determine device performance and a powerful handle to improve device efficiency.
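A toy operating-point calculation in the spirit of this parameterization (every number below is hypothetical): a single-diode photoelectrode with series and shunt resistance drives the 1.23 V water-splitting reaction against a lumped Butler-Volmer catalytic overpotential, and the solar-to-hydrogen efficiency follows from the self-consistent operating current.

```python
import numpy as np
from scipy.optimize import brentq

kT_q = 0.02585          # thermal voltage at 298 K (V)
P_sun = 0.100           # AM1.5G insolation (W/cm^2)

# Hypothetical device parameters, chosen only to illustrate the framework:
j_ph, j_0 = 0.013, 1e-31        # photocurrent, dark saturation current (A/cm^2)
R_s, R_sh = 2.0, 1000.0         # series and shunt resistance (ohm cm^2)
j_ex, alpha = 1e-6, 0.5         # catalytic exchange current density, transfer coeff.

def v_load(j):
    # voltage demanded by the electrochemical load at current density j
    eta_cat = (kT_q / alpha) * np.arcsinh(j / (2.0 * j_ex))   # lumped Butler-Volmer
    return 1.23 + eta_cat + j * R_s

def j_diode(v):
    # current supplied by the photodiode at voltage v
    return j_ph - j_0 * (np.exp(v / kT_q) - 1.0) - v / R_sh

# Self-consistent operating current: diode supply equals electrochemical demand.
j_op = brentq(lambda j: j_diode(v_load(j)) - j, 1e-9, j_ph)
print(f"operating current {1e3*j_op:.2f} mA/cm^2, "
      f"STH efficiency {100 * j_op * 1.23 / P_sun:.1f} %")
```

Varying the five knobs (absorption via j_ph, radiative efficiency via j_0, R_s, R_sh, and j_ex) shows directly how each loss channel erodes the ideal limit, which is the point of the unified parameterization.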
Gauge Physics of Spin Hall Effect
Tan, Seng Ghee; Jalil, Mansoor B. A.; Ho, Cong Son; Siu, Zhuobin; Murakami, Shuichi
2015-01-01
Spin Hall effect (SHE) has been discussed in the context of Kubo formulation, geometric physics, spin orbit force, and numerous semi-classical treatments. It can be confusing if the different pictures have partial or overlapping claims of contribution to the SHE. In this article, we present a gauge-theoretic, time-momentum elucidation, which provides a general SHE equation of motion, that unifies under one theoretical framework, all contributions of SHE conductivity due to the kinetic, the spin orbit force (Yang-Mills), and the geometric (Murakami-Fujita) effects. Our work puts right an ambiguity surrounding previously partial treatments involving the Kubo, semiclassical, Berry curvatures, or the spin orbit force. Our full treatment shows the Rashba 2DEG SHE conductivity to be e/8π instead of −e/8π, and the Rashba heavy hole conductivity to be 9e/8π instead of −9e/8π. This renewed treatment suggests a need to re-derive and re-calculate previously studied SHE conductivity. PMID:26689260
Alonso, Ariel; Molenberghs, Geert
2008-10-01
The last two decades have seen a lot of development in the area of surrogate marker validation. One of these approaches places the evaluation in a meta-analytic framework, leading to definitions in terms of trial- and individual-level association. A drawback of this methodology is that different settings have led to different measures at the individual level. Using information theory, Alonso et al. proposed a unified framework, leading to a new definition of surrogacy, which offers interpretational advantages and is applicable in a wide range of situations. In this work, we illustrate how this information-theoretic approach can be used to evaluate surrogacy when both endpoints are of a time-to-event type. Two meta-analyses, in early and advanced colon cancer, respectively, are then used to evaluate the performance of time to cancer recurrence as a surrogate for overall survival.
Tensegrity and motor-driven effective interactions in a model cytoskeleton
NASA Astrophysics Data System (ADS)
Wang, Shenshen; Wolynes, Peter G.
2012-04-01
Actomyosin networks are major structural components of the cell. They provide mechanical integrity and allow dynamic remodeling of eukaryotic cells, self-organizing into the diverse patterns essential for development. We provide a theoretical framework to investigate the intricate interplay between local force generation, network connectivity, and collective action of molecular motors. This framework is capable of accommodating both regular and heterogeneous pattern formation, arrested coarsening and macroscopic contraction in a unified manner. We model the actomyosin system as a motorized cat's cradle consisting of a crosslinked network of nonlinear elastic filaments subjected to spatially anti-correlated motor kicks acting on motorized (fibril) crosslinks. The phase diagram suggests there can be arrested phase separation which provides a natural explanation for the aggregation and coalescence of actomyosin condensates. Simulation studies confirm the theoretical picture that a nonequilibrium many-body system driven by correlated motor kicks can behave as if it were at an effective equilibrium, but with modified interactions that account for the correlation of the motor driven motions of the actively bonded nodes. Regular aster patterns are observed both in Brownian dynamics simulations at effective equilibrium and in the complete stochastic simulations. The results show that large-scale contraction requires correlated kicking.
NASA Astrophysics Data System (ADS)
Fitch, W. Tecumseh
2014-09-01
Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
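A tiny illustration of the computational claim about supra-regular capacities: the nested string family a^n b^n cannot be recognized with any fixed finite-state memory, but a one-stack (push-down) recognizer handles it directly. The function below is a generic textbook construction, not code from the article.

```python
# Tiny illustration of the computational claim: recognizing the nested pattern
# a^n b^n (supra-regular) needs unbounded memory -- here, a push-down stack --
# whereas no fixed finite-state machine can count arbitrarily deep nesting.
def accepts_anbn(s: str) -> bool:
    stack = []
    seen_b = False
    for ch in s:
        if ch == 'a':
            if seen_b:            # an 'a' after a 'b' breaks the a^n b^n shape
                return False
            stack.append('a')     # push one symbol per 'a'
        elif ch == 'b':
            seen_b = True
            if not stack:         # more b's than a's
                return False
            stack.pop()           # pop one symbol per 'b'
        else:
            return False
    return not stack              # accept iff the counts matched exactly

for s in ["ab", "aaabbb", "aabbb", "abab"]:
    print(f"{s!r:10} -> {accepts_anbn(s)}")
```

The stack depth grows with the nesting depth n, which is precisely the resource a finite-state device lacks; "dendrophilia" in the review's terms is a propensity to deploy such a resource on sequential input.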
Computation of elementary modes: a unifying framework and the new binary approach
Gagneur, Julien; Klamt, Steffen
2004-01-01
Background Metabolic pathway analysis has been recognized as a central approach to the structural analysis of metabolic networks. The concept of elementary (flux) modes provides a rigorous formalism to describe and assess pathways and has proven to be valuable for many applications. However, computing elementary modes is a hard computational task. Recent years have seen a proliferation of algorithms dedicated to it, calling for a unifying point of view and continued improvement of the current methods. Results We show that computing the set of elementary modes is equivalent to computing the set of extreme rays of a convex cone. This standard mathematical representation provides a unified framework that encompasses the most prominent algorithmic methods that compute elementary modes and allows a clear comparison between them. Taking lessons from this benchmark, we here introduce a new method, the binary approach, which computes the elementary modes as binary patterns of participating reactions from which the respective stoichiometric coefficients can be computed in a post-processing step. We implemented the binary approach in FluxAnalyzer 5.1, software that is free for academic use. The binary approach decreases the memory demand by up to 96% without loss of speed, making it the most efficient method available for computing elementary modes to date. Conclusions The equivalence between elementary modes and extreme ray computations offers opportunities for employing tools from polyhedral computation for metabolic pathway analysis. The new binary approach introduced herein was derived from this general theoretical framework and facilitates the computation of elementary modes in considerably larger networks. PMID:15527509
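For concreteness, here is a brute-force sketch of the definition itself on a toy network: an elementary mode is a support-minimal flux vector v with S v = 0 and v ≥ 0 (all reactions treated as irreversible here). This naive subset enumeration is exponential and stands in for neither the extreme-ray machinery nor the paper's binary approach; the four-reaction network is invented for illustration.

```python
import numpy as np
from itertools import combinations

# Brute-force sketch of the definition (not the paper's binary algorithm):
# enumerate support-minimal v with S v = 0, v >= 0. Feasible only for toy nets.
S = np.array([            # toy stoichiometric matrix: rows = metabolites A, B
    [ 1, -1, -1,  0],     # R1: -> A,  R2: A -> B,  R3: A ->,  R4: B ->
    [ 0,  1,  0, -1],
])
n = S.shape[1]
modes = []
for size in range(1, n + 1):
    for support in combinations(range(n), size):
        sub = S[:, support]
        _, s, vt = np.linalg.svd(sub)
        null_dim = sum(sv < 1e-10 for sv in np.r_[s, np.zeros(len(support) - len(s))])
        if null_dim != 1:
            continue                       # need exactly one flux direction
        v = vt[-1]                         # the null-space direction
        if np.all(v > 1e-10) or np.all(v < -1e-10):
            full = np.zeros(n)
            full[list(support)] = np.abs(v) / np.abs(v).max()
            # keep only support-minimal vectors (no smaller mode nested inside)
            if not any(set(np.flatnonzero(m)) < set(support) for m in modes):
                modes.append(full)
print("elementary modes (columns R1..R4):")
for m in modes:
    print(np.round(m, 3))
```

On this network the enumeration finds the two expected modes, {R1, R3} and {R1, R2, R4}; real networks need the polyhedral or binary-pattern methods the paper compares.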
ERIC Educational Resources Information Center
Center for Mental Health in Schools at UCLA, 2005
2005-01-01
This report was developed to highlight the current state of affairs and illustrate the value of a unifying framework and integrated infrastructure for the many initiatives, projects, programs, and services schools pursue in addressing barriers to learning and promoting healthy development. Specifically, it highlights how initiatives can be…
User applications driven by the community contribution framework MPContribs in the Materials Project
Huck, P.; Gunter, D.; Cholia, S.; ...
2015-10-12
This paper discusses how the MPContribs framework in the Materials Project (MP) allows user-contributed data to be shown and analyzed alongside the core MP database. The MP is a searchable database of electronic structure properties of over 65,000 bulk solid materials, which is accessible through a web-based science-gateway. We describe the motivation for enabling user contributions to the materials data and present the framework's features and challenges in the context of two real applications. These use cases illustrate how scientific collaborations can build applications with their own 'user-contributed' data using MPContribs. The Nanoporous Materials Explorer application provides a unique search interface to a novel dataset of hundreds of thousands of materials, each with tables of user-contributed values related to material adsorption and density at varying temperature and pressure. The Unified Theoretical and Experimental X-ray Spectroscopy application discusses a full workflow for the association, dissemination, and combined analyses of experimental data from the Advanced Light Source with MP's theoretical core data, using MPContribs tools for data formatting, management, and exploration. The capabilities being developed for these collaborations are serving as the model for how new materials data can be incorporated into the MP website with minimal staff overhead while giving powerful tools for data search and display to the user community.
Toward a unifying framework for evolutionary processes.
Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora
2015-10-21
The theories of population genetics and evolutionary computation have evolved separately for nearly 30 years. Many results have been independently obtained in both fields and many others are unique to their respective fields. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
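A schematic sketch of the decomposition idea: one generic evolutionary loop whose selection and variation components are interchangeable operators, instantiated once with tournament selection (evolutionary-algorithm style) and once with fitness-proportional resampling (Wright-Fisher style). The fitness function and all parameters are illustrative assumptions, not operators from the paper.

```python
import random

# Sketch of the decomposition idea (hypothetical illustration): one generic
# evolutionary loop, with selection and variation supplied as interchangeable
# operators, covers both an evolutionary algorithm and a popgen-style model.
def evolve(pop, fitness, select, vary, generations=200, seed=0):
    rng = random.Random(seed)
    for _ in range(generations):
        pop = vary(select(pop, fitness, rng), rng)
    return pop

def onemax(x):                         # toy fitness: number of 1-bits
    return sum(x)

def tournament(pop, fitness, rng):     # EA-style selection operator
    return [max(rng.sample(pop, 2), key=fitness) for _ in pop]

def proportional(pop, fitness, rng):   # Wright-Fisher-style selection operator
    return rng.choices(pop, weights=[1 + fitness(x) for x in pop], k=len(pop))

def bitflip(pop, rng, rate=0.02):      # shared mutation operator
    return [[b ^ (rng.random() < rate) for b in x] for x in pop]

pop0 = [random.Random(i).choices([0, 1], k=20) for i in range(30)]
for select in (tournament, proportional):
    final = evolve(pop0, onemax, select, bitflip)
    print(select.__name__, "best fitness:", max(map(onemax, final)))
```

Because only the operator objects differ, the two runs sit in the same formal framework, which is exactly what makes results from one field candidates for translation to the other.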
Edwards statistical mechanics for jammed granular matter
NASA Astrophysics Data System (ADS)
Baule, Adrian; Morone, Flaviano; Herrmann, Hans J.; Makse, Hernán A.
2018-01-01
In 1989, Sir Sam Edwards made the visionary proposition to treat jammed granular materials using a volume ensemble of equiprobable jammed states in analogy to thermal equilibrium statistical mechanics, despite their inherent athermal features. Since then, the statistical mechanics approach for jammed matter—one of the very few generalizations of Gibbs-Boltzmann statistical mechanics to out-of-equilibrium matter—has garnered an extraordinary amount of attention by both theorists and experimentalists. Its importance stems from the fact that jammed states of matter are ubiquitous in nature appearing in a broad range of granular and soft materials such as colloids, emulsions, glasses, and biomatter. Indeed, despite being one of the simplest states of matter—primarily governed by the steric interactions between the constitutive particles—a theoretical understanding based on first principles has proved exceedingly challenging. Here a systematic approach to jammed matter based on the Edwards statistical mechanical ensemble is reviewed. The construction of microcanonical and canonical ensembles based on the volume function, which replaces the Hamiltonian in jammed systems, is discussed. The importance of approximation schemes at various levels is emphasized leading to quantitative predictions for ensemble averaged quantities such as packing fractions and contact force distributions. An overview of the phenomenology of jammed states and experiments, simulations, and theoretical models scrutinizing the strong assumptions underlying Edwards approach is given including recent results suggesting the validity of Edwards ergodic hypothesis for jammed states. A theoretical framework for packings whose constitutive particles range from spherical to nonspherical shapes such as dimers, polymers, ellipsoids, spherocylinders or tetrahedra, hard and soft, frictional, frictionless and adhesive, monodisperse, and polydisperse particles in any dimensions is discussed providing insight into a unifying phase diagram for all jammed matter. Furthermore, the connection between the Edwards ensemble of metastable jammed states and metastability in spin glasses is established. This highlights the fact that the packing problem can be understood as a constraint satisfaction problem for excluded volume and force and torque balance leading to a unifying framework between the Edwards ensemble of equiprobable jammed states and out-of-equilibrium spin glasses.
A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China
NASA Astrophysics Data System (ADS)
Xie, Lei; Zhang, Hong; Li, Hhongzhong; Wang, Chao
2015-04-01
The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP) and compact polarimetric (CP) modes. The TerraSAR-X data acquired over the Leizhou Peninsula, South China, are used in our experiments. The study site involves four main crops (rice, banana, sugarcane, and eucalyptus). Through exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for data in the FP and coherent HH/VV modes, and about 91% in CP modes, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart Maximum Likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.
Statistical Mechanics of the Cytoskeleton
NASA Astrophysics Data System (ADS)
Wang, Shenshen
The mechanical integrity of eukaryotic cells along with their capability of dynamic remodeling depends on their cytoskeleton, a structural scaffold made up of a complex and dense network of filamentous proteins spanning the cytoplasm. Active force generation within the cytoskeletal networks by molecular motors is ultimately powered by the consumption of chemical energy and conversion of that energy into mechanical work. The resulting functional movements range from the collective cell migration in epithelial tissues responsible for wound healing to the changes of cell shape that occur during muscle contraction, as well as all the internal structural rearrangements essential for cell division. The role of the cytoskeleton as a dynamic versatile mesoscale "muscle", whose passive and active performance is both highly heterogeneous in space and time and intimately linked to diverse biological functions, allows it to serve as a sensitive indicator for the health and developmental state of the cell. By approaching this natural nonequilibrium many-body system from a variety of perspectives, researchers have made major progress toward understanding the cytoskeleton's unusual mechanical, dynamical and structural properties. Yet a unifying framework capable of capturing both the dynamics of active pattern formation and the emergence of spontaneous collective motion, that allows one to predict the dependence of the model's control parameters on motor properties, is still needed. In the following we construct a microscopic model and provide a theoretical framework to investigate the intricate interplay between local force generation, network architecture and collective motor action. This framework is able to accommodate both regular and heterogeneous pattern formation, as well as arrested coarsening and macroscopic contraction in a unified manner, through the notion of motor-driven effective interactions. Moreover a systematic expansion scheme combined with a variational stability analysis yields a threshold strength of motor kicking noise, below which the motorized system behaves as if it were at an effective equilibrium, but with a nontrivial effective temperature. Above the threshold, however, collective directed motion emerges spontaneously. Computer simulations support the theoretical predictions and highlight the essential role played in large-scale contraction by spatial correlation in motor kicking events.
The Unified Behavior Framework for the Simulation of Autonomous Agents
2015-03-01
Since the 1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. The development of autonomy leaves room for research by utilizing methods like simulation and modeling, which consume less time and fewer monetary resources; this report applies the recently developed Unified Behavior Framework to the simulation of autonomous agents.
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HD_M^I), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HD_M^I values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
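A minimal Monte Carlo sketch of the probabilistic logic (the point of departure and lognormal spreads below are invented, not the paper's worked examples): replacing fixed adjustment factors with uncertainty distributions turns the target human dose HD_M^I into a distribution whose percentiles supply the confidence statement.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Minimal Monte Carlo sketch of the probabilistic idea (hypothetical numbers,
# not the paper's example): each adjustment applied to an animal point of
# departure is a lognormal uncertainty distribution rather than a fixed factor.
pod = 25.0                                           # animal point of departure (mg/kg-d)
interspecies = rng.lognormal(np.log(4.0), 0.4, n)    # animal-to-human scaling
intraspecies = rng.lognormal(np.log(3.0), 0.5, n)    # human variability to the I-th percentile
hd_mi = pod / (interspecies * intraspecies)          # samples of the target human dose HD_M^I

lo, med, hi = np.percentile(hd_mi, [5, 50, 95])
print(f"HD_M^I: median {med:.2f}, 90% CI [{lo:.2f}, {hi:.2f}] mg/kg-d, "
      f"{hi/lo:.1f}-fold span")
```

The width of the resulting interval is the analogue of the 40- to 60-fold ranges reported in the abstract, and it makes the contribution of each uncertainty source explicit instead of burying it in a composite safety factor.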
Unification of small and large time scales for biological evolution: deviations from power law.
Chowdhury, Debashish; Stauffer, Dietrich; Kunwar, Ambarish
2003-02-14
We develop a unified model that describes both "micro" and "macro" evolutions within a single theoretical framework. The ecosystem is described as a dynamic network; the population dynamics at each node of this network describes the "microevolution" over ecological time scales (i.e., birth, ageing, and natural death of individual organisms), while the appearance of new nodes, the slow changes of the links, and the disappearance of existing nodes accounts for the "macroevolution" over geological time scales (i.e., the origination, evolution, and extinction of species). In contrast to several earlier claims in the literature, we observe strong deviations from power law in the regime of long lifetimes.
Cold fission description with constant and varying mass asymmetries
NASA Astrophysics Data System (ADS)
Duarte, S. B.; Rodríguez, O.; Tavares, O. A. P.; Gonçalves, M.; García, F.; Guzmán, F.
1998-05-01
Different descriptions for varying the mass asymmetry in the fragmentation process are used to calculate the cold fission barrier penetrability. The relevance of the appropriate choice for both the description of the prescission phase and inertia coefficient to unify alpha decay, cluster radioactivity, and spontaneous cold fission processes in the same theoretical framework is explicitly shown. We calculate the half-life of all possible partition modes of nuclei of A>200 following the most recent Mass Table by Audi and Wapstra. It is shown that if one uses the description in which the mass asymmetry is maintained constant during the fragmentation process, the experimental half-life values and mass yield of 234U cold fission are satisfactorily reproduced.
High-Contrast Gratings based Spoof Surface Plasmons
NASA Astrophysics Data System (ADS)
Li, Zhuo; Liu, Liangliang; Xu, Bingzheng; Ning, Pingping; Chen, Chen; Xu, Jia; Chen, Xinlei; Gu, Changqing; Qing, Quan
2016-02-01
In this work, we explore the existence of spoof surface plasmons (SSPs) supported by deep-subwavelength high-contrast gratings (HCGs) on a perfect electric conductor plane. The dispersion relation of the HCGs-based SSPs is derived analytically by combining multimode network theory with a rigorous mode matching method; it has nearly the same form as, and can be degenerated into, that of the SSPs arising from deep-subwavelength metallic gratings (MGs). Numerical simulations validate the analytical dispersion relation, and an effective medium approximation is also presented to obtain the same analytical dispersion formula. This work sets up a unified theoretical framework for SSPs and opens up new vistas in surface plasmon optics.
NASA Astrophysics Data System (ADS)
Wang, Xu; Le, Anh-Thu; Zhou, Zhaoyan; Wei, Hui; Lin, C. D.
2017-08-01
We provide a unified theoretical framework for recently emerging experiments that retrieve fixed-in-space molecular information through time-domain rotational coherence spectroscopy. Unlike a previous approach by Makhija et al. (V. Makhija et al., arXiv:1611.06476), our method can be applied to the retrieval of both real-valued (e.g., ionization yield) and complex-valued (e.g., induced dipole moment) molecular response information. It is also a direct retrieval method without using iterations. We also demonstrate that experimental parameters, such as the fluence of the aligning laser pulse and the rotational temperature of the molecular ensemble, can be quite accurately determined using a statistical method.
NASA Technical Reports Server (NTRS)
Banks, H. T.; Silcox, R. J.; Keeling, S. L.; Wang, C.
1989-01-01
A unified treatment of the linear quadratic tracking (LQT) problem, in which a control system's dynamics are modeled by a linear evolution equation with a nonhomogeneous component that is linearly dependent on the control function u, is presented; the treatment proceeds from the theoretical formulation to a numerical approximation framework. Attention is given to two categories of LQT problems in an infinite time interval: the finite energy and the finite average energy. The behavior of the optimal solution for finite time-interval problems as the length of the interval tends to infinity is discussed. Also presented are the formulations and properties of LQT problems in a finite time interval.
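A finite-dimensional sketch of the tracking idea for readers who want to experiment (the two-state system and weights are hypothetical, and the paper itself works in infinite dimensions): track a constant reference by regulating deviations from the corresponding steady state, with the feedback gain obtained from the algebraic Riccati equation.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Finite-dimensional LQT sketch (hypothetical system): regulate the deviation
# from a steady state that realizes a constant reference output.
A = np.array([[0.0, 1.0], [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
Q, R = np.diag([10.0, 1.0]), np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)               # optimal state-feedback gain

r = 1.0                                       # constant reference for y = Cx
# steady state (x_ss, u_ss): A x_ss + B u_ss = 0 and C x_ss = r
M = np.block([[A, B], [C, np.zeros((1, 1))]])
x_ss, u_ss = np.split(np.linalg.solve(M, np.r_[0.0, 0.0, r]), [2])

x, dt = np.zeros(2), 1e-3                     # forward-Euler simulation
for _ in range(8000):
    u = u_ss - K @ (x - x_ss)                 # feedforward plus LQR feedback
    x = x + dt * (A @ x + B @ u)
print(f"y(8 s) = {(C @ x).item():.4f}  (reference {r})")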
Generation of Microbubbles with Applications to Industry and Medicine
NASA Astrophysics Data System (ADS)
Rodríguez-Rodríguez, Javier; Sevilla, Alejandro; Martínez-Bazán, Carlos; Gordillo, José Manuel
2015-01-01
We provide a comprehensive and systematic description of the diverse microbubble generation methods recently developed to satisfy emerging technological, pharmaceutical, and medical demands. We first introduce a theoretical framework unifying the physics of bubble formation in the wide variety of existing types of generators. These devices are then classified according to the way the bubbling process is controlled: outer liquid flows (e.g., coflows, cross flows, and flow-focusing flows), acoustic forcing, and electric fields. We also address modern techniques developed to produce bubbles coated with surfactants and liquid shells. The stringent requirements to precisely control the bubbling frequency, the bubble size, and the properties of the coating make microfluidics the natural choice to implement such techniques.
LHC-scale left-right symmetry and unification
NASA Astrophysics Data System (ADS)
Arbeláez, Carolina; Romão, Jorge C.; Hirsch, Martin; Malinský, Michal
2014-02-01
We construct a comprehensive list of nonsupersymmetric standard model extensions with a low-scale left-right (LR)-symmetric intermediate stage that may be obtained as simple low-energy effective theories within a class of renormalizable SO(10) grand unified theories. Unlike the traditional "minimal" LR models, many of our example settings support a perfect gauge coupling unification even if the LR scale is in the LHC domain, at a price of only (a few copies of) one or two types of extra fields pulled down to the TeV-scale ballpark. We discuss the main aspects of a potentially realistic model building conforming to the basic constraints from the quark and lepton sector flavor structure, proton decay limits, etc. We pay special attention to the theoretical uncertainties related to the limited information about the underlying unified framework in the bottom-up approach, in particular, to their role in the possible extraction of the LR-breaking scale. We observe a general tendency for the models without new colored states in the TeV domain to be on the verge of incompatibility with the proton stability constraints.
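A standard one-loop running check of the kind such analyses rest on. The SM inputs below are conventional; the beta coefficients assumed above the LR threshold are invented solely to show how a few extra TeV-scale multiplets can bend the couplings back toward a common value.

```python
import numpy as np

# One-loop gauge coupling running sketch. SM inputs are standard; the shifted
# coefficients above the intermediate scale are hypothetical, chosen only to
# illustrate how extra TeV-scale fields can restore unification.
MZ = 91.19                                   # GeV
alpha_inv_MZ = np.array([59.0, 29.6, 8.47])  # GUT-normalized U(1), SU(2), SU(3)
b_sm = np.array([41/10, -19/6, -7])          # one-loop SM beta coefficients
b_above = np.array([4.0, -1.5, -5.5])        # hypothetical, with extra multiplets

def alpha_inv(mu, mu_lr=5e3):
    lo = min(mu, mu_lr)
    a = alpha_inv_MZ - b_sm / (2 * np.pi) * np.log(lo / MZ)
    if mu > mu_lr:                           # threshold: swap in new coefficients
        a = a - b_above / (2 * np.pi) * np.log(mu / mu_lr)
    return a

for mu in (1e3, 1e10, 1e16):
    a1, a2, a3 = alpha_inv(mu)
    print(f"mu = {mu:.0e} GeV: 1/alpha = {a1:.1f}, {a2:.1f}, {a3:.1f}")
```

With these illustrative coefficients the three inverse couplings meet near 38 at about 10^16 GeV, whereas the pure SM coefficients famously miss; in the paper this game is played with the actual field content of each SO(10) breaking chain.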
Chang, I-Chiu; Hsu, Hui-Mei
2012-01-01
Barriers to reporting incident events using an online information system (IS) may be different from those of a paper-based reporting system. The nationwide online Patient-Safety Reporting System (PSRS) involves a value judgment behind its use, similar to the Value of Perceived Consequence (VPC), which is seldom discussed in IS applications of other disciplines. This study developed a more adequate research framework by integrating the VPC construct into the well-known Unified Theory of Acceptance and Use of Technology (UTAUT) model as a theoretical base to explore the predictors of medical staff's intention to use online PSRS. The results showed that management support was an important factor influencing medical staff's intention to use PSRS. The effects of factors such as performance expectancy, perceived positive consequence, and perceived negative consequence on medical staff's intention to use PSRS were moderated by gender, age, experience, and occupation. The results proved that the modified UTAUT model is significant and useful in predicting medical staff's intention to use the nationwide online PSRS.
NASA Astrophysics Data System (ADS)
Fang, Jin-Qing; Li, Yong
2010-02-01
A large unified hybrid network model with variable speed growth (LUHNM-VSG) is proposed as the third model of the unified hybrid network theoretical framework (UHNTF). A hybrid growth ratio vg of the deterministic linking number to the random linking number and a variable speed growth index α are introduced. The main effects of vg and α on the topological transition features of the LUHNM-VSG are revealed. For comparison with the other models, we construct a network complexity pyramid with seven levels, in which simplicity-universality increases from the bottom (level 1) to the top (level 7) while complexity-diversity decreases. The transition relations between levels depend on the matching of the four hybrid ratios (dr, fd, gr, vg), so most network models can be investigated in a unified way via these four ratios. The LUHNM-VSG, as level 1 of the pyramid, comes closest to describing real-world networks and has potential applications.
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and offer experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
Introduction to Theoretical Modelling
NASA Astrophysics Data System (ADS)
Davis, Matthew J.; Gardiner, Simon A.; Hanna, Thomas M.; Nygaard, Nicolai; Proukakis, Nick P.; Szymańska, Marzena H.
2013-02-01
We briefly overview commonly encountered theoretical notions arising in the modelling of quantum gases, intended to provide a unified background to the `language' and diverse theoretical models presented elsewhere in this book, and aimed particularly at researchers from outside the quantum gases community.
A Unified Framework for the Infection Dynamics of Zoonotic Spillover and Spread.
Lo Iacono, Giovanni; Cunningham, Andrew A; Fichet-Calvet, Elisabeth; Garry, Robert F; Grant, Donald S; Leach, Melissa; Moses, Lina M; Nichols, Gordon; Schieffelin, John S; Shaffer, Jeffrey G; Webb, Colleen T; Wood, James L N
2016-09-01
A considerable amount of disease is transmitted from animals to humans and many of these zoonoses are neglected tropical diseases. As outbreaks of SARS, avian influenza and Ebola have demonstrated, however, zoonotic diseases are serious threats to global public health and are not just problems confined to remote regions. There are two fundamental, and poorly studied, stages of zoonotic disease emergence: 'spillover', i.e. transmission of pathogens from animals to humans, and 'stuttering transmission', i.e. when limited human-to-human infections occur, leading to self-limiting chains of transmission. We developed a transparent, theoretical framework, based on a generalization of Poisson processes with memory of past human infections, that unifies these stages. Once we have quantified pathogen dynamics in the reservoir, with some knowledge of the mechanism of contact, the approach provides a tool to estimate the likelihood of spillover events. Comparisons with independent agent-based models demonstrate the ability of the framework to correctly estimate the relative contributions of human-to-human vs animal transmission. As an illustrative example, we applied our model to Lassa fever, a rodent-borne, viral haemorrhagic disease common in West Africa, for which data on human outbreaks were available. The approach developed here is general and applicable to a range of zoonoses. This kind of methodology is of crucial importance for the scientific, medical and public health communities working at the interface between animal and human diseases to assess the risk associated with the disease and to plan intervention and appropriate control measures. The Lassa case study revealed important knowledge gaps, and opportunities, arising from limited knowledge of the temporal patterns in reporting, abundance of, and infection prevalence in, the host reservoir.
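A minimal simulation of the two stages as described, with invented parameters: spillovers arrive as a Poisson process whose rate tracks reservoir prevalence, and each spillover seeds a self-limiting "stuttering" chain modeled as a subcritical branching process (R0 < 1).

```python
import numpy as np

rng = np.random.default_rng(42)

# Minimal sketch of the two stages (hypothetical parameters): spillovers follow
# reservoir prevalence; each seeds a subcritical human-to-human chain.
def stuttering_chain(r0=0.4):
    cases, current = 1, 1
    while current and cases < 10_000:
        current = int(rng.poisson(r0, size=current).sum())  # next generation
        cases += current
    return cases

days = 365
prevalence = 0.05 * (1 + np.sin(2 * np.pi * np.arange(days) / 365))  # seasonal reservoir
spillovers = rng.poisson(2.0 * prevalence)        # daily spillover counts (contact rate assumed)
chains = [stuttering_chain() for _ in range(int(spillovers.sum()))]

total = sum(chains)
frac_secondary = 1 - len(chains) / total if total else 0.0
print(f"spillovers: {len(chains)}, total human cases: {total}, "
      f"fraction from human-to-human transmission: {frac_secondary:.2f}")
```

Splitting observed cases into primary (spillover) and secondary (stuttering) contributions is exactly the attribution problem the paper's framework, with its memory of past human infections, is designed to solve from outbreak data alone.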
A Unified Framework for the Infection Dynamics of Zoonotic Spillover and Spread
Cunningham, Andrew A.; Fichet-Calvet, Elisabeth; Garry, Robert F.; Grant, Donald S.; Leach, Melissa; Moses, Lina M.; Nichols, Gordon; Schieffelin, John S.; Shaffer, Jeffrey G.; Webb, Colleen T.; Wood, James L. N.
2016-01-01
A considerable amount of disease is transmitted from animals to humans and many of these zoonoses are neglected tropical diseases. As outbreaks of SARS, avian influenza and Ebola have demonstrated, however, zoonotic diseases are serious threats to global public health and are not just problems confined to remote regions. There are two fundamental, and poorly studied, stages of zoonotic disease emergence: ‘spillover’, i.e. transmission of pathogens from animals to humans, and ‘stuttering transmission’, i.e. when limited human-to-human infections occur, leading to self-limiting chains of transmission. We developed a transparent, theoretical framework, based on a generalization of Poisson processes with memory of past human infections, that unifies these stages. Once we have quantified pathogen dynamics in the reservoir, with some knowledge of the mechanism of contact, the approach provides a tool to estimate the likelihood of spillover events. Comparisons with independent agent-based models demonstrates the ability of the framework to correctly estimate the relative contributions of human-to-human vs animal transmission. As an illustrative example, we applied our model to Lassa fever, a rodent-borne, viral haemorrhagic disease common in West Africa, for which data on human outbreaks were available. The approach developed here is general and applicable to a range of zoonoses. This kind of methodology is of crucial importance for the scientific, medical and public health communities working at the interface between animal and human diseases to assess the risk associated with the disease and to plan intervention and appropriate control measures. The Lassa case study revealed important knowledge gaps, and opportunities, arising from limited knowledge of the temporal patterns in reporting, abundance of and infection prevalence in, the host reservoir. PMID:27588425
An algorithm for hyperspectral remote sensing of aerosols: 1. Development of theoretical framework
NASA Astrophysics Data System (ADS)
Hou, Weizhen; Wang, Jun; Xu, Xiaoguang; Reid, Jeffrey S.; Han, Dong
2016-07-01
This paper describes the first part of a series of investigations to develop algorithms for simultaneous retrieval of aerosol parameters and surface reflectance from a newly developed hyperspectral instrument, the GEOstationary Trace gas and Aerosol Sensor Optimization (GEO-TASO), by taking full advantage of available hyperspectral measurement information in the visible bands. We describe the theoretical framework of an inversion algorithm for the hyperspectral remote sensing of the aerosol optical properties, in which major principal components (PCs) for surface reflectance are assumed known, and the spectrally dependent aerosol refractive indices are assumed to follow a power-law approximation with four unknown parameters (two for the real and two for the imaginary part of the refractive index). New capabilities for computing the Jacobians of four Stokes parameters of reflected solar radiation at the top of the atmosphere with respect to these unknown aerosol parameters and the weighting coefficients for each PC of surface reflectance are added into the UNified Linearized Vector Radiative Transfer Model (UNL-VRTM), which in turn facilitates the optimization in the inversion process. Theoretical derivations of the formulas for these new capabilities are provided, and the analytical solutions of Jacobians are validated against the finite-difference calculations with relative error less than 0.2%. Finally, a self-consistency check of the inversion algorithm is conducted for the idealized green-vegetation and rangeland surfaces that were spectrally characterized by the U.S. Geological Survey digital spectral library. It shows that the first six PCs can yield the reconstruction of spectral surface reflectance with errors less than 1%. Assuming that aerosol properties can be accurately characterized, the inversion yields a retrieval of hyperspectral surface reflectance with an uncertainty of 2% (and root-mean-square error of less than 0.003), which suggests self-consistency in the inversion framework. The next step of using this framework to study the aerosol information content in GEO-TASO measurements is also discussed.
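The reported validation of analytical Jacobians against finite differences is a generally useful pattern. The sketch below illustrates it on a stand-in two-parameter forward model (not UNL-VRTM); the function and step size are illustrative only.

    import numpy as np

    # Generic check of an analytical Jacobian against central finite
    # differences, mirroring the validation reported in the paper
    # (relative error below 0.2%). The forward model is a stand-in.
    def forward(x):
        return np.array([x[0]**2 * x[1], np.sin(x[1]) + x[0]])

    def analytic_jacobian(x):
        return np.array([[2 * x[0] * x[1], x[0]**2],
                         [1.0, np.cos(x[1])]])

    x0, eps = np.array([1.3, 0.7]), 1e-6
    J_fd = np.empty((2, 2))
    for j in range(2):
        dx = np.zeros(2)
        dx[j] = eps
        J_fd[:, j] = (forward(x0 + dx) - forward(x0 - dx)) / (2 * eps)

    rel_err = np.abs(J_fd - analytic_jacobian(x0)) / np.abs(analytic_jacobian(x0))
    print(rel_err.max())   # far below the 0.2% threshold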
Non-commutative Chern numbers for generic aperiodic discrete systems
NASA Astrophysics Data System (ADS)
Bourne, Chris; Prodan, Emil
2018-06-01
The search for strong topological phases in generic aperiodic materials and meta-materials is now vigorously pursued by the condensed matter physics community. In this work, we first introduce the concept of patterned resonators as a unifying theoretical framework for topological electronic, photonic, phononic, and other (aperiodic) systems. We then discuss, in physical terms, the philosophy behind an operator theoretic analysis used to systematize such systems. A model calculation of the Hall conductance of a 2-dimensional amorphous lattice is given, where we present numerical evidence of its quantization in the mobility gap regime. Motivated by such facts, we then present the main result of our work, which is the extension of the Chern number formulas to Hamiltonians associated with lattices without a canonical labeling of the sites, together with index theorems that assure the quantization and stability of these Chern numbers in the mobility gap regime. Our results cover a broad range of applications, in particular, those involving quasi-crystalline, amorphous as well as synthetic (i.e. algorithmically generated) lattices.
Sampled-data consensus in switching networks of integrators based on edge events
NASA Astrophysics Data System (ADS)
Xiao, Feng; Meng, Xiangyu; Chen, Tongwen
2015-02-01
This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
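To make the edge-event mechanism concrete, here is a minimal Python sketch for two single integrators sharing one edge. The drift-threshold trigger used here is a simplification of the paper's detecting rules, and delta and dt are assumed design parameters.

    import numpy as np

    # Two single-integrator agents linked by one edge; the edge triggers a
    # fresh sample of the relative state only when the stored value drifts
    # by more than delta (a simplified event-detecting rule).
    dt, delta = 0.01, 0.05
    x = np.array([1.0, -1.0])
    sampled = x[0] - x[1]                  # value last exchanged on the edge
    events = 0
    for _ in range(2000):
        if abs((x[0] - x[1]) - sampled) > delta:
            sampled = x[0] - x[1]          # edge event: re-sample and update
            events += 1
        x += dt * np.array([-sampled, sampled])   # consensus protocol
    print(events, x)                       # few events, states near agreement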
[Mediate evaluation of replicating a Training Program in Nonverbal Communication in Gerontology].
Schimidt, Teresa Cristina Gioia; Duarte, Yeda Aparecida de Oliveira; Silva, Maria Julia Paes da
2015-04-01
The aim was to replicate a training program in nonverbal communication based on the theoretical framework of interpersonal communication and nonverbal coding, valuing aspects of aging from the perspective of active aging, and to check its current relevance through the content assimilation index 90 days (mediate evaluation) after its application. A descriptive and exploratory field study was conducted in three hospitals under the direct administration of the state of São Paulo that cater exclusively to Unified Health System (SUS) patients. The training lasted 12 hours, divided into three meetings, and was applied to 102 health professionals. The mediate content assimilation index was very satisfactory or satisfactory for 82.9% of participants. The program replication proved to be relevant and up to date in the setting of hospital services, remaining efficient for healthcare professionals.
Beauty photoproduction at HERA: k_T-factorization versus experimental data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipatov, A.V.; Zotov, N.P.
We present calculations of beauty photoproduction at the HERA collider in the framework of the k_T-factorization approach. Both direct and resolved photon contributions are taken into account. The unintegrated gluon densities in a proton and in a photon are obtained from the full CCFM evolution equation, from unified BFKL-DGLAP evolution equations, as well as from the Kimber-Martin-Ryskin prescription. We investigate different production rates (both inclusive and associated with hadronic jets) and compare our theoretical predictions with the recent experimental data taken by the H1 and ZEUS collaborations. Special attention is put on the x_gamma^obs variable, which is sensitive to the relative contributions to the beauty production cross section.
New observations, new theoretical results and controversies regarding Pc 3-5 waves
NASA Astrophysics Data System (ADS)
Takahashi, K.
Observations and theories of medium- to long-period (Pc 3-5) magnetic pulsations excited by magnetospheric particles are described. Satellite observations indicate that most pulsations can be classified into two groups according to their magnetic field polarization. One group has a transverse magnetic perturbation and the other a strongly compressional perturbation. Despite this difference in polarization, they share common characteristics, including large azimuthal wave number, westward propagation, and antisymmetric field-aligned structure. Recent theories describe these observations in a unified framework. It has been pointed out that trapped energetic ions play an important role in determining the instability threshold and the mode structure of the pulsations. Observations and theories of energetic particle response to the excited pulsations are also described.
Answering Schrödinger's question: A free-energy formulation
NASA Astrophysics Data System (ADS)
Ramstead, Maxwell James Désormeau; Badcock, Paul Benjamin; Friston, Karl John
2018-03-01
The free-energy principle (FEP) is a formal model of neuronal processes that is widely recognised in neuroscience as a unifying theory of the brain and biobehaviour. More recently, however, it has been extended beyond the brain to explain the dynamics of living systems, and their unique capacity to avoid decay. The aim of this review is to synthesise these advances with a meta-theoretical ontology of biological systems called variational neuroethology, which integrates the FEP with Tinbergen's four research questions to explain biological systems across spatial and temporal scales. We exemplify this framework by applying it to Homo sapiens, before translating variational neuroethology into a systematic research heuristic that supplies the biological, cognitive, and social sciences with a computationally tractable guide to discovery.
NASA Astrophysics Data System (ADS)
Bialas, A.; Peschanski, R.; Royon, Ch.
1998-06-01
It is argued that the QCD dipole picture allows us to build a unified theoretical description, based on Balitskii-Fadin-Kuraev-Lipatov dynamics, of the total and diffractive nucleon structure functions. This description is in qualitative agreement with the present collection of data obtained by the H1 Collaboration. More precise theoretical estimates, in particular the determination of the normalizations and proton transverse momentum behavior of the diffractive components, are shown to be required in order to reach definite conclusions.
Recent Theoretical Studies On Excitation and Recombination
NASA Technical Reports Server (NTRS)
Pradhan, Anil K.
2000-01-01
New advances in the theoretical treatment of atomic processes in plasmas are described. These enable not only an integrated, unified, and self-consistent treatment of important radiative and collisional processes, but also large-scale computation of atomic data with high accuracy. An extension of the R-matrix work, from excitation and photoionization to electron-ion recombination, includes a unified method that subsumes both the radiative and the dielectronic recombination processes in an ab initio manner. The extensive collisional calculations for iron and iron-peak elements under the Iron Project are also discussed.
A Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.
Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C
2016-01-01
Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model, and also produces a multiscale representation which allows one to progressively simplify, or regularize, all skeleton types. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, then to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
The Aeroacoustics of Turbulent Flows
NASA Technical Reports Server (NTRS)
Goldstein, M. E.
2008-01-01
Aerodynamic noise prediction has been an important and challenging research area since James Lighthill first introduced his Acoustic Analogy Approach over fifty years ago. This talk attempts to provide a unified framework for the subsequent theoretical developments in this field. It assumes that there is no single approach that is optimal in all situations and uses the framework as a basis for discussing the strengths and weaknesses of the various approaches to this topic. But the emphasis here will be on the important problem of predicting the noise from high-speed air jets. Specific results will be presented for round jets in the 0.5 to 1.4 Mach number range and compared with experimental data taken on the Glenn SHAR rig. It is demonstrated that nonparallel mean flow effects play an important role in predicting the noise at the supersonic Mach numbers. The results explain the failure of previous attempts based on the parallel flow Lilley model (which has served as the foundation for most jet noise analyses during the past two decades).
Prior expectations facilitate metacognition for perceptual decision.
Sherman, M T; Seth, A K; Barrett, A B; Kanai, R
2015-09-01
The influential framework of 'predictive processing' suggests that prior probabilistic expectations influence, or even constitute, perceptual contents. This notion is evidenced by the facilitation of low-level perceptual processing by expectations. However, whether expectations can facilitate high-level components of perception remains unclear. We addressed this question by considering the influence of expectations on perceptual metacognition. To isolate the effects of expectation from those of attention we used a novel factorial design: expectation was manipulated by changing the probability that a Gabor target would be presented; attention was manipulated by instructing participants to perform or ignore a concurrent visual search task. We found that, independently of attention, metacognition improved when yes/no responses were congruent with expectations of target presence/absence. Results were modeled under a novel Bayesian signal detection theoretic framework which integrates bottom-up signal propagation with top-down influences, to provide a unified description of the mechanisms underlying perceptual decision and metacognition. Copyright © 2015 Elsevier Inc. All rights reserved.
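For intuition, classical signal detection theory already predicts how an optimal observer's yes/no criterion should shift with the prior probability of target presence. The sketch below computes that shift; it is our illustration, not the authors' Bayesian metacognition model, and d_prime and the priors are assumed values.

    import numpy as np
    from scipy.stats import norm

    # Bayes-optimal criterion in equal-variance signal detection theory:
    # with prior target probability p, the criterion sits at ln((1-p)/p)/d'.
    d_prime = 1.5
    for p in (0.25, 0.5, 0.75):
        c = np.log((1 - p) / p) / d_prime          # optimal criterion
        hit = 1 - norm.cdf(c, loc=+d_prime / 2)    # P("yes" | target present)
        fa = 1 - norm.cdf(c, loc=-d_prime / 2)     # P("yes" | target absent)
        print(f"p={p:.2f}  criterion={c:+.2f}  hit={hit:.2f}  fa={fa:.2f}")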
Pattern formation in mass conserving reaction-diffusion systems
NASA Astrophysics Data System (ADS)
Brauns, Fridtjof; Halatek, Jacob; Frey, Erwin
We present a rigorous theoretical framework able to generalize and unify pattern formation for quantitative mass conserving reaction-diffusion models. Mass redistribution controls chemical equilibria locally. Separation of diffusive mass redistribution on the level of conserved species provides a general mathematical procedure to decompose complex reaction-diffusion systems into effectively independent functional units, and to reveal the general underlying bifurcation scenarios. We apply this framework to Min protein pattern formation and identify the mechanistic roles of both involved protein species. MinD generates polarity through phase separation, whereas MinE takes the role of a control variable regulating the existence of MinD phases. Hence, polarization and not oscillations is the generic core dynamics of Min proteins in vivo. This establishes an intrinsic mechanistic link between the Min system and a broad class of intracellular pattern forming systems based on bistability and phase separation (wave-pinning). Oscillations are facilitated by MinE redistribution and can be understood mechanistically as relaxation oscillations of the polarization direction.
A Unified Framework for Association Analysis with Multiple Related Phenotypes
Stephens, Matthew
2013-01-01
We consider the problem of assessing associations between multiple related outcome variables and a single explanatory variable of interest. This problem arises in many settings, including genetic association studies, where the explanatory variable is genotype at a genetic variant. We outline a framework for conducting this type of analysis, based on Bayesian model comparison and model averaging for multivariate regressions. This framework unifies several common approaches to this problem, and includes both standard univariate and standard multivariate association tests as special cases. The framework also unifies the problems of testing for associations and explaining associations – that is, identifying which outcome variables are associated with genotype. This provides an alternative to the usual, but conceptually unsatisfying, approach of resorting to univariate tests when explaining and interpreting significant multivariate findings. The method is computationally tractable genome-wide for modest numbers of phenotypes (e.g. 5–10), and can be applied to summary data, without access to raw genotype and phenotype data. We illustrate the methods both on simulated examples and on a genome-wide association study of blood lipid traits, where we identify 18 potentially novel genetic associations that were not identified by univariate analyses of the same data. PMID:23861737
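The flavour of the comparison can be caricatured with a BIC-approximated Bayes factor between a null and an associated multivariate regression. The sketch below uses simulated data and is far cruder than the paper's model-averaging framework; all parameters are invented.

    import numpy as np

    # BIC-approximated Bayes factor: null vs. associated multivariate
    # regression of two related phenotypes on genotype (simulated data).
    rng = np.random.default_rng(1)
    n = 500
    g = rng.binomial(2, 0.3, n).astype(float)            # genotype (0/1/2)
    shared = 0.5 * g + rng.normal(size=n)                # shared genetic signal
    Y = np.column_stack([shared + rng.normal(size=n),
                         shared + rng.normal(size=n)])   # two related phenotypes

    def max_loglik(Y, X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        S = (Y - X @ beta).T @ (Y - X @ beta) / len(Y)   # MLE residual covariance
        return -0.5 * len(Y) * (2 * np.log(2 * np.pi) + np.log(np.linalg.det(S)) + 2)

    X0 = np.ones((n, 1))                                 # null: intercepts only
    X1 = np.column_stack([np.ones(n), g])                # alternative: + genotype
    # Two extra regression parameters give a BIC penalty of 2*log(n).
    log10_bf = (max_loglik(Y, X1) - max_loglik(Y, X0) - np.log(n)) / np.log(10)
    print(f"approximate log10 Bayes factor: {log10_bf:.1f}")  # > 0 favours association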
A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.
Chiu, Weihsueh A; Slob, Wout
2015-12-01
When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HD_M^I), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HD_M^I values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
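The probabilistic spirit of the approach can be sketched by replacing fixed assessment factors with lognormal distributions and reading a confidence interval off the resulting distribution. Every number below is invented for illustration; the authors' derivation is more structured.

    import numpy as np

    # Hedged sketch: propagate lognormal uncertainty/variability factors
    # to a distribution for a target human dose (all numbers invented).
    rng = np.random.default_rng(2)
    n = 100_000
    pod = 10.0                                          # point of departure, mg/kg-d
    interspecies = rng.lognormal(np.log(3.0), 0.4, n)   # animal-to-human factor
    intraspecies = rng.lognormal(np.log(3.0), 0.5, n)   # human variability factor
    hd = pod / (interspecies * intraspecies)
    lo, med, hi = np.percentile(hd, [5, 50, 95])
    print(f"target human dose: {med:.2f} (90% CI {lo:.2f}-{hi:.2f}) mg/kg-d")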
Anatomy of the Higgs fits: A first guide to statistical treatments of the theoretical uncertainties
NASA Astrophysics Data System (ADS)
Fichet, Sylvain; Moreau, Grégory
2016-04-01
The studies of the Higgs boson couplings based on the recent and upcoming LHC data open up a new window on physics beyond the Standard Model. In this paper, we propose a statistical guide to the consistent treatment of the theoretical uncertainties entering the Higgs rate fits. Both the Bayesian and frequentist approaches are systematically analysed in a unified formalism. We present analytical expressions for the marginal likelihoods, useful to implement simultaneously the experimental and theoretical uncertainties. We review the various origins of the theoretical errors (QCD, EFT, PDF, production mode contamination…). All these individual uncertainties are thoroughly combined with the help of moment-based considerations. The theoretical correlations among Higgs detection channels appear to affect the location and size of the best-fit regions in the space of Higgs couplings. We discuss the recurrent question of the shape of the prior distributions for the individual theoretical errors and find that a nearly Gaussian prior arises from the error combinations. We also develop the bias approach, which is an alternative to marginalisation providing more conservative results. The statistical framework to apply the bias principle is introduced and two realisations of the bias are proposed. Finally, depending on the statistical treatment, the Standard Model prediction for the Higgs signal strengths is found to lie within either the 68% or 95% confidence level region obtained from the latest analyses of the 7 and 8 TeV LHC datasets.
Are There Unifying Trends in the Psychologies of 1990?
ERIC Educational Resources Information Center
Anastasi, Anne
The complexity and rapid expansion of the entire field of psychology make it appropriate to speak of "psychologies" when acknowledging the need for specialization of training and expertise. Nevertheless, unifying trends (UTs) exist in psychology, even though there can be no single set of theoretical principles to account for all empirical…
The Unified Core: A "Major" Learning Community Model in Action
ERIC Educational Resources Information Center
Powell, Gwynn M.; Johnson, Corey W.; James, J. Joy; Dunlap, Rudy
2011-01-01
The Unified Core is an innovative approach to higher education that blends content through linked courses within a major to create a community of learners. This article offers the theoretical background for the approach, describes the implementation, and offers suggestions to educators who would like to design their own version of this innovative…
Unifying theory for terrestrial research infrastructures
NASA Astrophysics Data System (ADS)
Mirtl, Michael
2016-04-01
The presentation will elaborate on basic steps needed for building a common theoretical base between Research Infrastructures focusing on terrestrial ecosystems. This theoretical base is needed for developing better cooperation and integration in the near future. An overview of different theories will be given and ways to a unifying approach explored. In the second step, more practical implications of a theory-guided integration will be developed alongside the following guiding questions: • How do the existing and planned European environmental RIs map onto a possible unifying theory on terrestrial ecosystems (covered structures and functions, scale; overlaps and gaps)? • Can a unifying theory improve the consistent definition of RIs' scientific scope and focal science questions? • How could a division of tasks between RIs be organized in order to minimize parallel efforts? • Where concretely do existing and planned European environmental RIs need to interact to respond to overarching questions (top down component)? • What practical fora and mechanisms (across RIs) would be needed to bridge the gap between PI driven (bottom up) efforts and the centralistic RI design and operations?
NASA Astrophysics Data System (ADS)
St-Onge, Guillaume; Young, Jean-Gabriel; Laurence, Edward; Murphy, Charles; Dubé, Louis J.
2018-02-01
We present a degree-based theoretical framework to study the susceptible-infected-susceptible (SIS) dynamics on time-varying (rewired) configuration model networks. Using this framework on a given degree distribution, we provide a detailed analysis of the stationary state using the rewiring rate to explore the whole range of the time variation of the structure relative to that of the SIS process. This analysis is suitable for the characterization of the phase transition and leads to three main contributions: (1) We obtain a self-consistent expression for the absorbing-state threshold, able to capture both collective and hub activation. (2) We recover the predictions of a number of existing approaches as limiting cases of our analysis, providing thereby a unifying point of view for the SIS dynamics on random networks. (3) We obtain bounds for the critical exponents of a number of quantities in the stationary state. This allows us to reinterpret the concept of hub-dominated phase transition. Within our framework, it appears as a heterogeneous critical phenomenon: observables for different degree classes have a different scaling with the infection rate. This phenomenon is followed by the successive activation of the degree classes beyond the epidemic threshold.
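In the annealed limit of infinitely fast rewiring, one of the limiting cases the framework recovers, the stationary state follows from classical heterogeneous mean-field equations. The sketch below integrates those equations for an assumed truncated power-law degree distribution; it is our illustration, not the paper's self-consistent analysis.

    import numpy as np

    # Heterogeneous mean-field SIS in the annealed (fast-rewiring) limit;
    # degree distribution and rates are assumed values.
    rng = np.random.default_rng(3)
    k = rng.zipf(2.5, 10_000)
    k = k[k <= 100]                               # truncate the tail
    deg, cnt = np.unique(k, return_counts=True)
    pk = cnt / cnt.sum()                          # empirical degree distribution
    mean_k = (deg * pk).sum()

    lam_c = mean_k / (deg**2 * pk).sum()          # annealed epidemic threshold
    lam, mu, dt = 2 * lam_c, 1.0, 0.01            # infection above threshold
    rho = np.full(len(deg), 0.01)                 # infected fraction per degree class
    for _ in range(20_000):
        theta = (deg * pk * rho).sum() / mean_k   # prob. a stub reaches an infected node
        rho += dt * (lam * deg * (1 - rho) * theta - mu * rho)
    print(lam_c, (pk * rho).sum())                # threshold and stationary prevalence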
In Search of a Unified Model of Language Contact
ERIC Educational Resources Information Center
Winford, Donald
2013-01-01
Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…
An empowerment framework for nursing leadership development: supporting evidence.
Macphee, Maura; Skelton-Green, Judith; Bouthillette, France; Suryaprakash, Nitya
2012-01-01
This article is a report on a descriptive study of nurse leaders' perspectives on the outcomes of a formal leadership programme. Effective nurse leaders are necessary to address complex issues associated with healthcare systems reforms. Little is known about the types of leadership development programmes that most effectively prepare nurse leaders for healthcare challenges. When nurse leaders use structural and psychological empowerment strategies, the results are safer work environments and better nurse outcomes. The leadership development programme associated with this study is based on a unifying theoretical empowerment framework to empower nurse leaders and enable them to empower others. Twenty-seven front-line and mid-level nurse leaders with variable years of experience were interviewed for 1 year after participating in a formal leadership development programme. Data were gathered in 2008-2009 from four programme cohorts. Four researchers independently developed code categories and themes using qualitative content analysis. Evidence of leadership development programme empowerment included nurse leader reports of increased self-confidence with respect to carrying out their roles and responsibilities; positive changes in their leadership styles; and perceptions of staff recognition of positive stylistic changes. Regardless of years of experience, mid-level leaders had a broader appreciation of practice environment issues than front-line leaders. Time for reflection was valuable to all participants, and front-line leaders, in particular, appreciated the time to discuss nurse-specific issues with their colleagues. This study provides evidence that a theoretical empowerment framework and strategies can empower nurse leaders, potentially resulting in staff empowerment. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
BIRCH: A user-oriented, locally-customizable, bioinformatics system
Fristensky, Brian
2007-01-01
Background Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351
A Theory of Utility Conditionals: Paralogical Reasoning from Decision-Theoretic Leakage
ERIC Educational Resources Information Center
Bonnefon, Jean-Francois
2009-01-01
Many "if p, then q" conditionals have decision-theoretic features, such as antecedents or consequents that relate to the utility functions of various agents. These decision-theoretic features leak into reasoning processes, resulting in various paralogical conclusions. The theory of utility conditionals offers a unified account of the various forms…
Regionalization and political dynamics of Brazilian health federalism.
Dourado, Daniel de Araujo; Elias, Paulo Eduardo Mangeon
2011-02-01
The implications of the Brazilian federal structure for the regionalization of health actions and services in the National Unified Health System (SUS) were analyzed, considering that regional health planning in Brazil takes place within the context of intergovernmental relations as an expression of cooperative federalism in health. The analysis was based on a historical approach to Brazilian health federalism, recognizing two development periods: decentralization and regionalization. Regional health planning of SUS was explored in light of the theoretical framework of federalism. It is concluded that relative centralization of the process is needed in intergovernmental committees to realize federal coordination, and that it is essential to consider formalizing opportunities for dissent, both in regional management boards and in the intergovernmental committees, so that consensus decision-making can be accomplished in healthcare regionalization.
The visual system’s internal model of the world
Lee, Tai Sing
2015-01-01
The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological work has illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy to encode an internal model of the world, and that perception is realized by statistical inference based on such an internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework for relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294
False Discovery Control in Large-Scale Spatial Multiple Testing
Sun, Wenguang; Reich, Brian J.; Cai, T. Tony; Guindani, Michele; Schwartzman, Armin
2014-01-01
Summary This article develops a unified theoretical and computational framework for false discovery control in multiple testing of spatial signals. We consider both point-wise and cluster-wise spatial analyses, and derive oracle procedures which optimally control the false discovery rate, false discovery exceedance and false cluster rate, respectively. A data-driven finite approximation strategy is developed to mimic the oracle procedures on a continuous spatial domain. Our multiple testing procedures are asymptotically valid and can be effectively implemented using Bayesian computational algorithms for analysis of large spatial data sets. Numerical results show that the proposed procedures lead to more accurate error control and better power performance than conventional methods. We demonstrate our methods for analyzing the time trends in tropospheric ozone in the eastern US. PMID:25642138
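For orientation, the baseline notion of false discovery rate control that the oracle procedures generalize is the Benjamini-Hochberg step-up rule, sketched here on made-up p-values.

    import numpy as np

    # Benjamini-Hochberg step-up rule: the classical FDR-controlling
    # baseline that the paper extends to spatial, cluster-wise settings.
    def benjamini_hochberg(pvals, q=0.1):
        p = np.asarray(pvals)
        order = np.argsort(p)
        passed = p[order] <= q * np.arange(1, p.size + 1) / p.size
        k = passed.nonzero()[0].max() + 1 if passed.any() else 0
        reject = np.zeros(p.size, dtype=bool)
        reject[order[:k]] = True
        return reject

    rng = np.random.default_rng(4)
    pv = np.concatenate([rng.uniform(size=90), np.full(10, 1e-4)])  # 10 true signals
    print(benjamini_hochberg(pv).sum(), "discoveries at nominal FDR 0.1")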
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic/representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
Embedding Quantum Mechanics Into a Broader Noncontextual Theory: A Conciliatory Result
NASA Astrophysics Data System (ADS)
Garola, Claudio; Sozzo, Sandro
2010-12-01
The extended semantic realism (ESR) model embodies the mathematical formalism of standard (Hilbert space) quantum mechanics in a noncontextual framework, reinterpreting quantum probabilities as conditional instead of absolute. We provide here an improved version of this model and show that it predicts that, whenever idealized measurements are performed, a modified Bell-Clauser-Horne-Shimony-Holt (BCHSH) inequality holds if one takes into account all individual systems that are prepared, standard quantum predictions hold if one considers only the individual systems that are detected, and a standard BCHSH inequality holds at a microscopic (purely theoretical) level. These results admit an intuitive explanation in terms of an unconventional kind of unfair sampling and constitute a first example of the unified perspective that can be attained by adopting the ESR model.
Scoring functions for protein-protein interactions.
Moal, Iain H; Moretti, Rocco; Baker, David; Fernández-Recio, Juan
2013-12-01
The computational evaluation of protein-protein interactions will play an important role in organising the wealth of data being generated by high-throughput initiatives. Here we discuss future applications, report recent developments and identify areas requiring further investigation. Many functions have been developed to quantify the structural and energetic properties of interacting proteins, finding use in interrelated challenges revolving around the relationship between sequence, structure and binding free energy. These include loop modelling, side-chain refinement, docking, multimer assembly, affinity prediction, affinity change upon mutation, hotspot location and interface design. Information derived from models optimised for one of these challenges can be used to benefit the others, and can be unified within the theoretical frameworks of multi-task learning and Pareto-optimal multi-objective learning. Copyright © 2013 Elsevier Ltd. All rights reserved.
Lynch, Brighide M; McCance, Tanya; McCormack, Brendan; Brown, Donna
2018-01-01
To implement and evaluate the effect of using the Person-Centred Situational Leadership Framework to develop person-centred care within nursing homes. Many models of nursing leadership have been developed internationally in recent years but do not fit with the emergent complex philosophy of nursing home care. This study develops the Person-Centred Situational Leadership Framework that supports this philosophy. It forms the theoretical basis of the action research study described in this article. This was a complex action research study using the following multiple methods: nonparticipatory observation using the Workplace Culture Critical Analysis Tool (n = 30); critical and reflective dialogues with participants (n = 39) at time 1 (beginning of study), time 2 (end of study) and time 3 (6 months after study had ended); narratives from residents at time 1 and time 2 (n = 8); focus groups with staff at time 2 (n = 12) and reflective field notes. Different approaches to analyse the data were adopted for the different data sources, and the overall results of the thematic analysis were brought together using cognitive mapping. The Person-Centred Situational Leadership Framework captures seven core attributes of the leader that facilitate person-centredness in others: relating to the essence of being; harmonising actions with the vision; balancing concern for compliance with concern for person-centredness; connecting with the other person in the instant; intentionally enthusing the other person to act; listening to the other person with the heart; and unifying through collaboration, appreciation and trust. This study led to a theoretical contribution in relation to the Person-Centred Practice Framework. It makes an important key contribution internationally to the gap in knowledge about leadership in residential care facilities for older people. The findings can be seen to have significant applicability internationally, across other care settings and contexts. © 2017 John Wiley & Sons Ltd.
Stam, Henderikus J.
2015-01-01
The search for a so-called unified or integrated theory has long served as a goal for some psychologists, even if the search is often implicit. But if the established sciences do not have an explicitly unified set of theories, then why should psychology? After examining this question again I argue that psychology is in fact reasonably unified around its methods and its commitment to functional explanations, an indeterminate functionalism. The question of the place of the neurosciences in this framework is complex. On the one hand, the neuroscientific project will not likely renew and synthesize the disparate arms of psychology. On the other hand, their reformulation of what it means to be human will exert an influence in multiple ways. One way to capture that influence is to conceptualize the brain in terms of a technology that we interact with in a manner that we do not yet fully understand. In this way we maintain both a distance from neuro-reductionism and refrain from committing to an unfettered subjectivity. PMID:26500571
NASA Astrophysics Data System (ADS)
Abdi, Daniel S.; Giraldo, Francis X.
2016-09-01
A unified approach for the numerical solution of the 3D hyperbolic Euler equations using high order methods, namely continuous Galerkin (CG) and discontinuous Galerkin (DG) methods, is presented. First, we examine how classical CG that uses a global storage scheme can be constructed within the DG framework using constraint imposition techniques commonly used in the finite element literature. Then, we implement and test a simplified version in the Non-hydrostatic Unified Model of the Atmosphere (NUMA) for the case of explicit time integration and a diagonal mass matrix. Constructing CG within the DG framework allows CG to benefit from the desirable properties of DG such as, easier hp-refinement, better stability etc. Moreover, this representation allows for regional mixing of CG and DG depending on the flow regime in an area. The different flavors of CG and DG in the unified implementation are then tested for accuracy and performance using a suite of benchmark problems representative of cloud-resolving scale, meso-scale and global-scale atmospheric dynamics. The value of our unified approach is that we are able to show how to carry both CG and DG methods within the same code and also offer a simple recipe for modifying an existing CG code to DG and vice versa.
U.S. History Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…
Applying Laban's Movement Framework in Elementary Physical Education
ERIC Educational Resources Information Center
Langton, Terence W.
2007-01-01
This article recommends raising the bar in elementary physical education by using Laban's movement framework to develop curriculum content in the areas of games, gymnastics, and dance (with physical fitness concepts blended in) in order to help students achieve the NASPE content standards. The movement framework can permeate and unify an…
Photoionization and Recombination
NASA Technical Reports Server (NTRS)
Nahar, Sultana N.
2000-01-01
Theoretically self-consistent calculations for photoionization and (e + ion) recombination are described. The same eigenfunction expansion for the ion is employed in coupled channel calculations for both processes, thus ensuring consistency between cross sections and rates. The theoretical treatment of (e + ion) recombination subsumes both the non-resonant recombination ("radiative recombination") and the resonant recombination ("dielectronic recombination") processes in a unified scheme. In addition to the total, unified recombination rates, level-specific recombination rates and photoionization cross sections are obtained for a large number of atomic levels. Both relativistic Breit-Pauli, and non-relativistic LS coupling, calculations are carried out in the close coupling approximation using the R-matrix method. Although the calculations are computationally intensive, they yield nearly all photoionization and recombination parameters needed for astrophysical photoionization models with higher precision than hitherto possible, estimated at about 10-20% from comparison with experimentally available data (including experimentally derived DR rates). Results are electronically available for over 40 atoms and ions. Photoionization and recombination of He- and Li-like C and Fe are described for X-ray modeling. The unified method yields total and complete (e+ion) recombination rate coefficients that cannot otherwise be obtained theoretically or experimentally.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
From recording discrete actions to studying continuous goal-directed behaviours in team sports.
Correia, Vanda; Araújo, Duarte; Vilar, Luís; Davids, Keith
2013-01-01
This paper highlights the importance of examining interpersonal interactions in performance analysis of team sports, predicated on the relationship between perception and action, compared to the traditional cataloguing of actions by individual performers. We discuss how ecological dynamics may provide a potential unifying theoretical and empirical framework to achieve this re-emphasis in research. With reference to data from illustrative studies on performance analysis and sport expertise, we critically evaluate some of the main assumptions and methodological approaches with regard to understanding how information influences action and decision-making during team sports performance. Current data demonstrate how the understanding of performance behaviours in team sports by sport scientists and practitioners may be enhanced with a re-emphasis in research on the dynamics of emergent ongoing interactions. Ecological dynamics provides formal and theoretically grounded descriptions of player-environment interactions with respect to key performance goals and the unfolding information of competitive performance. Developing these formal descriptions and explanations of sport performance may provide a significant contribution to the field of performance analysis, supporting design and intervention in both research and practice.
The melting of stable glasses is governed by nucleation-and-growth dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack, Robert L.; Berthier, Ludovic
2016-06-28
We discuss the microscopic mechanisms by which low-temperature amorphous states, such as ultrastable glasses, transform into equilibrium fluids, after a sudden temperature increase. Experiments suggest that this process is similar to the melting of crystals, thus differing from the behaviour found in ordinary glasses. We rationalize these observations using the physical idea that the transformation process takes place close to a “hidden” equilibrium first-order phase transition, which is observed in systems of coupled replicas. We illustrate our views using simulation results for a simple two-dimensional plaquette spin model, which is known to exhibit a range of glassy behaviour. Our results suggest that nucleation-and-growth dynamics, as found near ordinary first-order transitions, is also the correct theoretical framework to analyse the melting of ultrastable glasses. Our approach provides a unified understanding of multiple experimental observations, such as propagating melting fronts, large kinetic stability ratios, and “giant” dynamic length scales. We also provide a comprehensive discussion of available theoretical pictures proposed in the context of ultrastable glass melting.
NASA Astrophysics Data System (ADS)
Ophaug, Vegard; Gerlach, Christian
2017-11-01
This work is an investigation of three methods for regional geoid computation: Stokes's formula, least-squares collocation (LSC), and spherical radial base functions (RBFs) using the spline kernel (SK). It is a first attempt to compare the three methods theoretically and numerically in a unified framework. While Stokes integration and LSC may be regarded as classic methods for regional geoid computation, RBFs may still be regarded as a modern approach. All methods are theoretically equal when applied globally, and we therefore expect them to give comparable results in regional applications. However, it has been shown by de Min (Bull Géod 69:223-232, 1995. doi: 10.1007/BF00806734) that the equivalence of Stokes's formula and LSC does not hold in regional applications without modifying the cross-covariance function. In order to make all methods comparable in regional applications, the corresponding modification has also been introduced in the SK. Ultimately, we present numerical examples comparing Stokes's formula, LSC, and SKs in a closed-loop environment using synthetic noise-free data, to verify their equivalence. All three methods agree at the millimeter level.
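For reference, the classical global form of Stokes's formula, which the regional methods approximate, reads (R the mean Earth radius, γ normal gravity, Δg the gravity anomaly on the unit sphere σ, ψ the spherical distance):

    N(P) = \frac{R}{4\pi\gamma} \iint_{\sigma} S(\psi)\, \Delta g \, \mathrm{d}\sigma,
    \qquad
    S(\psi) = \frac{1}{\sin(\psi/2)} - 6\sin\frac{\psi}{2} + 1 - 5\cos\psi
              - 3\cos\psi \, \ln\!\left(\sin\frac{\psi}{2} + \sin^{2}\frac{\psi}{2}\right)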
Nonlocal transport in the presence of transport barriers
NASA Astrophysics Data System (ADS)
Del-Castillo-Negrete, D.
2013-10-01
There is experimental, numerical, and theoretical evidence that transport in plasmas can, under certain circumstances, depart from the standard local, diffusive description. Examples include fast pulse propagation phenomena in perturbative experiments, non-diffusive scaling in L-mode plasmas, and non-Gaussian statistics of fluctuations. From the theoretical perspective, non-diffusive transport descriptions follow from the relaxation of the restrictive assumptions (locality, scale separation, and Gaussian/Markovian statistics) at the foundation of diffusive models. We discuss an alternative class of models able to capture some of the observed non-diffusive transport phenomenology. The models are based on a class of nonlocal, integro-differential operators that provide a unifying framework to describe non-Fickian scale-free transport, and non-Markovian (memory) effects. We study the interplay between nonlocality and internal transport barriers (ITBs) in perturbative transport, including cold edge pulses and power modulation. Of particular interest is the nonlocal 'tunnelling' of perturbations through ITBs. Also, flux-gradient diagrams are discussed as diagnostics to detect nonlocal transport processes in numerical simulations and experiments. Work supported by the US Department of Energy.
Call, Matthew L; Nyberg, Anthony J; Thatcher, Sherry M B
2015-05-01
Stars--employees with disproportionately high and prolonged (a) performance, (b) visibility, and (c) relevant social capital--have garnered attention in economics, sociology, and management. However, star research is often isolated within these research disciplines. Thus, 3 distinct star research streams are evolving, each disconnected from the others and each bringing siloed theoretical perspectives, terms, and assumptions. A conceptual review of these perspectives reveals a focus on the ex post effects that stars exert in organizations, with little explanation of who a star is and how one becomes a star. To synthesize the stars literature across these 3 disciplines, we apply psychological theories, specifically motivation theories, to create an integrative framework for stars research. Thus, we present a unified stars definition and extend theory on the making, managing, and mobility of stars. We extend research about how and why employees may be motivated to become stars, how best to manage stars and their relationships with colleagues, and how to motivate star retention. We then outline directions for future research. (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Zhu, Qi-Zhi
2017-02-01
A proper criterion describing when material fails is essential for deep understanding and constitutive modeling of rock damage and failure by microcracking. Physically, such a criterion should be the global effect of local mechanical response and microstructure evolution inside the material. This paper aims at deriving a new mechanisms-based failure criterion for brittle rocks, based on micromechanical unilateral damage-friction coupling analyses rather than on the basic results from the classical linear elastic fracture mechanics. The failure functions respectively describing three failure modes (purely tensile mode, tensile-shear mode as well as compressive-shear mode) are achieved in a unified upscaling framework and illustrated in the Mohr plane and also in the plane of principal stresses. The strength envelope is proved to be continuous and smooth with a compressive to tensile strength ratio dependent on material properties. Comparisons with experimental data are finally carried out. By this work, we also provide a theoretical evidence on the hybrid failure and the smooth transition from tensile failure to compressive-shear failure.
Sheldon Glashow, the Electroweak Theory, and the Grand Unified Theory
Glashow shared the 1979 Nobel Prize for Physics with Steven Weinberg and Abdus Salam for unifying the weak and electromagnetic interactions of particle physics. The resulting electroweak theory "provides a framework for understanding how the early universe evolved and how our universe came into being," says Lawrence R. Sulak, chairman of the Boston University physics department.
"UNICERT," or: Towards the Development of a Unified Language Certificate for German Universities.
ERIC Educational Resources Information Center
Voss, Bernd
The standardization of second language proficiency levels for university students in Germany is discussed. Problems with the current system, in which each university has developed its own program of study and proficiency certification, are examined and a framework for development of a unified language certificate for all universities is outlined.…
Quantum Behavior of an Autonomous Maxwell Demon
NASA Astrophysics Data System (ADS)
Chapman, Adrian; Miyake, Akimasa
2015-03-01
A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information-theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well, by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.
Dynamical Systems Theory: Application to Pedagogy
NASA Astrophysics Data System (ADS)
Abraham, Jane L.
Theories of learning affect how cognition is viewed, and this subsequently leads to the style of pedagogical practice that is used in education. Traditionally, educators have relied on a variety of theories on which to base pedagogy. Behavioral learning theories influenced the teaching/learning process for over 50 years. In the 1960s, the information processing approach brought the mind back into the learning process. The current emphasis on constructivism integrates the views of Piaget, Vygotsky, and cognitive psychology. Additionally, recent scientific advances have allowed researchers to shift attention to biological processes in cognition. The problem is that these theories do not provide an integrated approach to understanding principles responsible for differences among students in cognitive development and learning ability. Dynamical systems theory offers a unifying theoretical framework to explain the wider context in which learning takes place and the processes involved in individual learning. This paper describes how principles of Dynamic Systems Theory can be applied to cognitive processes of students, the classroom community, motivation to learn, and the teaching/learning dynamic giving educational psychologists a framework for research and pedagogy.
Representing Thoughts, Words, and Things in the UMLS
Campbell, Keith E.; Oliver, Diane E.; Spackman, Kent A.; Shortliffe, Edward H.
1998-01-01
The authors describe a framework, based on the Ogden-Richards semiotic triangle, for understanding the relationship between the Unified Medical Language System (UMLS) and the source terminologies from which the UMLS derives its content. They pay particular attention to UMLS's Concept Unique Identifier (CUI) and the sense of “meaning” it represents as contrasted with the sense of “meaning” represented by the source terminologies. The CUI takes on emergent meaning through linkage to terms in different terminology systems. In some cases, a CUI's emergent meaning can differ significantly from the original sources' intended meanings of terms linked by that CUI. Identification of these different senses of meaning within the UMLS is consistent with historical themes of semantic interpretation of language. Examination of the UMLS within such a historical framework makes it possible to better understand the strengths and limitations of the UMLS approach for integrating disparate terminologic systems and to provide a model, or theoretic foundation, for evaluating the UMLS as a Possible World—that is, as a mathematical formalism that represents propositions about some perspective or interpretation of the physical world. PMID:9760390
A unified framework for heat and mass transport at the atomic scale
NASA Astrophysics Data System (ADS)
Ponga, Mauricio; Sun, Dingyi
2018-04-01
We present a unified framework to simulate heat and mass transport in systems of particles. The proposed framework is based on kinematic mean field theory and uses a phenomenological master equation to compute effective transport rates between particles without the need to evaluate operators. We exploit this advantage and apply the model to simulate transport phenomena at the nanoscale. We demonstrate that, when calibrated to experimentally-measured transport coefficients, the model can accurately predict transient and steady state temperature and concentration profiles even in scenarios where the length of the device is comparable to the mean free path of the carriers. Through several example applications, we demonstrate the validity of our model for all classes of materials, including ones that, until now, would have been outside the domain of computational feasibility.
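The transport rule lends itself to a compact numerical sketch. The following is a minimal illustration of pairwise master-equation heat exchange on a 1-D chain, assuming a single effective rate k in place of the paper's calibrated transport coefficients; the variable names and boundary values are invented for the example.

    import numpy as np

    # Minimal sketch (not the authors' code): each particle's temperature
    # relaxes toward its neighbours at an effective pairwise rate k, which
    # would in practice be calibrated to a measured transport coefficient.
    def step(T, k, dt):
        dT = np.zeros_like(T)
        dT[1:]  += k * (T[:-1] - T[1:])   # exchange with left neighbour
        dT[:-1] += k * (T[1:] - T[:-1])   # exchange with right neighbour
        return T + dt * dT

    T = np.full(100, 300.0)
    T[0], T[-1] = 400.0, 300.0            # fixed hot/cold ends
    for _ in range(20000):
        T = step(T, k=1.0, dt=0.01)
        T[0], T[-1] = 400.0, 300.0        # reimpose boundary temperatures
    print(T[::10])                        # approaches a linear steady profile

With the ends held at fixed temperatures, the profile relaxes toward the linear steady state expected of diffusive transport.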
ERIC Educational Resources Information Center
Partnership for 21st Century Skills, 2009
2009-01-01
To help practitioners integrate skills into the teaching of core academic subjects, the Partnership for 21st Century Skills has developed a unified, collective vision for learning known as the Framework for 21st Century Learning. This Framework describes the skills, knowledge and expertise students must master to succeed in work and life; it is a…
Toward a Unified Validation Framework in Mixed Methods Research
ERIC Educational Resources Information Center
Dellinger, Amy B.; Leech, Nancy L.
2007-01-01
The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Yinan; Shi Handuo; Xiong Zhaoxi
We present a unified universal quantum cloning machine, which combines several different existing universal cloning machines together, including the asymmetric case. In this unified framework, the identical pure states are projected equally into each copy initially constituted by the input and one half of the maximally entangled states. We show explicitly that the output states of those universal cloning machines are the same. One important feature of this unified cloning machine is that the cloning process is always the symmetric projection, which dramatically reduces the difficulty of implementation. It is also found that this unified cloning machine can be directly modified to the general asymmetric case. Besides the global fidelity and the single-copy fidelity, we also present all possible arbitrary-copy fidelities.
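For the symmetric special case there is a well-known benchmark that such a unified machine should reproduce: the optimal universal N-to-M cloning fidelity for qubits due to Gisin and Massar. The formula below is standard background rather than a result quoted from this abstract.

    % Gisin--Massar optimal single-copy fidelity for universal N -> M
    % cloning of qubits (standard background, not from the abstract):
    F_{N \to M} = \frac{MN + M + N}{M\,(N + 2)}, \qquad F_{1 \to 2} = \tfrac{5}{6}.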
[Arabian food pyramid: unified framework for nutritional health messages].
Shokr, Adel M
2008-01-01
There are several ways to present nutritional health messages, particularly pyramidic indices, but they have many deficiencies such as lack of agreement on a unified or clear methodology for food grouping and ignoring nutritional group inter-relation and integration. This causes confusion for health educators and target individuals. This paper presents an Arabian food pyramid that aims to unify the bases of nutritional health messages, bringing together the function, contents, source and nutritional group servings and indicating the inter-relation and integration of nutritional groups. This provides comprehensive, integrated, simple and flexible health messages.
Theoretical aspects of cellular decision-making and information-processing.
Kobayashi, Tetsuya J; Kamimura, Atsushi
2012-01-01
Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, because they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since many cellular behaviors can be regarded as processes for making specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.
Wiltshire, Travis J; Lobato, Emilio J C; McConnell, Daniel S; Fiore, Stephen M
2014-01-01
In this paper we suggest that differing approaches to the science of social cognition mirror the arguments between radical embodied and traditional approaches to cognition. We contrast the use in social cognition of theoretical inference and mental simulation mechanisms with approaches emphasizing a direct perception of others' mental states. We build from a recent integrative framework unifying these divergent perspectives through the use of dual-process theory and supporting social neuroscience research. Our elaboration considers two complementary notions of direct perception: one primarily stemming from ecological psychology and the other from enactive cognition theory. We use this as the foundation from which to offer an account of the informational basis for social information and assert a set of research propositions to further the science of social cognition. In doing so, we point out how perception of the minds of others can be supported in some cases by lawful information, supporting direct perception of social affordances and perhaps, mental states, and in other cases by cues that support indirect perceptual inference. Our goal is to extend accounts of social cognition by integrating advances across disciplines to provide a multi-level and multi-theoretic description that can advance this field and offer a means through which to reconcile radical embodied and traditional approaches to cognitive neuroscience.
Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
2002-01-01
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. KEY FEATURES: * Integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework * Provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making * Emphasizes the role of mathematical modeling in the conduct of science and management * Utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples
NASA Astrophysics Data System (ADS)
Taousser, Fatima; Defoort, Michael; Djemai, Mohamed
2016-01-01
This paper investigates the consensus problem for linear multi-agent system with fixed communication topology in the presence of intermittent communication using the time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens at a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models due to intermittent information transmissions. The time-scale theory provides a powerful tool to combine continuous-time and discrete-time cases and study the consensus protocol under a unified framework. Using this theory, some conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.
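The flavor of such an intermittent protocol can be sketched in a few lines. The topology, gains, and duty cycle below are illustrative assumptions, not values from the paper: agents apply the usual Laplacian feedback while communication is on and hold their states during the gaps.

    import numpy as np

    # Illustrative sketch: single-integrator consensus over a fixed ring
    # topology where relative state is exchanged only during intermittent
    # "on" windows; on a time scale this alternates flow and gaps.
    L = np.array([[ 2, -1,  0, -1],
                  [-1,  2, -1,  0],
                  [ 0, -1,  2, -1],
                  [-1,  0, -1,  2]], dtype=float)   # graph Laplacian (ring)

    x = np.array([1.0, -2.0, 0.5, 3.0])             # initial agent states
    dt, t = 0.001, 0.0
    for step in range(20000):
        on = (t % 1.0) < 0.6                        # communicate 60% of each period
        if on:
            x = x - dt * (L @ x)                    # u = -L x while connected
        t += dt
    print(x)   # states contract toward the average of the initial conditions

Over the union of the communication windows the states still converge toward the average of the initial conditions, which is the behavior the time-scale analysis makes precise.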
Valley Topological Phases in Bilayer Sonic Crystals
NASA Astrophysics Data System (ADS)
Lu, Jiuyang; Qiu, Chunyin; Deng, Weiyin; Huang, Xueqin; Li, Feng; Zhang, Fan; Chen, Shuqi; Liu, Zhengyou
2018-03-01
Recently, the topological physics in artificial crystals for classical waves has become an emerging research area. In this Letter, we propose a unique bilayer design of sonic crystals that are constructed by two layers of coupled hexagonal array of triangular scatterers. Assisted by the additional layer degree of freedom, a rich topological phase diagram is achieved by simply rotating scatterers in both layers. Under a unified theoretical framework, two kinds of valley-projected topological acoustic insulators are distinguished analytically, i.e., the layer-mixed and layer-polarized topological valley Hall phases, respectively. The theory is evidently confirmed by our numerical and experimental observations of the nontrivial edge states that propagate along the interfaces separating different topological phases. Various applications such as sound communications in integrated devices can be anticipated by the intriguing acoustic edge states enriched by the layer information.
Liu, Jian; Liu, Kexin; Liu, Shutang
2017-01-01
In this paper, adaptive control is extended from real space to complex space, resulting in a new control scheme for a class of n-dimensional time-dependent strict-feedback complex-variable chaotic (hyperchaotic) systems (CVCSs) in the presence of uncertain complex parameters and perturbations, which has not been previously reported in the literature. In detail, we have developed a unified framework for designing the adaptive complex scalar controller to ensure this type of CVCSs asymptotically stable and for selecting complex update laws to estimate unknown complex parameters. In particular, combining Lyapunov functions dependent on complex-valued vectors and back-stepping technique, sufficient criteria on stabilization of CVCSs are derived in the sense of Wirtinger calculus in complex space. Finally, numerical simulation is presented to validate our theoretical results. PMID:28467431
The intrapsychics of gender: a model of self-socialization.
Tobin, Desiree D; Menon, Meenakshi; Menon, Madhavi; Spatta, Brooke C; Hodges, Ernest V E; Perry, David G
2010-04-01
This article outlines a model of the structure and the dynamics of gender cognition in childhood. The model incorporates 3 hypotheses featured in different contemporary theories of childhood gender cognition and unites them under a single theoretical framework. Adapted from Greenwald et al. (2002), the model distinguishes three constructs: gender identity, gender stereotypes, and attribute self-perceptions. The model specifies 3 causal processes among the constructs: Gender identity and stereotypes interactively influence attribute self-perceptions (stereotype emulation hypothesis); gender identity and attribute self-perceptions interactively influence gender stereotypes (stereotype construction hypothesis); and gender stereotypes and attribute self-perceptions interactively influence identity (identity construction hypothesis). The model resolves nagging ambiguities in terminology, organizes diverse hypotheses and empirical findings under a unifying conceptual umbrella, and stimulates many new research directions. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Collusion-resistant multimedia fingerprinting: a unified framework
NASA Astrophysics Data System (ADS)
Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray
2004-06-01
Digital fingerprints are unique labels inserted in different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint of each colluder, thus collusion poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
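A hedged sketch of the baseline setting that such frameworks analyze, with all parameters illustrative: independent Gaussian spread-spectrum fingerprints, an averaging collusion by three users, and a simple correlation detector. Averaging attenuates each fingerprint by roughly 1/K for K colluders, which is exactly the effect collusion-resistant designs must withstand.

    import numpy as np

    # Each user receives the host signal plus an i.i.d. Gaussian
    # fingerprint; colluders average their copies, diluting every
    # fingerprint by 1/K; correlation still flags them for small K.
    rng = np.random.default_rng(0)
    n_users, n = 20, 10_000
    host = rng.normal(0, 10, n)
    fps = rng.normal(0, 1, (n_users, n))         # one fingerprint per user
    copies = host + fps                          # marked copies

    colluders = [0, 3, 7]
    pirated = copies[colluders].mean(axis=0)     # averaging collusion

    scores = fps @ (pirated - host) / n          # correlation per user
    print(np.argsort(scores)[::-1][:5])          # colluders score ~1/K, others ~0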
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
NASA Astrophysics Data System (ADS)
Kim, Yeong E.; Zubarev, Alexander L.
The most basic theoretical challenge for understanding low-energy nuclear reaction (LENR) and transmutation reaction (LETR) in condensed matter is to find mechanisms by which the large Coulomb barrier between fusing nuclei can be overcome. A unifying theory of LENR and LETR has been developed to provide possible mechanisms for the LENR and LETR processes in matter, based on high-density nano-scale and micro-scale quantum plasmas. It is shown that recently developed theoretical models based on the Bose-Einstein Fusion (BEF) mechanism and the Quantum Plasma Nuclear Fusion (QPNF) mechanism are applicable to the results of many different types of LENR and LETR experiments.
On the neural implementation of the speed-accuracy trade-off
Standage, Dominic; Blohm, Gunnar; Dorris, Michael C.
2014-01-01
Decisions are faster and less accurate when conditions favor speed, and are slower and more accurate when they favor accuracy. This phenomenon is referred to as the speed-accuracy trade-off (SAT). Behavioral studies of the SAT have a long history, and the data from these studies are well characterized within the framework of bounded integration. According to this framework, decision makers accumulate noisy evidence until the running total for one of the alternatives reaches a bound. Lower and higher bounds favor speed and accuracy respectively, each at the expense of the other. Studies addressing the neural implementation of these computations are a recent development in neuroscience. In this review, we describe the experimental and theoretical evidence provided by these studies. We structure the review according to the framework of bounded integration, describing evidence for (1) the modulation of the encoding of evidence under conditions favoring speed or accuracy, (2) the modulation of the integration of encoded evidence, and (3) the modulation of the amount of integrated evidence sufficient to make a choice. We discuss commonalities and differences between the proposed neural mechanisms, some of their assumptions and simplifications, and open questions for future work. We close by offering a unifying hypothesis on the present state of play in this nascent research field. PMID:25165430
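Bounded integration is easy to make concrete. The sketch below, with illustrative parameters not taken from the reviewed studies, draws noisy evidence until a symmetric bound is hit; lowering the bound produces faster but less accurate choices.

    import numpy as np

    # Minimal bounded-integration (drift-diffusion) sketch of the SAT;
    # all parameter values are illustrative, not fitted to any study.
    def trial(bound, drift=0.1, noise=1.0, dt=0.001, rng=None):
        if rng is None:
            rng = np.random.default_rng()
        x, t = 0.0, 0.0
        while abs(x) < bound:                 # accumulate noisy evidence
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        return t, x > 0                       # (decision time, correct?)

    rng = np.random.default_rng(1)
    for bound in (0.5, 2.0):                  # speed regime vs accuracy regime
        out = [trial(bound, rng=rng) for _ in range(500)]
        rts, correct = zip(*out)
        print(bound, np.mean(rts), np.mean(correct))

Lowering the bound shortens the mean decision time at the cost of accuracy, reproducing the trade-off qualitatively.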
Rosenfeld, Daniel L; Burrow, Anthony L
2017-05-01
By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.
Franz, A; Triesch, J
2010-12-01
The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research a unified model of the development of these abilities is still missing. Here we make an attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops, the role of the width of the occluders, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories. Copyright © 2010 Elsevier Inc. All rights reserved.
The Pursuit of a "Better" Explanation as an Organizing Framework for Science Teaching and Learning
ERIC Educational Resources Information Center
Papadouris, Nicos; Vokos, Stamatis; Constantinou, Constantinos P.
2018-01-01
This article seeks to make the case for the pursuit of a "better" explanation being a productive organizing framework for science teaching and learning. Underlying this position is the idea that this framework allows promoting, in a unified manner, facility with the scientific practice of constructing explanations, appreciation of its…
NASA Technical Reports Server (NTRS)
Saleeb, Atef F.; Li, Wei
1995-01-01
This two-part report is concerned with the development of a general framework for the implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, and to address in detail the algorithmic and numerical analysis aspects involved in its finite element implementation, as well as to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally-stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, which are representative of the present state of the art in metal viscoplasticity, are considered in applications reported here; i.e., fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both these models are directly applicable for both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as subspace applications (i.e., plane stress/strain, axisymmetric, generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (but independent of the number of viscoplastic internal state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence. In the present first part of the report, we focus on the theoretical developments, and discussions of the results of numerical-performance studies using the integration schemes for GVIPS and NAV models.
Theoretical Convergence in Assessment of Cognition
ERIC Educational Resources Information Center
Bowden, Stephen C.
2013-01-01
In surveying the literature on assessment of cognitive abilities in adults and children, it is easy to assume that the proliferation of test batteries and terminology reflects a poverty of unifying models. However, the lack of recognition accorded good models of cognitive abilities may reflect inattention to theoretical development and injudicious…
Space-Time Processing for Tactical Mobile Ad Hoc Networks
2008-08-01
A vision for multiple concurrent communication sessions, i.e., a many-to-many framework in which multi-packet transmissions (MPTs) and multi-packet receptions are exploited. We have introduced the first unified modelling framework for the computation of fundamental limits on capacity-delay tradeoffs in wireless ad hoc networks, and extended the multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT.
Unified formalism for higher order non-autonomous dynamical systems
NASA Astrophysics Data System (ADS)
Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso
2012-03-01
This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.
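For orientation, the standard equations such a unified formalism must recover for a kth-order Lagrangian are the higher-order Euler-Lagrange equations and the Legendre-Ostrogradsky momenta:

    % Higher-order Euler--Lagrange equations for L(t, q, \dot q, ..., q^{(k)}):
    \sum_{i=0}^{k} (-1)^{i} \frac{d^{i}}{dt^{i}}
      \left( \frac{\partial L}{\partial q^{(i)}} \right) = 0,
    % together with the Legendre--Ostrogradsky momenta (r = 0, ..., k-1):
    p_{r} = \sum_{i=0}^{k-r-1} (-1)^{i} \frac{d^{i}}{dt^{i}}
      \left( \frac{\partial L}{\partial q^{(r+i+1)}} \right).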
Collaborated Architecture Framework for Composition of UML 2.0 in the Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model-Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling is used to develop an Enterprise Resource Planning (ERP) system. Because ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is necessary to transform the MDA into application-module components efficiently and accurately. Finally, the use of the CAF met the needs of all stakeholders involved in the overall Rational Unified Process (RUP) and yielded high satisfaction with the functionality of the ERP software at PT. Iglas (Persero) Gresik.
[Research on tumor information grid framework].
Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing
2013-10-01
In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, making these heterogeneous resources consistent and interoperable in both semantics and syntax. This article describes the tumor grid framework: service types are packaged in Web Service Description Language (WSDL) and XML Schema Definition (XSD) documents, and clients use the serialized documents to operate on the distributed resources. Service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All grid resources are registered in the index and released as Web Services based on the Web Services Resource Framework (WSRF). Using this system, we can build a multi-center, large-sample, networked tumor disease resource-sharing framework to improve both the level of development in medical research institutions and patients' quality of life.
Developing a theoretical framework for complex community-based interventions.
Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana
2014-01-01
Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.
Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar
2017-02-01
A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
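For one particle class, the resulting PDE takes the generic convection-diffusion form sketched below; this is a schematic of the model type rather than the authors' exact equations, with v_s(X) the hindered-settling velocity and d_comp(X) a compression coefficient active at high concentrations.

    % Schematic one-class settling PDE (sketch of the model type only):
    % X is solids concentration and z the depth coordinate.
    \frac{\partial X}{\partial t}
      + \frac{\partial}{\partial z}\bigl( v_s(X)\, X \bigr)
      = \frac{\partial}{\partial z}\left( d_{\mathrm{comp}}(X)\,
        \frac{\partial X}{\partial z} \right)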
Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation
Peter, Adrian M.; Rangarajan, Anand
2010-01-01
Shape matching plays a prominent role in the comparison of similar structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and deformation. The theoretical foundation is drawn from information geometry wherein information matrices are used to establish intrinsic distances between parametric densities. When a parameterized probability density function is used to represent a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that given two shapes parameterized by Gaussian mixture models (GMMs), the well-known Fisher information matrix of the mixture model is also a Riemannian metric (actually, the Fisher-Rao Riemannian metric) and can therefore be used for computing shape geodesics. The Fisher-Rao metric has the advantage of being an intrinsic metric and invariant to reparameterization. The geodesic—computed using this metric—establishes an intrinsic deformation between the shapes, thus unifying both shape representation and deformation. A fundamental drawback of the Fisher-Rao metric is that it is not available in closed form for the GMM. Consequently, shape comparisons are computationally very expensive. To address this, we develop a new Riemannian metric based on generalized ϕ-entropy measures. In sharp contrast to the Fisher-Rao metric, the new metric is available in closed form. Geodesic computations using the new metric are considerably more efficient. We validate the performance and discriminative capabilities of these new information geometry-based metrics by pairwise matching of corpus callosum shapes. We also study the deformations of fish shapes that have various topological properties. A comprehensive comparative analysis is also provided using other landmark-based distances, including the Hausdorff distance, the Procrustes metric, landmark-based diffeomorphisms, and the bending energies of the thin-plate (TPS) and Wendland splines. PMID:19110497
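The two central objects are standard and worth stating explicitly: the Fisher information matrix, used as the Fisher-Rao Riemannian metric on the parameter space of the shape density, and the geodesic distance it induces between two shapes:

    % Fisher information matrix as the Fisher--Rao metric on the
    % parameter space \theta of the shape density p(x | \theta):
    g_{ij}(\theta) = \int p(x \mid \theta)\,
      \frac{\partial \log p(x \mid \theta)}{\partial \theta_i}\,
      \frac{\partial \log p(x \mid \theta)}{\partial \theta_j}\, dx,
    % inducing the geodesic (shape) distance between \theta_1 and \theta_2:
    D(\theta_1, \theta_2) = \min_{\theta(s):\, \theta(0)=\theta_1,\, \theta(1)=\theta_2}
      \int_0^1 \sqrt{\dot\theta(s)^{\top} g(\theta(s))\, \dot\theta(s)}\; ds.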
The evolving concept of "patient-centeredness" in patient-physician communication research.
Ishikawa, Hirono; Hashimoto, Hideki; Kiuchi, Takahiro
2013-11-01
Over the past few decades, the concept of "patient-centeredness" has been intensively studied in health communication research on patient-physician interaction. Despite its popularity, this concept has often been criticized for lacking a unified definition and operationalized measurement. This article reviews how health communication research on patient-physician interaction has conceptualized and operationalized patient-centered communication based on four major theoretical perspectives in sociology (i.e., functionalism, conflict theory, utilitarianism, and social constructionism), and discusses the agenda for future research in this field. Each theory addresses different aspects of the patient-physician relationship and communication from different theoretical viewpoints. Patient-centeredness is a multifaceted construct with no single theory that can sufficiently define the whole concept. Different theoretical perspectives of patient-centered communication can be selectively adopted according to the context and nature of problems in the patient-physician relationship that a particular study aims to explore. The present study may provide a useful framework: it offers an overview of the differing models of patient-centered communication and the expected roles and goals in each model; it does so toward identifying a communication model that fits the patient and the context and toward theoretically reconstructing existing measures of patient-centered communication. Furthermore, although patient-centered communication has been defined mainly from the viewpoint of physician's behaviors aimed at achieving patient-centered care, patient competence is also required for patient-centered communication. This needs to be examined in current medical practice. Copyright © 2013 Elsevier Ltd. All rights reserved.
Groundwater modelling in decision support: reflections on a unified conceptual framework
NASA Astrophysics Data System (ADS)
Doherty, John; Simmons, Craig T.
2013-11-01
Groundwater models are commonly used as basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support with a direct focus on matters regarding model simplicity and complexity.
Pattern-oriented modeling of agent-based complex systems: Lessons from ecology
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-01-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
A Unified Model of Geostrophic Adjustment and Frontogenesis
NASA Astrophysics Data System (ADS)
Taylor, John; Shakespeare, Callum
2013-11-01
Fronts, or regions with strong horizontal density gradients, are ubiquitous and dynamically important features of the ocean and atmosphere. In the ocean, fronts are associated with enhanced air-sea fluxes, turbulence, and biological productivity, while atmospheric fronts are associated with some of the most extreme weather events. Here, we describe a new mathematical framework for describing the formation of fronts, or frontogenesis. This framework unifies two classical problems in geophysical fluid dynamics, geostrophic adjustment and strain-driven frontogenesis, and provides a number of important extensions beyond previous efforts. The model solutions closely match numerical simulations during the early stages of frontogenesis, and provide a means to describe the development of turbulence at mature fronts.
Integrating diverse databases into an unified analysis framework: a Galaxy approach
Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton
2011-01-01
Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past there were a relatively small number of central repositories serving genomic data, an increasing number of distinct specialized data repositories and resources have been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
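Schematically, the two-part structure of this model class can be written as follows; this is a simplified sketch of a GLLAMM, not the authors' full notation:

    % Response model through a link function g, with latent variables \eta:
    g\bigl( E[\,y \mid x, z, \eta\,] \bigr) = x^{\top}\beta
      + \sum_{l} \eta_l\, z_l^{\top}\lambda_l,
    % and a structural model relating the latent variables to each other
    % and to covariates w:
    \eta = B\,\eta + \Gamma w + \zeta.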
A Unified Model of Knowledge Sharing Behaviours: Theoretical Development and Empirical Test
ERIC Educational Resources Information Center
Chennamaneni, Anitha; Teng, James T. C.; Raja, M. K.
2012-01-01
Research and practice on knowledge management (KM) have shown that information technology alone cannot guarantee that employees will volunteer and share knowledge. While previous studies have linked motivational factors to knowledge sharing (KS), we took a further step to thoroughly examine this theoretically and empirically. We developed a…
Rule-Governed Behavior and Self-Control in Children with ADHD: A Theoretical Interpretation
ERIC Educational Resources Information Center
Barry, Leasha M.; Kelly, Melissa A.
2006-01-01
Three theoretical models of ADHD are reviewed and interpreted in light of educational and behavioral research findings specifically in respect to interventions using self-management to address a deficit in rule-governed behavior. The perspectives considered in this paper are (a) the unified theory of behavioral inhibition, sustained attention, and…
ERIC Educational Resources Information Center
O'Keeffe, Shawn Edward
2013-01-01
The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…
2017-05-25
Drawing on US Army doctrine (Operations and Unified Land Operations) and the US Army's leader development model, this study identifies how the education, training, and experience of field-grade officers at the division level have influenced their use of the operational framework; officers have failed to incorporate the framework because they lack the education, training, and experience for its use.
Parametric models to relate spike train and LFP dynamics with neural information processing.
Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan
2012-01-01
Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
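A toy version of the model's two ingredients, with functional forms and numbers assumed for illustration rather than taken from the paper: spikes are drawn from an inhomogeneous Poisson process whose rate combines a stimulus-locked response with a trial-specific latency and an ongoing background component.

    import numpy as np

    # Illustrative generative sketch (not the paper's fitted model):
    # rate = baseline + stimulus-locked bump at a trial latency + slow
    # ongoing background; spikes are obtained by thinning.
    rng = np.random.default_rng(2)
    dt, dur = 0.001, 2.0
    t = np.arange(0, dur, dt)

    def rate(latency, base=5.0, gain=40.0, width=0.05):
        stim = gain * np.exp(-0.5 * ((t - latency) / width) ** 2)
        background = 2.0 * np.sin(2 * np.pi * 1.0 * t)   # ongoing drive
        return np.maximum(base + stim + background, 0.0)

    latency = 0.8                                    # trial-by-trial parameter
    spikes = rng.random(t.size) < rate(latency) * dt  # Bernoulli thinning
    print(spikes.sum(), t[spikes][:5])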
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
Toward a unified approach to dose-response modeling in ecotoxicology.
Ritz, Christian
2010-01-01
This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
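Two members of such a framework are worth writing out in the parameterization common to dose-response software, with b the slope, c and d the lower and upper limits, and e the ED50:

    % Four-parameter log-logistic model:
    f(x) = c + \frac{d - c}{1 + \exp\bigl( b\,[\log x - \log e] \bigr)},
    % and a Weibull-type counterpart in the same parameterization:
    f(x) = c + (d - c)\exp\bigl( -\exp\bigl( b\,[\log x - \log e] \bigr) \bigr).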
Attempt to probe nuclear charge radii by cluster and proton emissions
NASA Astrophysics Data System (ADS)
Qian, Yibin; Ren, Zhongzhou; Ni, Dongdong
2013-05-01
We deduce the rms nuclear charge radii for ground states of light and medium-mass nuclei from experimental data of cluster radioactivity and proton emission in a unified framework. On the basis of the density-dependent cluster model, the calculated decay half-lives are obtained within the modified two-potential approach. The charge distribution of emitted clusters in the cluster decay and that of daughter nuclei in the proton emission are determined to correspondingly reproduce the experimental half-lives within the folding model. The obtained charge distribution is then employed to give the rms charge radius of the studied nuclei. Satisfactory agreement between theory and experiment is achieved for available experimental data, and the present results are found to be consistent with theoretical estimations. This study is expected to be helpful in the future detection of nuclear sizes, especially for these exotic nuclei near the proton dripline.
Unifying cost and information in information-theoretic competitive learning.
Kamimura, Ryotaro
2005-01-01
In this paper, we introduce costs into the framework of information maximization and try to maximize the ratio of information to its associated cost. We have shown that competitive learning is realized by maximizing mutual information between input patterns and competitive units. One shortcoming of the method is that maximizing information does not necessarily produce representations faithful to input patterns. Information maximizing primarily focuses on some parts of input patterns that are used to distinguish between patterns. Therefore, we introduce the cost, which represents average distance between input patterns and connection weights. By minimizing the cost, final connection weights reflect input patterns well. We applied the method to a political data analysis, a voting attitude problem and a Wisconsin cancer problem. Experimental results confirmed that, when the cost was introduced, representations faithful to input patterns were obtained. In addition, improved generalization performance was obtained within a relatively short learning time.
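One way to make the ratio criterion concrete is sketched below; all functional details (the softmax responsibilities, the distance-based cost) are illustrative choices rather than the paper's exact formulation. Learning would adjust the weights to increase the information-to-cost ratio.

    import numpy as np

    # Illustrative computation of the quantity to be maximized:
    # mutual information between patterns and competitive units,
    # divided by the responsibility-weighted mean distance (the cost).
    rng = np.random.default_rng(3)
    X = rng.normal(0, 1, (200, 2))            # input patterns
    W = rng.normal(0, 1, (4, 2))              # competitive-unit weights

    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)   # squared distances
    p_j_given_s = np.exp(-d2)
    p_j_given_s /= p_j_given_s.sum(1, keepdims=True)      # responsibilities
    p_j = p_j_given_s.mean(0)                             # firing probabilities

    info = (p_j_given_s * np.log(p_j_given_s / p_j)).sum(1).mean()
    cost = (p_j_given_s * np.sqrt(d2)).sum(1).mean()
    print(info, cost, info / cost)            # ratio to be maximized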
Finding order in complexity: themes from the career of Dr. Robert F. Wagner
NASA Astrophysics Data System (ADS)
Myers, Kyle J.
2009-02-01
Over the course of his long and productive career, Dr. Robert F. Wagner built a framework for the evaluation of imaging systems based on a task-based, decision-theoretic approach. His most recent contributions involved the consideration of the random effects associated with multiple readers of medical images and the logical extension of this work to the problem of the evaluation of multiple competing classifiers in statistical pattern recognition. This contemporary work expanded on familiar themes from Bob's many SPIE presentations in earlier years. It was driven by the need for practical solutions to current problems facing FDA's Center for Devices and Radiological Health and the medical imaging community regarding the assessment of new computer-aided diagnosis tools, and by Bob's unique ability to unify concepts across a range of disciplines as he gave order to increasingly complex problems in our field.
NASA Astrophysics Data System (ADS)
Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan
2018-03-01
We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.
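The concurrent-search setup is simple to simulate. In the sketch below the topology and numbers are illustrative: the search time of k independent walkers to a single target is the minimum of k single-walker times, and it shrinks as searchers are added, in line with the behavior the paper quantifies.

    import numpy as np

    # Illustrative sketch: random-walk search on a ring of N nodes; the
    # parallel search time of k independent walkers is estimated as the
    # minimum of k resampled single-walker first-passage times.
    rng = np.random.default_rng(4)
    N, target, trials = 100, 50, 500

    def single_search_time():
        pos, steps = 0, 0
        while pos != target:
            pos = (pos + rng.choice((-1, 1))) % N
            steps += 1
        return steps

    times = np.array([single_search_time() for _ in range(trials)])
    for k in (1, 2, 4, 8):
        t_k = times[rng.integers(0, trials, (trials, k))].min(axis=1)
        print(k, t_k.mean())                  # decreases as searchers are added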
A new model for fluid velocity slip on a solid surface.
Shu, Jian-Jun; Teo, Ji Bin Melvin; Chan, Weng Kong
2016-10-12
A general adsorption model is developed to describe the interactions between near-wall fluid molecules and solid surfaces. This model serves as a framework for the theoretical modelling of boundary slip phenomena. Based on this adsorption model, a new general model for the slip velocity of fluids on solid surfaces is introduced. The slip boundary condition at a fluid-solid interface has hitherto been considered separately for gases and liquids. In this paper, we show that the slip velocity in both gases and liquids may originate from dynamical adsorption processes at the interface. A unified analytical model that is valid for both gas-solid and liquid-solid slip boundary conditions is proposed based on surface science theory. The corroboration with the experimental data extracted from the literature shows that the proposed model provides an improved prediction compared to existing analytical models for gases at higher shear rates and close agreement for liquid-solid interfaces in general.
Unified theory of inertial granular flows and non-Brownian suspensions.
DeGiuli, E; Düring, G; Lerner, E; Wyart, M
2015-06-01
Rheological properties of dense flows of hard particles are singular as one approaches the jamming threshold where flow ceases both for aerial granular flows dominated by inertia and for over-damped suspensions. Concomitantly, the length scale characterizing velocity correlations appears to diverge at jamming. Here we introduce a theoretical framework that proposes a tentative, but potentially complete, scaling description of stationary flows. Our analysis, which focuses on frictionless particles, applies both to suspensions and inertial flows of hard particles. We compare our predictions with the empirical literature, as well as with novel numerical data. Overall, we find a very good agreement between theory and observations, except for frictional inertial flows whose scaling properties clearly differ from frictionless systems. For overdamped flows, more observations are needed to decide if friction is a relevant perturbation. Our analysis makes several new predictions on microscopic dynamical quantities that should be accessible experimentally.
Causal inference and the data-fusion problem
Bareinboim, Elias; Pearl, Judea
2016-01-01
We review concepts, principles, and tools that unify current approaches to causal analysis and attend to new challenges presented by big data. In particular, we address the problem of data fusion—piecing together multiple datasets collected under heterogeneous conditions (i.e., different populations, regimes, and sampling methods) to obtain valid answers to queries of interest. The availability of multiple heterogeneous datasets presents new opportunities to big data analysts, because the knowledge that can be acquired from combined data would not be possible from any individual source alone. However, the biases that emerge in heterogeneous environments require new analytical tools. Some of these biases, including confounding, sampling selection, and cross-population biases, have been addressed in isolation, largely in restricted parametric models. We here present a general, nonparametric framework for handling these biases and, ultimately, a theoretical solution to the problem of data fusion in causal inference tasks. PMID:27382148
Fulop, Sean A; Fitz, Kelly
2006-01-01
A modification of the spectrogram (log magnitude of the short-time Fourier transform) to more accurately show the instantaneous frequencies of signal components was first proposed in 1976 [Kodera et al., Phys. Earth Planet. Inter. 12, 142-150 (1976)], and has been considered or reinvented a few times since but never widely adopted. This paper presents a unified theoretical picture of this time-frequency analysis method, the time-corrected instantaneous frequency spectrogram, together with detailed implementable algorithms comparing three published techniques for its computation. The new representation is evaluated against the conventional spectrogram for its superior ability to track signal components. The lack of a uniform framework for either mathematics or implementation details which has characterized the disparate literature on the schemes has been remedied here. Fruitful application of the method is shown in the realms of speech phonation analysis, whale song pitch tracking, and additive sound modeling.
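One simple route to the frequency coordinate of the time-corrected spectrogram is the phase-vocoder estimate of channelized instantaneous frequency: the phase advance between successive STFT frames, corrected for the rotation expected from each bin's centre frequency. A sketch follows (assuming scipy); it omits the time correction that the full method also applies.

```python
import numpy as np
from scipy.signal import stft

def tcif_sketch(x, fs, nperseg=256, hop=64):
    """Channelized instantaneous frequency from STFT frame-to-frame phase advance."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg, noverlap=nperseg - hop)
    dphi = np.angle(Z[:, 1:] * np.conj(Z[:, :-1]))         # measured phase advance
    expected = 2 * np.pi * f[:, None] * hop / fs           # advance implied by bin centre
    dev = np.angle(np.exp(1j * (dphi - expected)))         # deviation, wrapped to (-pi, pi]
    inst_freq = f[:, None] + dev * fs / (2 * np.pi * hop)  # corrected frequency (Hz)
    return f, t[1:], np.abs(Z[:, 1:]), inst_freq

fs = 8000
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * (500 * t + 300 * t**2))  # chirp: IF = 500 + 600 t Hz
f, frames, mag, inst = tcif_sketch(x, fs)
bin_idx = np.abs(f - 800).argmin()              # bin nearest the mid-signal IF (800 Hz)
mid = len(frames) // 2
print(f"bin centre {f[bin_idx]:.1f} Hz -> corrected IF {inst[bin_idx, mid]:.1f} Hz")
```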
Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback
NASA Astrophysics Data System (ADS)
Zhang, Wenle; Liu, Jianchang
2016-04-01
This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved by introducing a variable called the convergence step. In addition, ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, ultra-fast consensus with respect to a reference model and robust consensus are discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.
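The claimed speed-up has a transparent linear-algebra core: folding q predicted steps into each round effectively replaces the iteration matrix W by W^(q+1), whose second-largest eigenvalue modulus (the asymptotic convergence factor) is the routine factor raised to the power q + 1. A numerical sketch follows; the weight matrix is an arbitrary row-stochastic example, not the paper's output-feedback protocol.

```python
import numpy as np

# Row-stochastic weights for a directed ring with self-loops: the topology
# contains a spanning tree, so routine consensus x_{t+1} = W x_t converges.
n = 6
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i + 1) % n] = 0.5

def slem(M):
    """Second-largest eigenvalue modulus: the asymptotic convergence factor."""
    return np.sort(np.abs(np.linalg.eigvals(M)))[-2]

q = 2  # number of predicted steps folded into the protocol
print(f"routine factor            : {slem(W):.4f}")
print(f"(q+1)-step factor         : {slem(np.linalg.matrix_power(W, q + 1)):.4f}")
print(f"routine factor ** (q + 1) : {slem(W) ** (q + 1):.4f}")

# disagreement contracts (q+1)-times faster per round under the lifted iteration
x = np.random.default_rng(0).normal(size=n)
for _ in range(60):
    x = np.linalg.matrix_power(W, q + 1) @ x
print("consensus state:", np.round(x, 4))
```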
A Critical Analysis and Applied Intersectionality Framework with Intercultural Queer Couples.
Chan, Christian D; Erby, Adrienne N
2018-01-01
Intercultural queer couples are growing rapidly in number in the United States, exemplifying diversity across multiple dimensions (e.g., race, ethnicity, sexuality, affectional identity, gender identity) while experiencing multiple converging forms of oppression (e.g., racism, heterosexism, genderism). Given the dearth of conceptual and empirical literature that unifies the intercultural and queer dimensions, applied practice and research contend with a unilateral approach focusing exclusively on either intercultural or queer couples. Intersectionality theory has revolutionized critical scholarship to determine overlapping forms of oppression, decenter hegemonic structures of power relations and social contexts, and enact a social justice agenda. This article addresses the following aims: (1) an overview of the gaps eliciting unilateral approaches to intercultural queer couples; (2) an illustration of intersectionality's theoretical underpinnings as a critical approach; and (3) applications for insights in practice and research with intercultural queer couples.
Probabilistic self-organizing maps for continuous data.
Lopez-Rubio, Ezequiel
2010-10-01
The original self-organizing feature map did not define any probability distribution on the input space. However, the advantages of introducing probabilistic methodologies into self-organizing map models were soon evident. This has led to a wide range of proposals which reflect the current emergence of probabilistic approaches to computational intelligence. The underlying estimation theories behind them derive from two main lines of thought: the expectation maximization methodology and stochastic approximation methods. Here, we present a comprehensive view of the state of the art, with a unifying perspective of the involved theoretical frameworks. In particular, we examine the most commonly used continuous probability distributions, self-organization mechanisms, and learning schemes. Special emphasis is given to the connections among them and their relative advantages depending on the characteristics of the problem at hand. Furthermore, we evaluate their performance in two typical applications of self-organizing maps: classification and visualization.
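For orientation, the classic stochastic-approximation SOM that these probabilistic variants generalize can be sketched in a few lines; probabilistic versions replace the hard winner below with mixture-model responsibilities updated EM-style. A minimal sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(X, grid=(8, 8), n_iter=5000, lr0=0.5, sigma0=3.0):
    """Classic stochastic-approximation SOM; probabilistic SOMs replace the
    hard winner below with posterior responsibilities under a mixture model."""
    rows, cols = grid
    W = rng.normal(0, 0.1, size=(rows * cols, X.shape[1]))
    # fixed grid coordinates used to compute neighborhood distances
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(n_iter):
        lr = lr0 * (1 - t / n_iter)                   # decaying learning rate
        sigma = sigma0 * (1 - t / n_iter) + 0.5       # shrinking neighborhood
        x = X[rng.integers(len(X))]
        winner = np.argmin(((W - x) ** 2).sum(axis=1))        # hard competition
        h = np.exp(-((coords - coords[winner]) ** 2).sum(axis=1) / (2 * sigma**2))
        W += lr * h[:, None] * (x - W)                # neighborhood-weighted update
    return W

X = rng.normal(size=(1000, 3))  # toy 3-D data
W = train_som(X)
print("codebook shape:", W.shape)
```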
Lens and dendrite formation during colloidal solidification
NASA Astrophysics Data System (ADS)
Worster, Grae; You, Jiaxue
2017-11-01
Colloidal particles in suspension are forced into a variety of morphologies when the suspending fluid medium is frozen: soil is compacted between ice lenses during frost heave; ice templating is a recent and growing technology to produce bio-inspired, micro-porous materials; cells and tissue can be damaged during cryosurgery; and metal-matrix composites with tailored microstructure can be fabricated by controlled casting. Various instabilities that affect the microscopic morphology are controlled by fluid flow through the compacted layer of particles that accumulates ahead of the solidification front. By analysing the flow in connection with equilibrium phase relationships, we develop a theoretical framework that identifies two different mechanisms for ice-lens formation, with and without a frozen fringe, identifies the external parameters that differentiate between them and determine the possibility of dendritic formations, and unifies a range of apparently disparate conclusions drawn from previous experimental studies.
Physics of Alfvén waves and energetic particles in burning plasmas
NASA Astrophysics Data System (ADS)
Chen, Liu; Zonca, Fulvio
2016-01-01
Dynamics of shear Alfvén waves and energetic particles are crucial to the performance of burning fusion plasmas. This article reviews linear as well as nonlinear physics of shear Alfvén waves and their self-consistent interaction with energetic particles in tokamak fusion devices. More specifically, the review on the linear physics deals with wave spectral properties and collective excitations by energetic particles via wave-particle resonances. The nonlinear physics deals with nonlinear wave-wave interactions as well as nonlinear wave-energetic particle interactions. Both linear as well as nonlinear physics demonstrate the qualitatively important roles played by realistic equilibrium nonuniformities, magnetic field geometries, and the specific radial mode structures in determining the instability evolution, saturation, and, ultimately, energetic-particle transport. These topics are presented within a single unified theoretical framework, where experimental observations and numerical simulation results are referred to elucidate concepts and physics processes.
Giordano, Bruno L.; Kayser, Christoph; Rousselet, Guillaume A.; Gross, Joachim; Schyns, Philippe G.
2016-01-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. PMID:27860095
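The estimator itself is compact: rank-transform each variable through its empirical CDF, map to standard-normal scores, and apply the closed-form Gaussian mutual information. A bivariate sketch follows (assuming scipy); the published estimator adds small-sample bias corrections that are omitted here.

```python
import numpy as np
from scipy.stats import rankdata, norm

def copnorm(x):
    """Copula normalization: empirical CDF, then inverse standard-normal CDF."""
    return norm.ppf(rankdata(x) / (len(x) + 1))

def gcmi_bits(x, y):
    """Gaussian-copula mutual information (bits) between two 1-D variables;
    a lower-bound style estimate that models only the rank dependence."""
    r = np.corrcoef(copnorm(x), copnorm(y))[0, 1]
    return -0.5 * np.log2(1 - r**2)

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = np.tanh(x) + 0.5 * rng.normal(size=2000)   # nonlinear, monotone dependence
print(f"GCMI(x, y)          = {gcmi_bits(x, y):.3f} bits")
print(f"GCMI(x, shuffled y) = {gcmi_bits(x, rng.permutation(y)):.3f} bits")
```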
Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research
ERIC Educational Resources Information Center
Fan, Xitao; Sun, Shaojing
2014-01-01
In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…
Coffman, Valerie C.; McDermott, Matthew B. A.; Shtylla, Blerta; Dawes, Adriana T.
2016-01-01
Positioning of microtubule-organizing centers (MTOCs) incorporates biochemical and mechanical cues for proper alignment of the mitotic spindle and cell division site. Current experimental and theoretical studies in the early Caenorhabditis elegans embryo assume remarkable changes in the origin and polarity of forces acting on the MTOCs. These changes must occur over a few minutes, between initial centration and rotation of the pronuclear complex and entry into mitosis, and the models do not replicate in vivo timing of centration and rotation. Here we propose a model that incorporates asymmetry in the microtubule arrays generated by each MTOC, which we demonstrate with in vivo measurements, and a similar asymmetric force profile to that required for posterior-directed spindle displacement during mitosis. We find that these asymmetries are capable of and important for recapitulating the simultaneous centration and rotation of the pronuclear complex observed in vivo. The combination of theoretical and experimental evidence provided here offers a unified framework for the spatial organization and forces needed for pronuclear centration, rotation, and spindle displacement in the early C. elegans embryo. PMID:27733624
RT-18: Value of Flexibility. Phase 1
2010-09-25
During this period, we explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory … a framework that is mathematically consistent, domain independent, and applicable under varying information levels. This report presents our advances in …
Framework Design of Unified Cross-Authentication Based on the Fourth Platform Integrated Payment
NASA Astrophysics Data System (ADS)
Yong, Xu; Yujin, He
The essay advances a unified cross-authentication scheme based on the fourth integrated payment platform. The research aims at improving the compatibility of authentication in electronic business and providing a reference for the establishment of a credit system by seeking a way to carry out standard unified authentication on an integrated payment platform. The essay introduces the concept of the fourth integrated payment platform and finally puts forward the overall structure and its components. The main focus of the essay is the design of the credit system for the fourth integrated payment platform and of the PKI/CA structure.
Wiltshire, Travis J.; Lobato, Emilio J. C.; McConnell, Daniel S.; Fiore, Stephen M.
2015-01-01
In this paper we suggest that differing approaches to the science of social cognition mirror the arguments between radical embodied and traditional approaches to cognition. We contrast the use in social cognition of theoretical inference and mental simulation mechanisms with approaches emphasizing a direct perception of others’ mental states. We build from a recent integrative framework unifying these divergent perspectives through the use of dual-process theory and supporting social neuroscience research. Our elaboration considers two complementary notions of direct perception: one primarily stemming from ecological psychology and the other from enactive cognition theory. We use this as the foundation from which to offer an account of the informational basis for social information and assert a set of research propositions to further the science of social cognition. In doing so, we point out how perception of the minds of others can be supported in some cases by lawful information, supporting direct perception of social affordances and perhaps, mental states, and in other cases by cues that support indirect perceptual inference. Our goal is to extend accounts of social cognition by integrating advances across disciplines to provide a multi-level and multi-theoretic description that can advance this field and offer a means through which to reconcile radical embodied and traditional approaches to cognitive neuroscience. PMID:25709572
Structural design using equilibrium programming formulations
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.
1995-01-01
Solutions to increasingly large structural optimization problems are desired, but computational resources are strained to meet this need, so new methods are required. Present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions) and can serve as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first is another approximation technique: a general updating scheme for the sensitivity derivatives of design constraints. The second uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
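Fully stressed design, one of the techniques the paper recasts in the equilibrium-programming framework, iterates a stress-ratio resize rule to a fixed point, because in an indeterminate structure the internal forces depend on the member sizes. A toy sketch with two parallel bars sharing a load; all values and the minimum-gauge constraint are illustrative assumptions, not from the paper.

```python
import numpy as np

# Two parallel bars (same modulus E) between a rigid support and a rigid
# block carrying load P, so both bars undergo the same elongation delta.
E, P, sigma_allow = 70e9, 1.0e5, 1.0e8     # Pa, N, Pa
L = np.array([1.0, 1.5])                   # bar lengths (m)
A = np.array([1e-4, 1e-4])                 # initial cross-sections (m^2)
A_min = 1e-6                               # minimum-gauge constraint (m^2)

for it in range(40):
    k = E * A / L                          # axial stiffnesses (N/m)
    delta = P / k.sum()                    # common elongation from equilibrium
    sigma = E * delta / L                  # member stresses (Pa)
    A_new = np.maximum(A_min, A * sigma / sigma_allow)   # stress-ratio resize
    if np.allclose(A_new, A, rtol=1e-12):
        break
    A = A_new

sigma = E * (P / (E * A / L).sum()) / L
print(f"areas (m^2): {A}, stresses (MPa): {np.round(sigma / 1e6, 1)}")
# The lightly stressed longer bar is driven to minimum gauge, a well-known
# trait of fully stressed design; the remaining bar carries the load at the
# allowable stress.
```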
Energy efficiency as a unifying principle for human, environmental, and global health
Fontana, Luigi; Atella, Vincenzo; Kammen, Daniel M
2013-01-01
A strong analogy exists between over/under consumption of energy at the level of the human body and of the industrial metabolism of humanity. Both forms of energy consumption have profound implications for human, environmental, and global health. Globally, excessive fossil-fuel consumption, and individually, excessive food energy consumption are both responsible for a series of interrelated detrimental effects, including global warming, extreme weather conditions, damage to ecosystems, loss of biodiversity, widespread pollution, obesity, cancer, chronic respiratory disease, and other lethal chronic diseases. In contrast, data show that the efficient use of energy—in the form of food as well as fossil fuels and other resources—is vital for promoting human, environmental, and planetary health and sustainable economic development. While it is not new to highlight how efficient use of energy and food can address some of the key problems our world is facing, little research and no unifying framework exists to harmonize these concepts of sustainable system management across diverse scientific fields into a single theoretical body. Insights beyond reductionist views of efficiency are needed to encourage integrated changes in the use of the world’s natural resources, with the aim of achieving a wiser use of energy, better farming systems, and healthier dietary habits. This perspective highlights a range of scientific-based opportunities for cost-effective pro-growth and pro-health policies while using less energy and natural resources. PMID:24555053
A State Space Model for Spatial Updating of Remembered Visual Targets during Eye Movements
Mohsenzadeh, Yalda; Dash, Suryadeep; Crawford, J. Douglas
2016-01-01
In the oculomotor system, spatial updating is the ability to aim a saccade toward a remembered visual target position despite intervening eye movements. Although this has been the subject of extensive experimental investigation, there is still no unifying theoretical framework to explain the neural mechanism for this phenomenon and how it influences visual signals in the brain. Here, we propose a unified state-space model (SSM) to account for the dynamics of spatial updating during two types of eye movement: saccades and smooth pursuit. Our proposed model is a non-linear SSM implemented through a recurrent radial-basis-function neural network in a dual Extended Kalman filter (EKF) structure. The model parameters and internal states (remembered target position) are estimated sequentially using the EKF method. The proposed model replicates two fundamental experimental observations: continuous gaze-centered updating of visual memory-related activity during smooth pursuit, and predictive remapping of visual memory activity before and during saccades. Moreover, our model makes the new prediction that, when uncertainty of input signals is incorporated in the model, neural population activity and receptive fields expand just before and during saccades. These results suggest that visual remapping and motor updating are part of a common visuomotor mechanism, and that subjective perceptual constancy arises in part from training the visual system on motor tasks. PMID:27242452
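The paper's dual-EKF structure builds on the standard extended Kalman filter recursion, which is worth having concretely in view. A generic sketch on a toy scalar system follows; it is not the paper's recurrent RBF network model, and the system and noise levels are invented for illustration.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One extended Kalman filter predict/update cycle.
    f, h: process and measurement functions; F, H: their Jacobians."""
    # predict
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q
    # update
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R
    K = P_pred @ H_k.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new

# toy 1-D nonlinear system: x_{t+1} = 0.9 x + 0.1 sin(x) + w,  z = x^2/2 + v
f = lambda x: 0.9 * x + 0.1 * np.sin(x)
F = lambda x: np.array([[0.9 + 0.1 * np.cos(x[0])]])
h = lambda x: np.array([x[0] ** 2 / 2])
H = lambda x: np.array([[x[0]]])
Q, R = np.array([[1e-3]]), np.array([[1e-2]])

rng = np.random.default_rng(0)
x_true, x_est, P = np.array([1.0]), np.array([0.5]), np.eye(1)
for t in range(20):
    x_true = f(x_true) + rng.normal(0, np.sqrt(Q[0, 0]), 1)
    z = h(x_true) + rng.normal(0, np.sqrt(R[0, 0]), 1)
    x_est, P = ekf_step(x_est, P, z, f, F, h, H, Q, R)
print(f"true state {x_true[0]:.3f}, EKF estimate {x_est[0]:.3f}")
```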
Intuitive and deliberate judgments are based on common principles.
Kruglanski, Arie W; Gigerenzer, Gerd
2011-01-01
A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. This juxtaposition has aligned in dual-process theories of reasoning associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) versus rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberative judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and in which parts of the information are ignored can be more accurate than cognitive strategies that have more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.
Toward a unifying taxonomy and definition for meditation
Nash, Jonathan D.; Newberg, Andrew
2013-01-01
One of the well-documented concerns confronting scholarly discourse about meditation is the plethora of semantic constructs and the lack of a unified definition and taxonomy. In recent years there have been several notable attempts to formulate new lexicons in order to define and categorize meditation methods. While these constructs have been useful and have encountered varying degrees of acceptance, they have also been subject to misinterpretation and debate, leaving the field devoid of a consensual paradigm. This paper attempts to influence this ongoing discussion by proposing two new models which hold the potential for enhanced scientific reliability and acceptance. Regarding the quest for a universally acceptable taxonomy, we suggest a paradigm shift away from the norm of fabricating new terminology from a first-person perspective. As an alternative, we propose a new taxonomic system based on the historically well-established and commonly accepted third-person paradigm of Affect and Cognition, borrowed, in part, from the psychological and cognitive sciences. With regard to the elusive definitional problem, we propose a model of meditation which clearly distinguishes "method" from "state" and is conceptualized as a dynamic process which is inclusive of six related but distinct stages. The overall goal is to provide researchers with a reliable nomenclature with which to categorize and classify diverse meditation methods, and a conceptual framework which can provide direction for their research and a theoretical basis for their findings. PMID:24312060
A unified framework for building high performance DVEs
NASA Astrophysics Data System (ADS)
Lei, Kaibin; Ma, Zhixia; Xiong, Hua
2011-10-01
A unified framework for integrating PC cluster based parallel rendering with distributed virtual environments (DVEs) is presented in this paper. While various scene graphs have been proposed in DVEs, it is difficult to enable collaboration of different scene graphs. This paper proposes a technique for non-distributed scene graphs with the capability of object and event distribution. With the increase of graphics data, DVEs require more powerful rendering ability. But general scene graphs are inefficient in parallel rendering. The paper also proposes a technique to connect a DVE and a PC cluster based parallel rendering environment. A distributed multi-player video game is developed to show the interaction of different scene graphs and the parallel rendering performance on a large tiled display wall.
Unified treatment of the luminosity distance in cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoo, Jaiyul; Scaccabarozzi, Fulvio, E-mail: jyoo@physik.uzh.ch, E-mail: fulvio@physik.uzh.ch
Comparing luminosity distance measurements to their theoretical predictions is one of the cornerstones of establishing modern cosmology. However, as shown in Biern and Yoo, the theoretical predictions in the literature are often plagued with infrared divergences and gauge dependences. This trend calls into question the sanity of the methods used to derive the luminosity distance. Here we critically investigate four different methods—the geometric approach, the Sachs approach, the Jacobi mapping approach, and the geodesic light cone (GLC) approach—to modeling the luminosity distance, and we present a unified treatment of these methods, facilitating comparison among them and checking their sanity. All four methods, if exercised properly, can be used to reproduce the correct description of the luminosity distance.
Unified Behavior Framework for Discrete Event Simulation Systems
2015-03-26
… a teleo-reactive architecture [11]. Teleo-Reactive Programs (TRPs) are composed of a list of rules, where each rule has a condition and an action. When the …
The need for international nursing diagnosis research and a theoretical framework.
Lunney, Margaret
2008-01-01
To describe the need for nursing diagnosis research and a theoretical framework for such research. A linguistics theory served as the foundation for the theoretical framework. Reasons for additional nursing diagnosis research are: (a) file names are needed for implementation of electronic health records, (b) international consensus is needed for an international classification, and (c) continuous changes occur in clinical practice. A theoretical framework used by the author is explained. Theoretical frameworks provide support for nursing diagnosis research. Linguistics theory served as an appropriate exemplar theory to support nursing research. Additional nursing diagnosis studies based upon a theoretical framework are needed and linguistics theory can provide an appropriate structure for this research.
Evolutionary game theory meets social science: is there a unifying rule for human cooperation?
Rosas, Alejandro
2010-05-21
Evolutionary game theory has shown that human cooperation thrives in different types of social interactions with a Prisoner's Dilemma (PD) structure. Models treat the cooperative strategies within the different frameworks as discrete entities and sometimes even as contenders. Whereas strong reciprocity was acclaimed as superior to classic reciprocity for its ability to defeat defectors in public goods games, recent experiments and simulations show that costly punishment fails to promote cooperation in the indirect reciprocity (IR) and direct reciprocity (DR) games, where classic reciprocity succeeds. My aim is to show that cooperative strategies across frameworks are capable of a unified treatment, for they are governed by a common underlying rule or norm. An analysis of the reputation and action rules that govern some representative cooperative strategies both in models and in economic experiments confirms that the different frameworks share a conditional action rule and several reputation rules. The common conditional rule contains an option between costly punishment and withholding benefits that provides alternative enforcement methods against defectors. Depending on the framework, individuals can switch to the appropriate strategy and method of enforcement. The stability of human cooperation looks more promising if one mechanism controls successful strategies across frameworks.
Fraction of a dose absorbed estimation for structurally diverse low solubility compounds.
Sugano, Kiyohiko
2011-02-28
The purpose of the present study was to investigate the prediction accuracy of the fully mechanistic gastrointestinal unified theoretical (GUT) framework for in vivo oral absorption of low-solubility drugs. Solubility in biorelevant media, molecular weight, logP(oct), pK(a), Caco-2 permeability, dose, and particle size were used as the input parameters. To neglect the effect of low stomach pH on the dissolution of a drug, the fraction of a dose absorbed (Fa%) of undissociable and free-acid drugs was used. In addition, Fa% of free-base drugs with a high stomach pH was also included to increase the number of model drugs. In total, twenty-nine structurally diverse compounds were used as the model drugs. Fa% data at several doses and particle sizes in humans and dogs were collated from the literature (110 Fa% data in total). In approximately 80% of cases, the prediction error was within 2-fold, suggesting that the GUT framework has practical predictability for drug discovery, but not for drug development. The GUT framework appropriately captured the dose and particle size dependency of Fa% as the particle drifting effect was taken into account. It should be noted that the present validation results cannot be applied to salt forms and other special formulations such as solid dispersions and emulsion formulations.
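The full GUT framework is beyond a snippet, but its qualitative skeleton, absorption over a fixed small-intestinal transit time with dissolution and permeation acting as series rate limits, can be sketched. Every parameter value and the series-resistance simplification below are assumptions for illustration, not the paper's equations.

```python
import numpy as np

def fa_percent(dose_mg, solubility_mg_ml, peff_cm_s, radius_um,
               transit_min=210.0, si_volume_ml=130.0, si_radius_cm=1.5,
               diff_cm2_s=7e-6, density_mg_ml=1200.0):
    """Very simplified fraction-absorbed sketch: first-order absorption with
    dissolution and permeation rate constants combined in series."""
    r_cm = radius_um * 1e-4
    # Noyes-Whitney-style dissolution rate constant (1/s), crude h ~ r form
    k_diss = (3 * diff_cm2_s * solubility_mg_ml) / (density_mg_ml * r_cm**2)
    # crude cap for solubility-limited dissolution relative to the dose
    k_diss *= min(1.0, solubility_mg_ml * si_volume_ml / dose_mg)
    # permeation rate constant from tube geometry: ka = 2 Peff / R (1/s)
    k_perm = 2 * peff_cm_s / si_radius_cm
    k_abs = 1.0 / (1.0 / k_diss + 1.0 / k_perm)   # series resistances
    return 100 * (1 - np.exp(-k_abs * transit_min * 60))

# particle-size dependency for a hypothetical low-solubility compound
print(f"Fa = {fa_percent(dose_mg=100, solubility_mg_ml=0.01, peff_cm_s=2e-4, radius_um=10):.1f}%")
print(f"Fa = {fa_percent(dose_mg=100, solubility_mg_ml=0.01, peff_cm_s=2e-4, radius_um=50):.1f}%")
```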
ERIC Educational Resources Information Center
MacLean, Justine; Mulholland, Rosemary; Gray, Shirley; Horrell, Andrew
2015-01-01
Background: Curriculum for Excellence, a new national policy initiative in Scottish Schools, provides a unified curricular framework for children aged 3-18. Within this framework, Physical Education (PE) now forms part of a collective alongside physical activity and sport, subsumed by the newly created curriculum area of "Health and…
NASA Astrophysics Data System (ADS)
Odintsov, S. D.; Oikonomou, V. K.
2016-06-01
We present some cosmological models which unify the late- and early-time acceleration eras with the radiation and matter domination eras, and we realize the cosmological models by using the theoretical framework of F(R) gravity. Particularly, the first model unifies the late- and early-time acceleration with the matter domination era, and the second model unifies all the evolution eras of our Universe. The two models are described in the same way at early and late times, and only the intermediate stages of the evolution have some differences. Each cosmological model contains two Type IV singularities which are chosen to occur one at the end of the inflationary era and one at the end of the matter domination era. The cosmological models at early times are approximately identical to the R² inflation model, so they describe a slow-roll inflationary era which ends when the slow-roll parameters become of order one. The inflationary era is followed by the radiation era, after which the matter domination era follows, lasting until the second Type IV singularity, and then the late-time acceleration era follows. The models have two appealing features: firstly, they produce a nearly scale-invariant power spectrum of primordial curvature perturbations and a tensor-to-scalar ratio which are compatible with the most recent observational data; secondly, it seems that the deceleration-acceleration transition is crucially affected by the presence of the second Type IV singularity which occurs at the end of the matter domination era. As we demonstrate, the Hubble horizon at early times shrinks, as expected for an initially accelerating Universe; then, during the matter domination era, it expands; and finally, after the Type IV singularity, the Hubble horizon starts to shrink again during the late-time acceleration era. Intriguingly enough, the deceleration-acceleration transition occurs after the second Type IV singularity. In addition, we investigate which F(R) gravity can successfully realize each of the four cosmological epochs.
Chao, Anne; Chiu, Chun-Huo; Colwell, Robert K; Magnago, Luiz Fernando S; Chazdon, Robin L; Gotelli, Nicholas J
2017-11-01
Estimating the species, phylogenetic, and functional diversity of a community is challenging because rare species are often undetected, even with intensive sampling. The Good-Turing frequency formula, originally developed for cryptography, estimates in an ecological context the true frequencies of rare species in a single assemblage based on an incomplete sample of individuals. Until now, this formula has never been used to estimate undetected species, phylogenetic, and functional diversity. Here, we first generalize the Good-Turing formula to incomplete sampling of two assemblages. The original formula and its two-assemblage generalization provide a novel and unified approach to notation, terminology, and estimation of undetected biological diversity. For species richness, the Good-Turing framework offers an intuitive way to derive the non-parametric estimators of the undetected species richness in a single assemblage, and of the undetected species shared between two assemblages. For phylogenetic diversity, the unified approach leads to an estimator of the undetected Faith's phylogenetic diversity (PD, the total length of undetected branches of a phylogenetic tree connecting all species), as well as a new estimator of undetected PD shared between two phylogenetic trees. For functional diversity based on species traits, the unified approach yields a new estimator of undetected Walker et al.'s functional attribute diversity (FAD, the total species-pairwise functional distance) in a single assemblage, as well as a new estimator of undetected FAD shared between two assemblages. Although some of the resulting estimators have been previously published (but derived with traditional mathematical inequalities), all taxonomic, phylogenetic, and functional diversity estimators are now derived under the same framework. All the derived estimators are theoretically lower bounds of the corresponding undetected diversities; our approach reveals the sufficient conditions under which the estimators are nearly unbiased, thus offering new insights. Simulation results are reported to numerically verify the performance of the derived estimators. We illustrate all estimators and assess their sampling uncertainty with an empirical dataset for Brazilian rain forest trees. These estimators should be widely applicable to many current problems in ecology, such as the effects of climate change on spatial and temporal beta diversity and the contribution of trait diversity to ecosystem multi-functionality.
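The single-assemblage species-richness case reduces to the familiar Chao1 estimator, in which the Good-Turing singleton count carries the signal of the unseen. A minimal sketch follows, covering Chao1 and Good-Turing sample coverage only; the two-assemblage, phylogenetic, and functional extensions are not shown.

```python
from collections import Counter

def chao1(sample):
    """Chao1 lower-bound estimate of true species richness from one assemblage,
    driven by the Good-Turing idea that singletons signal undetected species."""
    counts = Counter(sample)
    s_obs = len(counts)
    f1 = sum(1 for c in counts.values() if c == 1)  # singletons
    f2 = sum(1 for c in counts.values() if c == 2)  # doubletons
    # bias-corrected form, safe when f2 == 0
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

def good_turing_coverage(sample):
    """Estimated sample coverage: the chance the next individual belongs
    to an already-detected species."""
    counts = Counter(sample)
    f1 = sum(1 for c in counts.values() if c == 1)
    return 1 - f1 / len(sample)

sample = list("aaaabbbccddefg")  # letters stand for species of sampled individuals
print(f"observed richness : {len(set(sample))}")
print(f"Chao1 estimate    : {chao1(sample):.2f}")
print(f"sample coverage   : {good_turing_coverage(sample):.2f}")
```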
Statistical learning as an individual ability: Theoretical perspectives and empirical evidence
Siegelman, Noam; Frost, Ram
2015-01-01
Although the power of statistical learning (SL) in explaining a wide range of linguistic functions is gaining increasing support, relatively little research has focused on this theoretical construct from the perspective of individual differences. However, to be able to reliably link individual differences in a given ability such as language learning to individual differences in SL, three critical theoretical questions should be posed: Is SL a componential or unified ability? Is it nested within other general cognitive abilities? Is it a stable capacity of an individual? Following an initial mapping sentence outlining the possible dimensions of SL, we employed a battery of SL tasks in the visual and auditory modalities, using verbal and non-verbal stimuli, with adjacent and non-adjacent contingencies. SL tasks were administered along with general cognitive tasks in a within-subject design at two time points to explore our theoretical questions. We found that SL, as measured by some tasks, is a stable and reliable capacity of an individual. Moreover, we found SL to be independent of general cognitive abilities such as intelligence or working memory. However, SL is not a unified capacity, so that individual sensitivity to conditional probabilities is not uniform across modalities and stimuli. PMID:25821343
Tomalia, Donald A; Khanna, Shiv N
2016-02-24
Development of a central paradigm is undoubtedly the single most influential force responsible for advancing Dalton's 19th century atomic/molecular chemistry concepts to the current maturity enjoyed by traditional chemistry. A similar central dogma for guiding and unifying nanoscience has been missing. This review traces the origins, evolution, and current status of such a critical nanoperiodic concept/framework for defining and unifying nanoscience. Based on parallel efforts and a mutual consensus now shared by both chemists and physicists, a nanoperiodic/systematic framework concept has emerged. This concept is based on the well-documented existence of discrete, nanoscale collections of traditional inorganic/organic atoms referred to as hard and soft superatoms (i.e., nanoelement categories). These nanometric entities are widely recognized to exhibit nanoscale atom mimicry features reminiscent of traditional picoscale atoms. All unique superatom/nanoelement physicochemical features are derived from quantized structural control defined by six critical nanoscale design parameters (CNDPs), namely, size, shape, surface chemistry, flexibility/rigidity, architecture, and elemental composition. These CNDPs determine all intrinsic superatom properties, their combining behavior to form stoichiometric nanocompounds/assemblies as well as to exhibit nanoperiodic properties leading to new nanoperiodic rules and predictive Mendeleev-like nanoperiodic tables, and they portend possible extension of these principles to larger quantized building blocks including meta-atoms.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240
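Rendered as a display inequality, the phenotype robustness criterion quoted above reads as follows; the R symbols are editorial shorthand for the four robustness measures named in the abstract.

```latex
R_{\mathrm{intrinsic}} + R_{\mathrm{genetic}} + R_{\mathrm{environmental}} \;\leq\; R_{\mathrm{network}}
```

When the perturbation budget on the left stays within the network robustness on the right, phenotypic stability is maintained despite parameter fluctuations, genetic variations, and environmental disturbances.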
A new view of Baryon symmetric cosmology based on grand unified theories
NASA Technical Reports Server (NTRS)
Stecker, F. W.
1981-01-01
Within the framework of grand unified theories, it is shown how spontaneous CP violation leads to a domain structure in the universe with the domains evolving into separate regions of matter and antimatter excesses. Subsequent to exponential horizon growth, this can result in a universe of matter galaxies and antimatter galaxies. Various astrophysical data appear to favor this form of big bang cosmology. Future direct tests for cosmologically significant antimatter are discussed.
Pelowski, Matthew; Markey, Patrick S.; Lauring, Jon O.; Leder, Helmut
2016-01-01
The last decade has witnessed a renaissance of empirical and psychological approaches to art study, especially regarding cognitive models of art processing experience. This new emphasis on modeling has often become the basis for our theoretical understanding of human interaction with art. Models also often define areas of focus and hypotheses for new empirical research, and are increasingly important for connecting psychological theory to discussions of the brain. However, models are often made by different researchers, with quite different emphases or visual styles. Inputs and psychological outcomes may be differently considered, or can be under-reported with regards to key functional components. Thus, we may lose the major theoretical improvements and ability for comparison that can be had with models. To begin addressing this, this paper presents a theoretical assessment, comparison, and new articulation of a selection of key contemporary cognitive or information-processing-based approaches detailing the mechanisms underlying the viewing of art. We review six major models in contemporary psychological aesthetics. We in turn present redesigns of these models using a unified visual form, in some cases making additions or creating new models where none had previously existed. We also frame these approaches in respect to their targeted outputs (e.g., emotion, appraisal, physiological reaction) and their strengths within a more general framework of early, intermediate, and later processing stages. This is used as a basis for general comparison and discussion of implications and future directions for modeling, and for theoretically understanding our engagement with visual art. PMID:27199697
Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model
Wako, Hiroshi; Abe, Haruo
2016-01-01
The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding. PMID:28409079
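The Φ value itself has a compact operational definition: the mutation-induced destabilization of the transition state (from folding kinetics) divided by the destabilization of the native state (from equilibrium stability), both relative to the unfolded state; Φ near 1 indicates a native-like site in the transition state, Φ near 0 an unfolded-like site. A small sketch with illustrative numbers, not data from the paper:

```python
import numpy as np

R, T = 8.314e-3, 298.0  # gas constant (kJ/(mol K)) and temperature (K)

def phi_value(kf_wt, kf_mut, dG_wt, dG_mut):
    """Phi = ddG(transition state) / ddG(native state), unfolded state as reference.
    kf: folding rate constants; dG: unfolding free energies (kJ/mol, positive = stable)."""
    ddG_ts = -R * T * np.log(kf_mut / kf_wt)   # destabilization of the transition state
    ddG_eq = dG_wt - dG_mut                    # destabilization of the native state
    return ddG_ts / ddG_eq

# a mutation that slows folding 10-fold and destabilizes the native state by 5.7 kJ/mol
print(f"Phi = {phi_value(kf_wt=100.0, kf_mut=10.0, dG_wt=20.0, dG_mut=14.3):.2f}")
```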
Grand unified brane world scenario
NASA Astrophysics Data System (ADS)
Arai, Masato; Blaschke, Filip; Eto, Minoru; Sakai, Norisuke
2017-12-01
We present a field-theoretical model unifying grand unified theory (GUT) and the brane world scenario. As a concrete example, we consider SU(5) GUT in 4+1 dimensions where our 3+1 dimensional spacetime spontaneously arises on five domain walls. A field-dependent gauge kinetic term is used to localize massless non-Abelian gauge fields on the domain walls and to assure the charge universality of matter fields. We find the domain walls with the symmetry breaking SU(5) → SU(3) × SU(2) × U(1) as a global minimum, and all the undesirable moduli are stabilized with the mass scale of M_GUT. Profiles of massless standard model particles are determined as a consequence of wall dynamics. The proton decay can be exponentially suppressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, Vincent; Gettelman, Andrew; Morrison, Hugh
In state-of-the-art climate models, each cloud type is treated using its own separate cloud parameterization and its own separate microphysics parameterization. This use of separate schemes for separate cloud regimes is undesirable because it is theoretically unfounded, it hampers interpretation of results, and it leads to the temptation to overtune parameters. In this grant, we are creating a climate model that contains a unified cloud parameterization and a unified microphysics parameterization. This model will be used to address the problems of excessive frequency of drizzle in climate models and excessively early onset of deep convection in the Tropics over land. The resulting model will be compared with ARM observations.
Celedonio Aguirre-Bravo; Carlos Rodriguez Franco
1999-01-01
The general objective of this Symposium was to build on the best science and technology available to assure that the data and information produced in future inventory and monitoring programs are comparable, quality assured, available, and adequate for their intended purposes, thereby providing a reliable framework for characterization, assessment, and management of...
ERIC Educational Resources Information Center
Molina, Otilia Alejandro; Ratté, Sylvie
2017-01-01
This research introduces a method to construct a unified representation of teachers' and students' perspectives based on the actionable knowledge discovery (AKD) and delivery framework. The representation is constructed using two models: one obtained from student evaluations and the other obtained from teachers' reflections about their teaching…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
ERIC Educational Resources Information Center
National Center for Education Statistics, 2011
2011-01-01
Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…
Teaching Introductory Business Statistics Using the DCOVA Framework
ERIC Educational Resources Information Center
Levine, David M.; Stephan, David F.
2011-01-01
Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…
Evaluating Health Information Systems Using Ontologies
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
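As a rough sketch of the organize/unify/aggregate step described in the two UVON abstracts above, the Python fragment below merges per-system quality attributes, each expressed as a path from a broad category down to a specific attribute, into one tree and then extracts a bounded number of evaluation aspects breadth-first, so that broader, more shared aspects surface first. All names are hypothetical illustrations; this is not the UVON implementation.

```python
from collections import defaultdict

def build_ontology(attribute_paths):
    """Merge per-system quality attributes, each given as a path from a
    broad category down to a specific attribute, into one tree."""
    tree = lambda: defaultdict(tree)  # recursively nested dictionaries
    root = tree()
    for path in attribute_paths:
        node = root
        for label in path:
            node = node[label]
    return root

def extract_aspects(root, max_aspects):
    """Walk the unified tree breadth-first and return up to max_aspects
    node labels, so broader (more shared) aspects come out first."""
    queue, aspects = list(root.items()), []
    while queue and len(aspects) < max_aspects:
        label, children = queue.pop(0)
        aspects.append(label)
        queue.extend(children.items())
    return aspects

# Attributes elicited from three hypothetical eHealth systems:
paths = [("usability", "learnability"), ("usability", "efficiency"),
         ("security", "confidentiality"), ("usability",)]
onto = build_ontology(paths)
print(extract_aspects(onto, 3))  # ['usability', 'security', 'learnability']
```

The breadth-first cutoff mirrors the abstract's point about bounding the number of evaluation aspects and controlling their degree of specificity.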
The information geometry of chaos
NASA Astrophysics Data System (ADS)
Cafaro, Carlo
2008-10-01
In this Thesis, we propose a new theoretical information-geometric framework (IGAC, Information Geometrodynamical Approach to Chaos) suitable to characterize chaotic dynamical behavior of arbitrary complex systems. First, the problem being investigated is defined; its motivation and relevance are discussed. The basic tools of information physics and the relevant mathematical tools employed in this work are introduced. The basic aspects of Entropic Dynamics (ED) are reviewed. ED is an information-constrained dynamics developed by Ariel Caticha to investigate the possibility that laws of physics---either classical or quantum---may emerge as macroscopic manifestations of underlying microscopic statistical structures. ED is of primary importance in our IGAC. The notion of chaos in classical and quantum physics is introduced. Special focus is devoted to the conventional Riemannian geometrodynamical approach to chaos (Jacobi geometrodynamics) and to the Zurek-Paz quantum chaos criterion of linear entropy growth. After presenting this background material, we show that the ED formalism is not purely an abstract mathematical framework, but is indeed a general theoretical scheme from which conventional Newtonian dynamics is obtained as a special limiting case. The major elements of our IGAC and the novel notion of information geometrodynamical entropy (IGE) are introduced by studying two "toy models". To illustrate the potential power of our IGAC, one application is presented. An information-geometric analogue of the Zurek-Paz quantum chaos criterion of linear entropy growth is suggested. Finally, concluding remarks emphasizing strengths and weak points of our approach are presented and possible further research directions are addressed. At this stage of its development, IGAC remains an ambitious unifying information-geometric theoretical construct for the study of chaotic dynamics with several unsolved problems. However, based on our recent findings, we believe it already provides an interesting, innovative and potentially powerful way to study and understand the very important and challenging problems of classical and quantum chaos.
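The information-geometric machinery underlying such a program starts from the standard Fisher-Rao metric on a parametric family of distributions p(x|θ); the definition below is textbook material, not a formula quoted from the thesis itself.

```latex
g_{\mu\nu}(\theta) \;=\; \int p(x\mid\theta)\,
  \frac{\partial \ln p(x\mid\theta)}{\partial\theta^{\mu}}\,
  \frac{\partial \ln p(x\mid\theta)}{\partial\theta^{\nu}}\, dx
```

Chaoticity is then diagnosed through the spreading of geodesics on the resulting statistical manifold, which is the geometric analogue of the trajectory divergence probed by conventional Jacobi geometrodynamics.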
Ince, Robin A A; Giordano, Bruno L; Kayser, Christoph; Rousselet, Guillaume A; Gross, Joachim; Schyns, Philippe G
2017-03-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc.
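For readers wanting a concrete picture of the copula technique described above, here is a minimal Python sketch: each variable is rank-transformed to uniform margins, mapped through the inverse Gaussian CDF, and mutual information is then computed from the closed-form Gaussian expression. Function names are illustrative and this is not the authors' released toolbox; the resulting value is, strictly speaking, a lower bound on the true mutual information.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copula_normalize(x):
    """Rank-transform each column to uniform margins, then map through the
    inverse Gaussian CDF: rank order is preserved, margins become Gaussian."""
    x = np.atleast_2d(x.T).T          # ensure shape (n_samples, n_dims)
    u = rankdata(x, axis=0) / (x.shape[0] + 1.0)
    return norm.ppf(u)

def gaussian_mi(x, y):
    """Mutual information (bits) under a Gaussian assumption, via covariances."""
    def logdet(c):
        return np.linalg.slogdet(np.atleast_2d(c))[1]
    cxy = np.cov(np.hstack([x, y]).T)
    return 0.5 * (logdet(np.cov(x.T)) + logdet(np.cov(y.T)) - logdet(cxy)) / np.log(2)

def gcmi(x, y):
    """Gaussian-copula lower-bound estimate of the mutual information."""
    return gaussian_mi(copula_normalize(x), copula_normalize(y))

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 1))
y = x + 0.5 * rng.normal(size=(1000, 1))
print(gcmi(x, y))   # clearly positive here; near zero for independent inputs
```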
A Budyko-type Model for Human Water Consumption
NASA Astrophysics Data System (ADS)
Lei, X.; Zhao, J.; Wang, D.; Sivapalan, M.
2017-12-01
With the expansion of the human water footprint, water crisis is no longer only a conflict or competition for water between different economic sectors, but increasingly also between humans and the environment. In order to describe the emergent dynamics and patterns of the interaction, a theoretical framework that encapsulates the physical and societal controls impacting human water consumption is needed. In traditional hydrology, Budyko-type models are simple but efficient descriptions of the vegetation-mediated hydrologic cycle in catchments, i.e., the partitioning of mean annual precipitation into runoff and evapotranspiration. Plant water consumption plays a crucial role in the process. Hypothesized similarities between human-water and vegetation-water interactions, including water demand, constraints and system functioning, motivate a corresponding Budyko-type framework for human water consumption at the catchment scale. Analogous to the variables of Budyko-type models for the hydrologic cycle, water demand, water consumption, environmental water use and available water correspond to potential evaporation, actual evaporation, runoff and precipitation, respectively. Human water consumption data, economic and hydro-meteorological data for 51 human-impacted catchments and 10 major river basins in China are assembled to look for the existence of a Budyko-type relationship for human water consumption, and to seek explanations for the spread in the observed relationship. Guided by this, a Budyko-type analytical model is derived based on application of an optimality principle, that of maximum water benefit. The model derived has the same functional form and mathematical features as those that apply for the original Budyko model. Parameters of the new Budyko-type model for human consumption are linked to economic and social factors. The results of this paper suggest that the functioning of both social and hydrologic subsystems within catchment systems can be explored within a common conceptual framework, thus providing a unified socio-hydrologic basis for the study of coupled human-water systems. The exploration of the theoretical connections between the two subsystems pushes water system modeling from a problem-solving orientation to a puzzle-solving orientation.
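For orientation, one classical form of the Budyko curve is shown below, together with the substitution the abstract proposes. Treating the human-water analogue as sharing this exact functional form is an assumption; the abstract states only that the derived model has the same functional form and mathematical features as the original.

```latex
\frac{E}{P} \;=\; \left[\,\varphi\,\tanh\!\left(\tfrac{1}{\varphi}\right)\left(1-e^{-\varphi}\right)\right]^{1/2},
\qquad \varphi = \frac{E_p}{P}
```

Here E is actual evaporation, E_p potential evaporation, and P precipitation; the proposed correspondence reads E → water consumption, E_p → water demand, P → available water, and runoff → environmental water use.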
Beyond Containment and Deterrence: A Security Framework for Europe in the 21st Century
1990-04-02
decades of the 21st Century in Europe, and examines ... Poland, and parts of France and Russia, but it did not truly unify Germany. Bismarck unified only parts of Germany which he could constrain under... Europe, Central Europe, the Balkans, and the Soviet Union. Central Europe includes West Germany, East Germany, Austria, Czechoslovakia, Poland, and
Towards a Unified Description of the Electroweak Nuclear Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benhar, Omar; Lovato, Alessandro
2015-06-01
We briefly review the growing efforts to set up a unified framework for the description of neutrino interactions with atomic nuclei and nuclear matter, applicable in the broad kinematical region corresponding to neutrino energies ranging between a few MeV and a few GeV. The emerging picture suggests that the formalism of nuclear many-body theory (NMBT) can be exploited to obtain the neutrino-nucleus cross-sections needed for both the interpretation of oscillation signals and simulations of neutrino transport in compact stars.
A Theoretical Framework for Examining Geographical Variability in the Microphysical Mechanisms of Precipitation Development
1986-06-01
... concentration. Other key parameters include the degree of entrainment and stability of the environment.
In quest of a systematic framework for unifying and defining nanoscience
2009-01-01
This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a “central paradigm” (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a Nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles which are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core–shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) exhibited well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties). We propose this perspective as a modest first step toward more clearly defining synthetic nanochemistry as well as providing a systematic framework for unifying nanoscience. With further progress, one should anticipate the evolution of future nanoperiodic table(s) suitable for predicting important risk/benefit boundaries in the field of nanoscience. Electronic supplementary material The online version of this article (doi:10.1007/s11051-009-9632-z) contains supplementary material, which is available to authorized users. PMID:21170133
Carmena, Jose M.
2016-01-01
Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
Theory of Remote Image Formation
NASA Astrophysics Data System (ADS)
Blahut, Richard E.
2004-11-01
In many applications, images, such as ultrasonic or X-ray signals, are recorded and then analyzed with digital or optical processors in order to extract information. Such processing requires the development of algorithms of great precision and sophistication. This book presents a unified treatment of the mathematical methods that underpin the various algorithms used in remote image formation. The author begins with a review of transform and filter theory. He then discusses two- and three-dimensional Fourier transform theory, the ambiguity function, image construction and reconstruction, tomography, baseband surveillance systems, and passive systems (where the signal source might be an earthquake or a galaxy). Information-theoretic methods in image formation are also covered, as are phase errors and phase noise. Throughout the book, practical applications illustrate theoretical concepts, and there are many homework problems. The book is aimed at graduate students of electrical engineering and computer science, and practitioners in industry. Presents a unified treatment of the mathematical methods that underpin the algorithms used in remote image formation Illustrates theoretical concepts with reference to practical applications Provides insights into the design parameters of real systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.
Analysis of Implicit Uncertain Systems. Part 1: Theoretical Framework
1994-12-07
Paganini, Fernando; Doyle, John
This paper ... model and a number of constraints relevant to the analysis problem under consideration. In Part I of this paper we propose a theoretical framework which ...
Schalock, Robert L; Luckasson, Ruth; Tassé, Marc J; Verdugo, Miguel Angel
2018-04-01
This article describes a holistic theoretical framework that can be used to explain intellectual disability (ID) and organize relevant information into a usable roadmap to guide understanding and application. Developing the framework involved analyzing the four current perspectives on ID and synthesizing this information into a holistic theoretical framework. Practices consistent with the framework are described, and examples are provided of how multiple stakeholders can apply the framework. The article concludes with a discussion of the advantages and implications of a holistic theoretical approach to ID.
A unifying framework for quantifying the nature of animal interactions.
Potts, Jonathan R; Mokross, Karl; Lewis, Mark A
2014-07-06
Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
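To make the step selection idea concrete, here is a minimal Python sketch of a step selection function extended with a conspecific-avoidance term, in the spirit of the interaction models described above. The covariates, coefficients, and function name are hypothetical illustrations, not the authors' fitted model.

```python
import numpy as np

def step_weights(candidates, resource, conspecific_density, beta_r, beta_c):
    """Step selection function: weight candidate steps by attraction to
    resources and avoidance of areas recently used by neighbors.
    candidates: indices of reachable cells; resource, conspecific_density:
    per-cell covariate arrays; beta_r, beta_c: selection coefficients."""
    w = np.exp(beta_r * resource[candidates]
               - beta_c * conspecific_density[candidates])
    return w / w.sum()   # normalized probability of moving to each cell

resource = np.array([0.2, 0.9, 0.5, 0.1])
density  = np.array([0.0, 0.8, 0.1, 0.0])
print(step_weights(np.arange(4), resource, density, beta_r=2.0, beta_c=3.0))
```

Iterating such a kernel over many steps and animals yields the population-level space-use patterns, here territorial segregation, that the framework derives analytically.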
A unified framework for image retrieval using keyword and visual features.
Jing, Feng; Li, Mingling; Zhang, Hong-Jiang; Zhang, Bo
2005-07-01
In this paper, a unified image retrieval framework based on both keyword annotations and visual features is proposed. In this framework, a set of statistical models are built based on visual features of a small set of manually labeled images to represent semantic concepts and used to propagate keywords to other unlabeled images. These models are updated periodically when more images implicitly labeled by users become available through relevance feedback. In this sense, the keyword models serve the function of accumulation and memorization of knowledge learned from user-provided relevance feedback. Furthermore, two sets of effective and efficient similarity measures and relevance feedback schemes are proposed for query by keyword scenario and query by image example scenario, respectively. Keyword models are combined with visual features in these schemes. In particular, a new, entropy-based active learning strategy is introduced to improve the efficiency of relevance feedback for query by keyword. Furthermore, a new algorithm is proposed to estimate the keyword features of the search concept for query by image example. It is shown to be more appropriate than two existing relevance feedback algorithms. Experimental results demonstrate the effectiveness of the proposed framework.
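The entropy-based active learning strategy mentioned above can be sketched very compactly: the images whose current keyword-relevance probability is most uncertain are the most informative ones to present for feedback. The binary-entropy criterion below is a minimal illustration, assuming a per-image relevance probability produced by the keyword models; it is not the paper's exact formulation.

```python
import numpy as np

def entropy_sampling(probs, k):
    """Select the k images whose predicted keyword-relevance probabilities
    are most uncertain (maximum binary entropy) for relevance feedback."""
    p = np.clip(probs, 1e-12, 1 - 1e-12)                  # avoid log(0)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))       # entropy per image
    return np.argsort(-h)[:k]

probs = np.array([0.95, 0.52, 0.10, 0.48, 0.80])
print(entropy_sampling(probs, 2))   # picks the images with p closest to 0.5
```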
Chemical library subset selection algorithms: a unified derivation using spatial statistics.
Hamprecht, Fred A; Thiel, Walter; van Gunsteren, Wilfred F
2002-01-01
If similar compounds have similar activity, rational subset selection becomes superior to random selection in screening for pharmacological lead discovery programs. Traditional approaches to this experimental design problem fall into two classes: (i) a linear or quadratic response function is assumed; (ii) some space-filling criterion is optimized. The assumptions underlying the first approach are clear but not always defendable; the second approach yields more intuitive designs but lacks a clear theoretical foundation. We model activity in a bioassay as the realization of a stochastic process and use the best linear unbiased estimator to construct spatial sampling designs that optimize the integrated mean square prediction error, the maximum mean square prediction error, or the entropy. We argue that our approach constitutes a unifying framework encompassing most proposed techniques as limiting cases and sheds light on their underlying assumptions. In particular, vector quantization is obtained, in dimensions up to eight, in the limiting case of very smooth response surfaces for the integrated mean square error criterion. Closest packing is obtained for very rough surfaces under the integrated mean square error and entropy criteria. We suggest using either the integrated mean square prediction error or the entropy as optimization criteria rather than approximations thereof, and we propose a scheme for direct iterative minimization of the integrated mean square prediction error. Finally, we discuss how the quality of chemical descriptors manifests itself and clarify the assumptions underlying the selection of diverse or representative subsets.
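As a rough illustration of the entropy criterion discussed above, the sketch below greedily selects a compound subset maximizing the log-determinant of a Gaussian covariance over descriptor space, which is the standard maximum-entropy design under a Gaussian process prior. The RBF kernel, its length scale, and the greedy strategy are simplifying assumptions; the paper works with general covariance models and direct iterative minimization rather than this greedy shortcut.

```python
import numpy as np

def rbf_cov(X, length_scale=1.0):
    """Squared-exponential covariance over descriptor vectors X of shape (n, d)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def greedy_entropy_design(X, k, noise=1e-6):
    """Greedily pick k compounds maximizing the design entropy, i.e. the
    log-determinant of the covariance submatrix of the selected points."""
    K = rbf_cov(X) + noise * np.eye(len(X))   # jitter keeps K positive definite
    chosen = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(len(X)):
            if i in chosen:
                continue
            idx = chosen + [i]
            gain = np.linalg.slogdet(K[np.ix_(idx, idx)])[1]
            if gain > best_gain:
                best, best_gain = i, gain
        chosen.append(best)
    return chosen

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))          # 50 compounds, 3 descriptors each
print(greedy_entropy_design(X, 5))    # indices of a diverse 5-compound subset
```

Maximizing the log-determinant pushes the selected points apart in descriptor space, which is why entropy designs resemble the space-filling and closest-packing limits the abstract mentions.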
Describing and understanding behavioral responses to multiple stressors and multiple stimuli.
Hale, Robin; Piggott, Jeremy J; Swearer, Stephen E
2017-01-01
Understanding the effects of environmental change on natural ecosystems is a major challenge, particularly when multiple stressors interact to produce unexpected "ecological surprises" in the form of complex, nonadditive effects that can amplify or reduce their individual effects. Animals often respond behaviorally to environmental change, and multiple stressors can have both population-level and community-level effects. However, the individual, not combined, effects of stressors on animal behavior are commonly studied. There is a need to understand how animals respond to the more complex combinations of stressors that occur in nature, which requires a systematic and rigorous approach to quantify the various potential behavioral responses to the independent and interactive effects of stressors. We illustrate a robust, systematic approach for understanding behavioral responses to multiple stressors based on integrating schemes used to quantitatively classify interactions in multiple-stressor research and to qualitatively view interactions between multiple stimuli in behavioral experiments. We introduce and unify the two frameworks, highlighting their conceptual and methodological similarities, and use four case studies to demonstrate how this unification could improve our interpretation of interactions in behavioral experiments and guide efforts to manage the effects of multiple stressors. Our unified approach: (1) provides behavioral ecologists with a more rigorous and systematic way to quantify how animals respond to interactions between multiple stimuli, an important theoretical advance, (2) helps us better understand how animals behave when they encounter multiple, potentially interacting stressors, and (3) contributes more generally to the understanding of "ecological surprises" in multiple stressors research.
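The quantitative classification scheme referenced above is commonly built on an additive null model; the sketch below illustrates that idea in Python. The function name, tolerance threshold, and the simple amplification test are illustrative assumptions, not the authors' published scheme, which distinguishes more interaction classes.

```python
def classify_interaction(ctrl, a, b, ab, tol=0.1):
    """Classify a two-stressor interaction against an additive null model.
    ctrl, a, b, ab: mean responses for control, stressor A alone,
    stressor B alone, and both stressors combined."""
    predicted = ctrl + (a - ctrl) + (b - ctrl)   # additive expectation
    deviation = ab - predicted
    if abs(deviation) <= tol * abs(predicted):   # within tolerance of additive
        return "additive"
    # combined effect larger than the additive prediction -> synergistic
    return "synergistic" if abs(ab - ctrl) > abs(predicted - ctrl) else "antagonistic"

# Both stressors depress the response; together the decline is amplified:
print(classify_interaction(ctrl=10, a=8, b=7, ab=3))   # -> "synergistic"
```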
Evolving the future: Toward a science of intentional change
Wilson, David Sloan; Hayes, Steven C.; Biglan, Anthony; Embry, Dennis D.
2015-01-01
Humans possess great capacity for behavioral and cultural change, but our ability to manage change is still limited. This article has two major objectives: first, to sketch a basic science of intentional change centered on evolution; second, to provide examples of intentional behavioral and cultural change from the applied behavioral sciences, which are largely unknown to the basic sciences community. All species have evolved mechanisms of phenotypic plasticity that enable them to respond adaptively to their environments. Some mechanisms of phenotypic plasticity count as evolutionary processes in their own right. The human capacity for symbolic thought provides an inheritance system having the same kind of combinatorial diversity as does genetic recombination and antibody formation. Taking these propositions seriously allows an integration of major traditions within the basic behavioral sciences, such as behaviorism, social constructivism, social psychology, cognitive psychology, and evolutionary psychology, which are often isolated and even conceptualized as opposed to one another. The applied behavioral sciences include well-validated examples of successfully managing behavioral and cultural change at scales ranging from individuals to small groups to large populations. However, these examples are largely unknown beyond their disciplinary boundaries, for lack of a unifying theoretical framework. Viewed from an evolutionary perspective, they are examples of managing evolved mechanisms of phenotypic plasticity, including open-ended processes of variation and selection. Once the many branches of the basic and applied behavioral sciences become conceptually unified, we are closer to a science of intentional change than one might think. PMID:24826907
40 CFR 300.105 - General organization concepts.
Code of Federal Regulations, 2010 CFR
2010-07-01
... capabilities. (b) Three fundamental kinds of activities are performed pursuant to the NCP: (1) Preparedness....205(c). (d) The basic framework for the response management structure is a system (e.g., a unified...
NASA Astrophysics Data System (ADS)
Wu, Min
2016-07-01
The development of anti-fibrotic therapies has recently become increasingly urgent across a range of diseases, such as pulmonary, renal and liver fibrosis [1,2], as well as malignant tumor growth [3]. As reviewed by Ben Amar and Bianca [4], various theoretical, experimental and in-silico models have been developed to understand the fibrosis process, and their implications for therapeutic strategies have been demonstrated frequently (e.g., [5-7]). In [4], these models are analyzed and sorted according to their approaches, and at the end of [4] a unified multi-scale approach is proposed for understanding fibrosis. Since one of the major purposes of the extensive modeling of fibrosis is to shed light on therapeutic strategies, theoretical, experimental and in-silico studies of anti-fibrosis therapies should be pursued more intensively.
Final Technical Report for "Reducing tropical precipitation biases in CESM"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, Vincent
In state-of-the-art climate models, each cloud type is treated using its own separate cloud parameterization and its own separate microphysics parameterization. This use of separate schemes for separate cloud regimes is undesirable because it is theoretically unfounded, it hampers interpretation of results, and it leads to the temptation to overtune parameters. In this grant, we have created a climate model that contains a unified cloud parameterization (“CLUBB”) and a unified microphysics parameterization (“MG2”). In this model, all cloud types --- including marine stratocumulus, shallow cumulus, and deep cumulus --- are represented with a single equation set. This model improves the representation of convection in the Tropics. The model has been compared with ARM observations. The chief benefit of the project is to provide a climate model that is based on a more theoretically rigorous formulation.
A unified and efficient framework for court-net sports video analysis using 3D camera modeling
NASA Astrophysics Data System (ADS)
Han, Jungong; de With, Peter H. N.
2007-01-01
The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable for more than one sports type to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes, which are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players). The algorithm can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process, and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
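Since the abstract names a semantic similarity algorithm over a unified ontology without spelling it out, the sketch below shows one standard tree-based measure (a Wu-Palmer-style similarity) that could play that role; treating this as the paper's actual algorithm would be an assumption.

```python
def wu_palmer(path_a, path_b):
    """Wu-Palmer-style similarity between two concepts in a tree ontology,
    each given as its path of node labels from the root; the score is
    2 * depth(least common subsumer) / (depth(a) + depth(b))."""
    depth_lcs = 0
    for a, b in zip(path_a, path_b):
        if a != b:
            break
        depth_lcs += 1
    return 2.0 * depth_lcs / (len(path_a) + len(path_b))

# Matching a requested service against an advertised one (hypothetical paths):
req = ("service", "storage", "object-storage")
adv = ("service", "storage", "block-storage")
print(wu_palmer(req, adv))   # 2*2 / (3+3) = 0.67: shared 'service/storage' prefix
```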
Use of theoretical and conceptual frameworks in qualitative research.
Green, Helen Elise
2014-07-01
To debate the definition and use of theoretical and conceptual frameworks in qualitative research. There is a paucity of literature to help the novice researcher to understand what theoretical and conceptual frameworks are and how they should be used. This paper acknowledges the interchangeable usage of these terms and researchers' confusion about the differences between the two. It discusses how researchers have used theoretical and conceptual frameworks and the notion of conceptual models. Detail is given about how one researcher incorporated a conceptual framework throughout a research project, the purpose for doing so and how this led to a resultant conceptual model. Concepts from Abbott (1988) and Witz (1992) were used to provide a framework for research involving two case study sites. The framework was used to determine research questions and give direction to interviews and discussions to focus the research. Some research methods do not overtly use a theoretical framework or conceptual framework in their design, but this is implicit and underpins the method design, for example in grounded theory. Other qualitative methods use one or the other to frame the design of a research project or to explain the outcomes. An example is given of how a conceptual framework was used throughout a research project. Theoretical and conceptual frameworks are terms that are regularly used in research but rarely explained. Textbooks should discuss what they are and how they can be used, so novice researchers understand how they can help with research design. Theoretical and conceptual frameworks need to be more clearly understood by researchers and correct terminology used to ensure clarity for novice researchers.
Heffernan, Eithne; Coulson, Neil S.; Henshaw, Helen; Barry, Johanna G.; Ferguson, Melanie A
2017-01-01
Objective This study explored the psychosocial experiences of adults with hearing loss using the self-regulatory model as a theoretical framework. The primary components of the model, namely cognitive representations, emotional representations, and coping responses, were examined. Design Individual semi-structured interviews were conducted. The data were analysed using an established thematic analysis procedure. Study sample Twenty-five adults with mild-moderate hearing loss from the UK and nine hearing healthcare professionals from the UK, USA, and Canada were recruited via maximum variation sampling. Results Cognitive representations: Most participants described their hearing loss as having negative connotations and consequences, although they were not particularly concerned about the progression or controllability/curability of the condition. Opinions differed regarding the benefits of understanding the causes of one’s hearing loss in detail. Emotional representations: negative emotions dominated, although some experienced positive emotions or muted emotions. Coping responses: engaged coping (e.g. hearing aids, communication tactics) and disengaged coping (e.g. withdrawal from situations, withdrawal within situations): both had perceived advantages and disadvantages. Conclusions This novel application of the self-regulatory model demonstrates that it can be used to capture the key psychosocial experiences (i.e. perceptions, emotions, and coping responses) of adults with mild-moderate hearing loss within a single, unifying framework. PMID:26754550
Optimality in mono- and multisensory map formation.
Bürck, Moritz; Friedel, Paul; Sichert, Andreas B; Vossen, Christine; van Hemmen, J Leo
2010-07-01
In the struggle for survival in a complex and dynamic environment, nature has developed a multitude of sophisticated sensory systems. In order to exploit the information provided by these sensory systems, higher vertebrates reconstruct the spatio-temporal environment from each of the sensory systems they have at their disposal. That is, for each modality the animal computes a neuronal representation of the outside world, a monosensory neuronal map. Here we present a universal framework that allows one to calculate the specific layout of the involved neuronal network by means of a general mathematical principle, viz., stochastic optimality. In order to illustrate the use of this theoretical framework, we provide a step-by-step tutorial of how to apply our model. In so doing, we present a spatial and a temporal example of optimal stimulus reconstruction which underline the advantages of our approach. That is, given a known physical signal transmission and rudimentary knowledge of the detection process, our approach allows one to estimate the possible performance and to predict neuronal properties of biological sensory systems. Finally, information from different sensory modalities has to be integrated so as to gain a unified perception of reality for further processing, e.g., for distinct motor commands. We briefly discuss concepts of multimodal interaction and how a multimodal space can evolve by alignment of monosensory maps.
A unified view on weakly correlated recurrent networks
Grytskyy, Dmytro; Tetzlaff, Tom; Diesmann, Markus; Helias, Moritz
2013-01-01
The diversity of neuron models used in contemporary theoretical neuroscience to investigate specific properties of covariances in the spiking activity raises the question how these models relate to each other. In particular it is hard to distinguish between generic properties of covariances and peculiarities due to the abstracted model. Here we present a unified view on pairwise covariances in recurrent networks in the irregular regime. We consider the binary neuron model, the leaky integrate-and-fire (LIF) model, and the Hawkes process. We show that linear approximation maps each of these models to either of two classes of linear rate models (LRM), including the Ornstein–Uhlenbeck process (OUP) as a special case. The distinction between both classes is the location of additive noise in the rate dynamics, which is located on the output side for spiking models and on the input side for the binary model. Both classes allow closed form solutions for the covariance. For output noise it separates into an echo term and a term due to correlated input. The unified framework enables us to transfer results between models. For example, we generalize the binary model and the Hawkes process to the situation with synaptic conduction delays and simplify derivations for established results. Our approach is applicable to general network structures and suitable for the calculation of population averages. The derived averages are exact for fixed out-degree network architectures and approximate for fixed in-degree. We demonstrate how taking into account fluctuations in the linearization procedure increases the accuracy of the effective theory and we explain the class dependent differences between covariances in the time and the frequency domain. Finally we show that the oscillatory instability emerging in networks of LIF models with delayed inhibitory feedback is a model-invariant feature: the same structure of poles in the complex frequency plane determines the population power spectra. PMID:24151463
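Since the Ornstein-Uhlenbeck process appears here as the common special case, its textbook stationary autocovariance is worth recalling (a standard result, not a formula quoted from the paper):

```latex
dx_t = -\frac{x_t}{\tau}\,dt + \sigma\,dW_t
\quad\Longrightarrow\quad
c(\Delta t) \;=\; \langle x_t\, x_{t+\Delta t}\rangle \;=\; \frac{\sigma^{2}\tau}{2}\; e^{-\lvert\Delta t\rvert/\tau}
```

This exponential decay of covariance with lag is the template that both classes of linear rate models inherit; the classes differ in whether the additive noise enters on the input or the output side of the rate dynamics.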
Pinto, Isabela Cardoso de Matos; Teixeira, Carmen Fontes
2011-09-01
The construction of Brazil's Unified National Health System (SUS) has raised a set of challenges for the health sector's administrators and personnel, including issues of work management and continuing education for health workers, in view of the financial, political, and organizational constraints in the process of changing the healthcare model. The current study aimed to analyze the process of formulating the Health Work and Education Management Policy by the Bahia State Health Department. The public policy cycle was used as the theoretical framework. The study analyzed data from institutional documents and records of participant observation by one of the authors. The results include mapping the governmental and nongovernmental stakeholders that participated in the process. The analysis highlights a series of problems in the SUS in Bahia related to work management and health workers' profile, taken as the point of departure for priority-setting in the State Strategic Agenda and Health Plan for 2008-2011.
Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity.
Costa, Rui Ponte; Padamsey, Zahid; D'Amour, James A; Emptage, Nigel J; Froemke, Robert C; Vogels, Tim P
2017-09-27
Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression-pre- or postsynaptic-is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
On the Link Between Kolmogorov Microscales and Friction in Wall-Bounded Flow of Viscoplastic Fluids
NASA Astrophysics Data System (ADS)
Ramos, Fabio; Anbarlooei, Hamid; Cruz, Daniel; Silva Freire, Atila; Santos, Cecilia M.
2017-11-01
Most discussions in the literature on the friction coefficient of turbulent flows of fluids with complex rheology are empirical. As a rule, theoretical frameworks are not available even for some relatively simple constitutive models. In this work, we present a new family of formulas for the evaluation of the friction coefficient of turbulent flows of a large family of viscoplastic fluids. The developments combine a unified analysis for the description of the Kolmogorov micro-scales and the phenomenological turbulence model of Gioia and Chakraborty. The resulting Blasius-type friction equation has only Blasius' constant as a parameter, and tests against experimental data show excellent agreement over a significant range of Hedstrom and Reynolds numbers. The limits of the proposed model are also discussed. We also comment on the role of the new formula as a possible benchmark test for the convergence of DNS simulations of viscoplastic flows. The friction formula also provides limits for the Maximum Drag Reduction (MDR) for viscoplastic flows, which resembles the MDR asymptote for viscoelastic flows.
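For context, the Newtonian limiting case of such a Blasius-type law is the classical correlation below; the viscoplastic generalization described in the abstract modifies this through the Hedstrom number, in a form the abstract does not spell out.

```latex
f \;=\; 0.316\,\mathrm{Re}^{-1/4}
```

This is the familiar Blasius friction factor for turbulent pipe flow, valid roughly for 4×10³ < Re < 10⁵, and the single constant 0.316 is the "Blasius' constant" the abstract refers to.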
Backpropagation and ordered derivatives in the time scales calculus.
Seiffertt, John; Wunsch, Donald C
2010-08-01
Backpropagation is the most widely used neural network learning technique. It is based on the mathematical notion of an ordered derivative. In this paper, we present a formulation of ordered derivatives and the backpropagation training algorithm using the important emerging area of mathematics known as the time scales calculus. This calculus, with its potential for application to a wide variety of inter-disciplinary problems, is becoming a key area of mathematics. It is capable of unifying continuous and discrete analysis within one coherent theoretical framework. Using this calculus, we present here a generalization of backpropagation which is appropriate for cases beyond the specifically continuous or discrete. We develop a new multivariate chain rule of this calculus, define ordered derivatives on time scales, prove a key theorem about them, and derive the backpropagation weight update equations for a feedforward multilayer neural network architecture. By drawing together the time scales calculus and the area of neural network learning, we present the first connection of two major fields of research.
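The unification rests on the delta (Hilger) derivative of the time scales calculus, whose standard definition is textbook material rather than something specific to this paper:

```latex
f^{\Delta}(t) \;=\;
\begin{cases}
  \dfrac{f(\sigma(t)) - f(t)}{\mu(t)}, & \mu(t) > 0 \quad (\text{right-scattered } t),\\[1.5ex]
  f'(t), & \mu(t) = 0 \quad (\text{right-dense } t),
\end{cases}
```

where σ(t) is the forward jump operator and μ(t) = σ(t) − t is the graininess. On the time scale ℝ this reduces to the ordinary derivative, and on ℤ to the forward difference, which is what lets a single set of ordered-derivative and weight-update equations cover both continuous-time and discrete-time backpropagation.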
NASA Astrophysics Data System (ADS)
Sivasundaram, Seenith
2016-07-01
The review paper [1] is devoted to the survey of different structures that have been developed for the modeling and analysis of various types of fibrosis. Biomathematics, bioinformatics, biomechanics and biophysics modeling have been treated by means of a brief description of the different models developed. The review is impressive and clearly written, addressed to a reader interested not only in the theoretical modeling but also in the biological description. The models have been described without resorting to technical statements or mathematical equations, thus allowing the non-specialist reader to understand which framework is more suitable at a certain observation scale. The review [1] concludes with the possibility of developing a multiscale approach that also considers the definition of a therapeutic strategy for pathological fibrosis. In particular, the control and optimization of therapeutic action is an important issue, and this article aims at commenting on this topic.
Face-space: A unifying concept in face recognition research.
Valentine, Tim; Lewis, Michael B; Hills, Peter J
2016-10-01
The concept of a multidimensional psychological space, in which faces can be represented according to their perceived properties, is fundamental to the modern theorist in face processing. Yet the idea was not clearly expressed until 1991. The background that led to the development of face-space is explained, and its continuing influence on theories of face processing is discussed. Research that has explored the properties of the face-space and sought to understand caricature, including facial adaptation paradigms, is reviewed. Face-space as a theoretical framework for understanding the effect of ethnicity and the development of face recognition is evaluated. Finally, two applications of face-space in the forensic setting are discussed. From initially being presented as a model to explain distinctiveness, inversion, and the effect of ethnicity, face-space has become a central pillar in many aspects of face processing. It is currently being developed to help us understand adaptation effects with faces. While being in principle a simple concept, face-space has shaped, and continues to shape, our understanding of face perception.
Leal, Cristian Oliveira Benevides Sanches; Teixeira, Carmen Fontes de Souza
2017-10-01
This is a theoretical essay on the development of the concept of solidarity, a word used in the regulatory framework and in political proposals to reorient the Brazilian Unified Health System (SUS). The methodology consisted of mapping authors in Durkheim's tradition who address aspects of human action related to this theme, linking them to his successors, such as Marcel Mauss and authors from the "anti-utilitarian" movement in the social sciences. Solidarity is one way of expressing a "gift" and appears as a multidimensional action in which duty and freedom, instrumental interest and disinterest, interpose and interlace. The planning and execution of sanitary surveillance (VISA) actions requires an understanding of organizational forms and of solidary relationship management among the agents involved in health risk control, transcending the strongly normative character of prevailing supervision practices. The development of associative actions involving sanitary surveillance professionals, economic agents and consumers is suggested, with the aim of sharing responsibility for controlling the health risks of products, services and environments subject to sanitary surveillance.
The consciousness state space (CSS)—a unifying model for consciousness and self
Berkovich-Ohana, Aviva; Glicksohn, Joseph
2014-01-01
Every experience, those we are aware of and those we are not, is embedded in a subjective timeline, is tinged with emotion, and inevitably evokes a certain sense of self. Here, we present a phenomenological model for consciousness and selfhood which relates time, awareness, and emotion within one framework. The consciousness state space (CSS) model is a theoretical one. It relies on a broad range of literature, hence has high explanatory and integrative strength, and helps in visualizing the relationship between different aspects of experience. Briefly, it is suggested that all phenomenological states fall into two categories of consciousness, core and extended (CC and EC, respectively). CC supports minimal selfhood that is short of temporal extension, its scope being the here and now. EC supports narrative selfhood, which involves personal identity and continuity across time, as well as memory, imagination and conceptual thought. The CSS is a phenomenological space, created by three dimensions: time, awareness and emotion. Each of the three dimensions is shown to have a dual phenomenological composition, falling within CC and EC. The neural spaces supporting each of these dimensions, as well as CC and EC, are laid out based on the neuroscientific literature. The CSS dynamics include two simultaneous trajectories, one in CC and one in EC, typically antagonistic in normal experiences. However, this characteristic behavior is altered in states in which a person experiences an altered sense of self. Two examples are laid out, flow and meditation. The CSS model creates a broad theoretical framework with explanatory and unificatory power. It constructs a detailed map of the consciousness and selfhood phenomenology, which offers constraints for the science of consciousness. We conclude by outlining several testable predictions raised by the CSS model. PMID:24808870
LIFE CYCLE ENGINEERING GUIDELINES
This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and useable as possible, a unifying LCE framework is presented. Subsequent topics ...
Value of Flexibility - Phase 1
2010-09-25
weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to... research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically sound.
Modern Fysics Phallacies: The Best Way Not to Unify Physics
NASA Astrophysics Data System (ADS)
Beichler, James E.
Too many physicists believe the `phallacy' that the quantum is more fundamental than relativity without any valid supporting evidence, so the earliest attempts to unify physics based on the continuity of relativity have been all but abandoned. This belief is probably due to the wealth of pro-quantum propaganda and general `phallacies in fysics' that were spread during the second quarter of the twentieth century, although serious `phallacies' exist throughout physics on both sides of the debate. Yet both approaches are basically flawed because both relativity and the quantum theory are incomplete and grossly misunderstood as they now stand. Had either side of the quantum versus relativity controversy sought common ground between the two worldviews, total unification would have been accomplished long ago. The point is, literally, that the discrete quantum, continuous relativity, basic physical geometry, theoretical mathematics and classical physics all share one common characteristic that has never been fully explored or explained - a paradoxical duality between a dimensionless point (discrete) and an extended length (continuity) in any dimension - and if the problem of unification is approached from an understanding of how this paradox relates to each paradigm, all of physics and indeed all of science could be unified under a single new theoretical paradigm.
Unified approach to redshift in cosmological/black hole spacetimes and synchronous frame
NASA Astrophysics Data System (ADS)
Toporensky, A. V.; Zaslavskii, O. B.; Popov, S. B.
2018-01-01
Usually, interpretation of redshift in static spacetimes (for example, near black holes) is opposed to that in cosmology. In this methodological note, we show that both explanations are unified in a natural picture. This is achieved if, considering the static spacetime, one (i) makes a transition to a synchronous frame, and (ii) returns to the original frame by means of a local Lorentz boost. To reach our goal, we consider a rather general class of spherically symmetric spacetimes. In doing so, we construct frames that generalize the well-known Lemaitre and Painlevé-Gullstrand ones and elucidate the relation between them. This helps us to understand, in a unifying approach, how gravitation reveals itself in different branches of general relativity. This framework can be useful for general relativity university courses.
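As a reminder of the two limiting formulas such a note unifies (standard results, not the authors' derivation), the frequency shift follows from projecting the photon wave vector k onto the observer's 4-velocity u:

```latex
% General redshift, and its static and cosmological (FRW) limits:
1 + z \;=\; \frac{(k_\mu u^\mu)_{\mathrm{em}}}{(k_\mu u^\mu)_{\mathrm{obs}}}
\;\;\Longrightarrow\;\;
\underbrace{1 + z = \sqrt{\frac{g_{tt}(r_{\mathrm{obs}})}{g_{tt}(r_{\mathrm{em}})}}}_{\text{static spacetime}}
\qquad
\underbrace{1 + z = \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{em}})}}_{\text{FRW cosmology}}
```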
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
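The plugin-registry pattern described above can be sketched in a few lines. SCIFIO itself is a Java/SciJava library, so the Python sketch below only illustrates the idea; all names here are hypothetical, not SCIFIO's real API.

```python
# Hypothetical sketch of a modular image-format registry with dispatch,
# in the spirit of SCIFIO's design (not its actual API).
class Format:
    def can_read(self, path: str) -> bool:
        raise NotImplementedError
    def read(self, path: str):
        raise NotImplementedError

class TiffFormat(Format):
    def can_read(self, path):
        return path.endswith((".tif", ".tiff"))
    def read(self, path):
        return f"pixels decoded from {path}"  # placeholder for real decoding

REGISTRY = [TiffFormat()]  # the real design discovers plugins dynamically

def open_image(path):
    # Dispatch to the first registered format that claims the file.
    for fmt in REGISTRY:
        if fmt.can_read(path):
            return fmt.read(path)
    raise ValueError(f"no registered format for {path}")
```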
Zenni, Rafael Dudeque; Dickie, Ian A; Wingfield, Michael J; Hirsch, Heidi; Crous, Casparus J; Meyerson, Laura A; Burgess, Treena I; Zimmermann, Thalita G; Klock, Metha M; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J
2016-12-30
Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics, and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand, and manage biological invasions. Published by Oxford University Press on behalf of the Annals of Botany Company.
Dickie, Ian A.; Wingfield, Michael J.; Hirsch, Heidi; Crous, Casparus J.; Meyerson, Laura A.; Burgess, Treena I.; Zimmermann, Thalita G.; Klock, Metha M.; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J.
2017-01-01
Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand and manage biological invasions. PMID:28039118
NASA Astrophysics Data System (ADS)
Pathirana, A.; Radhakrishnan, M.; Zevenbergen, C.; Quan, N. H.
2016-12-01
The need to address the shortcomings of urban systems - the 'adaptation deficit' - and shortcomings in response to climate change - the 'adaptation gap' - are both major challenges in maintaining the livability and sustainability of cities. However, adaptation actions defined in terms of type I (addressing adaptation deficits) and type II (addressing adaptation gaps) often compete and conflict with each other in the secondary cities of the global south. Extending the concept of the environmental Kuznets curve, this paper argues that a unified framework calling for synergistic action on type I and type II adaptation is essential if these cities are to maintain their livability, sustainability and resilience in the face of extreme rates of urbanization and the rapid onset of climate change. The proposed framework has been demonstrated in Can Tho, Vietnam, where there are significant adaptation deficits due to rapid urbanization and adaptation gaps due to climate change and socio-economic changes. The analysis in Can Tho reveals a lack of integration between type I and type II measures that could be overcome by closer integration among the various stakeholders in planning, prioritizing and implementing adaptation measures.
Unified framework for automated iris segmentation using distantly acquired face images.
Tan, Chun-Wei; Kumar, Ajay
2012-09-01
Remote human identification using iris biometrics has important civilian and surveillance applications, and its success requires the development of a robust segmentation algorithm to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired under near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye-region pixels into iris or non-iris regions. Face and eye detection modules have been incorporated in the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop robust post-processing operations to effectively mitigate noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant reduction in average segmentation errors over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.
Trajectory optimization for lunar soft landing with complex constraints
NASA Astrophysics Data System (ADS)
Chu, Huiping; Ma, Lin; Wang, Kexin; Shao, Zhijiang; Song, Zhengyu
2017-11-01
A unified trajectory optimization framework with initialization strategies is proposed in this paper for lunar soft landing for various missions with specific requirements. Two main missions of interest are Apollo-like Landing from low lunar orbit and Vertical Takeoff Vertical Landing (a promising mobility method) on the lunar surface. The trajectory optimization is characterized by difficulties arising from discontinuous thrust, multi-phase connections, jumps in attitude angle, and obstacle avoidance. Here the R-function is applied to deal with the discontinuities of thrust, checkpoint constraints are introduced to connect multiple landing phases, the attitude angular rate is designed to avoid radical changes, and safeguards are imposed to avoid collision with obstacles. The resulting dynamic problems generally have complex constraints. The unified framework, based on the Gauss Pseudospectral Method (GPM) and a Nonlinear Programming (NLP) solver, is designed to solve the problems efficiently. Advanced initialization strategies are developed to enhance both convergence and computational efficiency. Numerical results demonstrate the adaptability of the framework for various landing missions, and its performance in successfully solving difficult dynamic problems.
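For orientation, the standard Gauss pseudospectral transcription (a generic textbook form, not the paper's specific equations) converts the continuous optimal control problem into a finite-dimensional NLP by collocating the dynamics at Legendre-Gauss points:

```latex
% Continuous Bolza problem (schematic):
%   min  J = \Phi(x(t_f)) + \int_{t_0}^{t_f} g(x,u,t)\,dt
%   s.t. \dot{x} = f(x,u,t),  C(x,u,t) \le 0.
% GPM: approximate x(\tau) by Lagrange polynomials at Legendre-Gauss points
% \tau_k and collocate the dynamics via the differentiation matrix D:
\sum_{i=0}^{N} D_{ki}\, X_i \;=\; \frac{t_f - t_0}{2}\, f\big(X_k, U_k, \tau_k\big),
\qquad k = 1, \dots, N .
```

Together with the discretized cost and path constraints, these collocation equations form the finite-dimensional NLP passed to the solver.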
Rater cognition: review and integration of research findings.
Gauthier, Geneviève; St-Onge, Christina; Tavares, Walter
2016-05-01
Given the complexity of competency frameworks, associated skills and abilities, and contexts in which they are to be assessed in competency-based education (CBE), there is an increased reliance on rater judgements when considering trainee performance. This increased dependence on rater-based assessment has led to the emergence of rater cognition as a field of research in health professions education. The topic, however, is often conceptualised and ultimately investigated using many different perspectives and theoretical frameworks. Critically analysing how researchers think about, study and discuss rater cognition or the judgement processes in assessment frameworks may provide meaningful and efficient directions in how the field continues to explore the topic. We conducted a critical and integrative review of the literature to explore common conceptualisations and unified terminology associated with rater cognition research. We identified 1045 articles on rater-based assessment in health professions education using Scopus, Medline and ERIC, and 78 articles were included in our review. We propose a three-phase framework of observation, processing and integration. We situate nine specific mechanisms and sub-mechanisms described across the literature within these phases: (i) generating automatic impressions about the person; (ii) formulating high-level inferences; (iii) focusing on different dimensions of competencies; (iv) categorising through well-developed schemata based on (a) a personal concept of competence, (b) comparison with various exemplars and (c) task and context specificity; (v) weighting and synthesising information differently; (vi) producing narrative judgements; and (vii) translating narrative judgements into scales. Our review has allowed us to identify common underlying conceptualisations of observed rater mechanisms and subsequently propose a comprehensive, although complex, framework for the dynamic and contextual nature of the rating process. This framework could help bridge the gap between researchers adopting different perspectives when studying rater cognition and enable the interpretation of contradictory findings of raters' performance by determining which mechanism is enabled or disabled in any given context. © 2016 John Wiley & Sons Ltd.
Cane, James; O'Connor, Denise; Michie, Susan
2012-04-24
An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
A Unified Framework for Periodic, On-Demand, and User-Specified Software Information
NASA Technical Reports Server (NTRS)
Kolano, Paul Z.
2004-01-01
Although grid computing can increase the number of resources available to a user, not all resources on the grid may have a software environment suitable for running a given application. To provide users with the necessary assistance for selecting resources with compatible software environments and/or for automatically establishing such environments, it is necessary to have an accurate source of information about the software installed across the grid. This paper presents a new OGSI-compliant software information service that has been implemented as part of NASA's Information Power Grid project. This service is built on top of a general framework for reconciling information from periodic, on-demand, and user-specified sources. Information is retrieved using standard XPath queries over a single unified namespace, independent of the information's source. Two consumers of the provided software information, the IPG Resource Broker and the IPG Naturalization Service, are briefly described.
Semantically enabled image similarity search
NASA Astrophysics Data System (ADS)
Casterline, May V.; Emerick, Timothy; Sadeghi, Kolia; Gosse, C. A.; Bartlett, Brent; Casey, Jason
2015-05-01
Georeferenced data of various modalities are increasingly available for intelligence and commercial use; however, effectively exploiting these sources demands a unified data space capable of capturing the unique contribution of each input. This work presents a suite of software tools for representing geospatial vector data and overhead imagery in a shared high-dimensional vector or "embedding" space that supports fused learning and similarity search across dissimilar modalities. While the approach is suitable for fusing arbitrary input types, including free text, the present work exploits the obvious but computationally difficult relationship between GIS and overhead imagery. GIS provides temporally-smoothed but information-limited content, while overhead imagery provides an information-rich but temporally-limited perspective. This processing framework includes some important extensions of concepts in the literature but, more critically, presents a means to accomplish them as a unified framework at scale on commodity cloud architectures.
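Once heterogeneous inputs live in a shared embedding space, similarity search reduces to nearest-neighbor retrieval. A minimal sketch, assuming the embeddings have already been produced by some fused encoder (the encoder itself is outside the scope of this sketch):

```python
import numpy as np

def cosine_search(query_vec, index_vecs, top_k=5):
    """Return indices and scores of the top_k most similar index vectors."""
    q = query_vec / np.linalg.norm(query_vec)
    X = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    scores = X @ q                       # cosine similarity against the index
    top = np.argsort(-scores)[:top_k]    # best matches first
    return top, scores[top]

# e.g., retrieve the GIS records closest to an image-patch embedding:
# idx, sim = cosine_search(image_embedding, gis_embeddings)
```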
Motor symptoms in Parkinson's disease: A unified framework.
Moustafa, Ahmed A; Chakravarthy, Srinivasa; Phillips, Joseph R; Gupta, Ankur; Keri, Szabolcs; Polner, Bertalan; Frank, Michael J; Jahanshahi, Marjan
2016-09-01
Parkinson's disease (PD) is characterized by a range of motor symptoms. Besides the cardinal symptoms (akinesia and bradykinesia, tremor and rigidity), PD patients show additional motor deficits, including: gait disturbance, impaired handwriting, grip force and speech deficits, among others. Some of these motor symptoms (e.g., deficits of gait, speech, and handwriting) have similar clinical profiles, neural substrates, and respond similarly to dopaminergic medication and deep brain stimulation (DBS). Here, we provide an extensive review of the clinical characteristics and neural substrates of each of these motor symptoms, to highlight precisely how PD and its medical and surgical treatments impact motor symptoms. In conclusion, we offer a unified framework for understanding the range of motor symptoms in PD. We argue that various motor symptoms in PD reflect dysfunction of neural structures responsible for action selection, motor sequencing, and coordination and execution of movement. Copyright © 2016 Elsevier Ltd. All rights reserved.
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Liu, Dan; Liu, Xuejun; Wu, Yiguang
2018-04-24
This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local details, the relative depth trends of local regions are incorporated into the network. Combined with semantic information from the image, a continuous pairwise CRF is then established and used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and obtains satisfactory results.
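The continuous pairwise CRF idea can be illustrated with a toy energy: a unary data term on predicted depths plus a pairwise smoothness term whose weights would, in this setting, come from semantic similarity. This is a schematic sketch, not the authors' exact formulation:

```python
import numpy as np

def crf_energy(depth, target, pair_idx, pair_w, lam=0.5):
    """Toy continuous-CRF energy over per-pixel depths (flattened arrays).
    pair_idx: (M, 2) neighbor index pairs; pair_w: (M,) similarity weights."""
    unary = np.sum((depth - target) ** 2)                    # data fidelity
    i, j = pair_idx[:, 0], pair_idx[:, 1]
    pairwise = np.sum(pair_w * (depth[i] - depth[j]) ** 2)   # smoothness
    return unary + lam * pairwise
```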
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
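For readers new to Petri nets, the qualitative token-game semantics underlying all of Snoopy's net classes can be stated in a few lines. Snoopy is a GUI tool, so this Python sketch is purely illustrative:

```python
# Minimal Petri net token game: a transition is enabled when every input
# place holds enough tokens; firing moves tokens from inputs to outputs.
marking = {"A": 2, "B": 0}
transitions = {"t1": ({"A": 1}, {"B": 1})}  # t1 consumes 1 A, produces 1 B

def enabled(t, m):
    pre, _ = transitions[t]
    return all(m[p] >= w for p, w in pre.items())

def fire(t, m):
    assert enabled(t, m)
    pre, post = transitions[t]
    for p, w in pre.items():
        m[p] -= w
    for p, w in post.items():
        m[p] += w

fire("t1", marking)  # marking is now {"A": 1, "B": 1}
```

The stochastic and continuous net classes mentioned above keep this structure but reinterpret transition firing as stochastic events or ODE rate terms.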
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1980-06-01
A theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a... the theoretical framework. In the final section the results to date are discussed.
ERIC Educational Resources Information Center
Palmer, Zsuzsanna Bacsa
2013-01-01
The effects of globalization on communication products and processes have resulted in document features and interactional practices that are sometimes difficult to describe within current theoretical frameworks of inter/transcultural technical communication. Although it has been recognized in our field that the old theoretical frameworks and…
Navigation in unfamiliar cities: a review of the literature and a theoretical framework
Schraagen, J.M.C.
1989-10-02
...The theoretical framework sketched above suggests that some people may be better at encoding spatial information than others. This may be because of their...
Burns, K C; Zotz, G
2010-02-01
Epiphytes are an important component of many forested ecosystems, yet our understanding of epiphyte communities lags far behind that of terrestrial-based plant communities. This discrepancy is exacerbated by the lack of a theoretical context to assess patterns in epiphyte community structure. We attempt to fill this gap by developing an analytical framework to investigate epiphyte assemblages, which we then apply to a data set on epiphyte distributions in a Panamanian rain forest. On a coarse scale, interactions between epiphyte species and host tree species can be viewed as bipartite networks, similar to pollination and seed dispersal networks. On a finer scale, epiphyte communities on individual host trees can be viewed as meta-communities, or suites of local epiphyte communities connected by dispersal. Similar analytical tools are typically employed to investigate species interaction networks and meta-communities, thus providing a unified analytical framework to investigate coarse-scale (network) and fine-scale (meta-community) patterns in epiphyte distributions. Coarse-scale analysis of the Panamanian data set showed that most epiphyte species interacted with fewer host species than expected by chance. Fine-scale analyses showed that epiphyte species richness on individual trees was lower than null model expectations. Therefore, epiphyte distributions were clumped at both scales, perhaps as a result of dispersal limitations. Scale-dependent patterns in epiphyte species composition were observed. Epiphyte-host networks showed evidence of negative co-occurrence patterns, which could arise from adaptations among epiphyte species to avoid competition for host species, while most epiphyte meta-communities were distributed at random. Application of our "meta-network" analytical framework in other locales may help to identify general patterns in the structure of epiphyte assemblages and their variation in space and time.
Genetic Programming for Automatic Hydrological Modelling
NASA Astrophysics Data System (ADS)
Chadalawada, Jayashree; Babovic, Vladan
2017-04-01
One of the recent challenges for the hydrologic research community is the need to develop coupled systems that integrate hydrologic, atmospheric and socio-economic relationships. This calls for novel modelling frameworks that can accurately represent complex systems given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Existing hydrological models vary in conceptualization and process representation, and each is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used to integrate alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework informed by prior understanding and data include: choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for the meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs against a wide range of objective functions and evolves accurate and parsimonious models that capture the dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow-duration-curve-based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).
[Towards a unified theory of the basic forces of the universe ("the theory of everything")].
Aguilar Peris, José
2004-01-01
Numerous efforts have been made to unify all the basic forces of nature. In 1967 the unification of the electromagnetic and weak forces was achieved, and in 1973 a theoretical bridge between the electroweak and strong forces was constructed. This theory awaits experimental confirmation at the CERN Large Hadron Collider. The final stage would be "the theory of everything", which includes the gravitational force. Only the so-called superstring theory is a good candidate to overcome the incompatibility between quantum mechanics and general relativity, but this theory has not yet been achieved.
A Unified Framework for Street-View Panorama Stitching
Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei
2016-01-01
In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured by cameras mounted on a mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Because the input images are captured without a precisely common projection center, from scenes whose depths vary with respect to the cameras to different extents, such images cannot be precisely aligned in geometry. Therefore, an efficient image warping method based on the dense optical flow field is first proposed to greatly suppress the influence of large geometric misalignment. Then, to lessen the influence of photometric inconsistencies caused by illumination variations and different exposure settings, we propose an efficient color correction algorithm that matches extreme points of histograms to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via a graph-cut energy minimization framework. Finally, the Laplacian pyramid blending algorithm is applied to further eliminate stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
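The final blending stage is standard enough to sketch. Assuming seam detection has already produced a soft float mask in [0, 1] with the same shape as the color-corrected images, Laplacian pyramid blending can be written with OpenCV as follows; this is a generic sketch, not the authors' code:

```python
import cv2
import numpy as np

def pyramid_blend(img1, img2, mask, levels=4):
    """Blend two aligned images along a soft seam mask (same shape, in [0,1])."""
    g1 = [img1.astype(np.float32)]
    g2 = [img2.astype(np.float32)]
    gm = [mask.astype(np.float32)]
    for _ in range(levels):                          # Gaussian pyramids
        g1.append(cv2.pyrDown(g1[-1]))
        g2.append(cv2.pyrDown(g2[-1]))
        gm.append(cv2.pyrDown(gm[-1]))
    out = g1[-1] * gm[-1] + g2[-1] * (1 - gm[-1])    # blend coarsest level
    for lvl in range(levels - 1, -1, -1):            # add blended Laplacian bands
        size = (g1[lvl].shape[1], g1[lvl].shape[0])
        lap1 = g1[lvl] - cv2.pyrUp(g1[lvl + 1], dstsize=size)
        lap2 = g2[lvl] - cv2.pyrUp(g2[lvl + 1], dstsize=size)
        out = cv2.pyrUp(out, dstsize=size) + lap1 * gm[lvl] + lap2 * (1 - gm[lvl])
    return np.clip(out, 0, 255).astype(np.uint8)
```

Blending multi-scale bands rather than raw pixels is what suppresses visible seams even when small photometric differences survive the color-correction step.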
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf
2018-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S.; Wu, Xiaowei; Müller, Rolf
2017-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design. PMID:29749977
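The functional mixed model at the core of the two records above has a generic form (the Morris-Carroll formulation; the specific covariates shown are schematic):

```latex
% Echo envelope for observation i as a functional response:
Y_i(t) \;=\; \sum_{j=1}^{p} X_{ij}\, B_j(t)
        \;+\; \sum_{k=1}^{q} Z_{ik}\, U_k(t) \;+\; E_i(t),
% X: fixed-effect design (e.g., terrain type, sonar channel);
% B_j(t): fixed-effect functions; Z: random-effect design capturing the
% hierarchical structure; U_k(t): random-effect functions; E_i(t): residual
% error process. All functions are fitted in a basis (e.g., wavelet) space
% with Bayesian MCMC, which yields the posterior inferences described above.
```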
Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.
Bricq, S; Collet, Ch; Armspach, J P
2008-12-01
In the context of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images that includes partial volume effects, bias field correction, and information given by a probabilistic atlas. The proposed method takes into account neighborhood information using a Hidden Markov Chain (HMC) model. Due to the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is included to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors are incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of different tissue classes. This atlas is considered as a complementary sensor, and the proposed method is extended to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images for which the ground truth is available. Comparison with other frequently used techniques demonstrates the accuracy and robustness of this new Markovian segmentation scheme.
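A schematic observation model consistent with this description (not the authors' exact equations) makes the mixture and bias-field components explicit:

```latex
% Voxel intensity y_v as a bias-modulated partial-volume mixture:
y_v \;=\; b_v \sum_{k=1}^{K} \alpha_{vk}\, \mu_k \;+\; \varepsilon_v,
\qquad \alpha_{vk} \ge 0, \;\; \sum_{k} \alpha_{vk} = 1,
% b_v: multiplicative bias field; \mu_k: pure-tissue mean intensities;
% \alpha_{vk}: tissue fractions in voxel v; \varepsilon_v: noise.
% Spatial dependence enters through the hidden Markov chain prior
% over the ordered voxel sequence.
```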
A Social-Cognitive Theoretical Framework for Examining Music Teacher Identity
ERIC Educational Resources Information Center
McClellan, Edward
2017-01-01
The purpose of the study was to examine a diverse range of research literature to provide a social-cognitive theoretical framework as a foundation for definition of identity construction in the music teacher education program. The review of literature may reveal a theoretical framework based around tenets of commonly studied constructs in the…
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
...program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of... of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework.
Theoretical and Conceptual Frameworks Used in Research on Family-School Partnerships
ERIC Educational Resources Information Center
Yamauchi, Lois A.; Ponte, Eva; Ratliffe, Katherine T.; Traynor, Kevin
2017-01-01
This study investigated the theoretical frameworks used to frame research on family-school partnerships over a five-year period. Although many researchers have described their theoretical approaches, little has been written about the diversity of frameworks used and how they are applied. Coders analyzed 215 journal articles published from 2007 to…
Maharishi International University
ERIC Educational Resources Information Center
Goldberg, Phil
1975-01-01
The director of curriculum development at Maharishi International University describes background and design of the program based on the Science of Creative Intelligence (SCI) as a unifying theoretical structure and on transcendental meditation (TM) for expanding awareness and utilizing videotape technology in its core curriculum courses. (JT)
Brain-Mind Operational Architectonics Imaging: Technical and Methodological Aspects
Fingelkurts, Andrew A; Fingelkurts, Alexander A
2008-01-01
This review paper deals with the methodological and technical foundations of the Operational Architectonics framework of brain and mind functioning. This theory provides a framework for mapping and understanding important aspects of the brain mechanisms that constitute perception, cognition, and eventually consciousness. The methods utilized within the Operational Architectonics framework allow analyzing in remarkable detail the operational behavior of local neuronal assemblies and their joint activity in the form of unified and metastable operational modules, which constitute the whole hierarchy of brain operations, operations of cognition and phenomenal consciousness. PMID:19526071
A Unified Nonlinear Adaptive Approach for Detection and Isolation of Engine Faults
NASA Technical Reports Server (NTRS)
Tang, Liang; DeCastro, Jonathan A.; Zhang, Xiaodong; Farfan-Ramos, Luis; Simon, Donald L.
2010-01-01
A challenging problem in aircraft engine health management (EHM) system development is to detect and isolate faults in system components (i.e., compressor, turbine), actuators, and sensors. Existing nonlinear EHM methods often deal with component faults, actuator faults, and sensor faults separately, which may potentially lead to incorrect diagnostic decisions and unnecessary maintenance. Therefore, it would be ideal to address sensor faults, actuator faults, and component faults under one unified framework. This paper presents a systematic and unified nonlinear adaptive framework for detecting and isolating sensor faults, actuator faults, and component faults for aircraft engines. The fault detection and isolation (FDI) architecture consists of a parallel bank of nonlinear adaptive estimators. Adaptive thresholds are appropriately designed such that, in the presence of a particular fault, all components of the residual generated by the adaptive estimator corresponding to the actual fault type remain below their thresholds. If the faults are sufficiently different, then at least one component of the residual generated by each remaining adaptive estimator should exceed its threshold. Therefore, based on the specific response of the residuals, sensor faults, actuator faults, and component faults can be isolated. The effectiveness of the approach was evaluated using the NASA C-MAPSS turbofan engine model, and simulation results are presented.
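The isolation logic described above reduces to a simple decision rule over the estimator bank: the hypothesis whose residuals all stay below their adaptive thresholds is declared the fault. A minimal sketch (estimator internals and threshold adaptation omitted; names are illustrative):

```python
import numpy as np

def isolate_fault(residuals, thresholds):
    """residuals/thresholds: dicts mapping fault hypothesis -> residual vector.
    A hypothesis survives if ALL of its residual components stay below
    threshold; a unique survivor is the isolated fault."""
    surviving = [h for h, r in residuals.items()
                 if np.all(np.abs(r) < thresholds[h])]
    return surviving[0] if len(surviving) == 1 else None

fault = isolate_fault(
    {"sensor":   np.array([0.1, 0.2]),   # consistent with its thresholds
     "actuator": np.array([1.5, 0.4])},  # exceeds -> hypothesis rejected
    {"sensor":   np.array([0.5, 0.5]),
     "actuator": np.array([0.5, 0.5])},
)
# fault == "sensor"
```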
Reframing Information Literacy as a Metaliteracy
ERIC Educational Resources Information Center
Mackey, Thomas P.; Jacobson, Trudi E.
2011-01-01
Social media environments and online communities are innovative collaborative technologies that challenge traditional definitions of information literacy. Metaliteracy is an overarching and self-referential framework that integrates emerging technologies and unifies multiple literacy types. This redefinition of information literacy expands the…
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
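The replica method mentioned above rests on a single identity, reproduced here for reference (standard statistical-physics material, not specific to this review):

```latex
% Replica trick: the disorder-averaged log partition function is obtained
% from integer moments of Z, analytically continued to n -> 0:
\mathbb{E}[\ln Z] \;=\; \lim_{n \to 0} \frac{\ln \mathbb{E}[Z^n]}{n}
\;=\; \lim_{n \to 0} \frac{\mathbb{E}[Z^n] - 1}{n},
% where E[Z^n] is computed by averaging n coupled copies ("replicas")
% of the system over the quenched disorder.
```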
Banerjee, Kinshuk; Das, Biswajit; Gangopadhyay, Gautam
2013-04-28
In this paper, we have explored generic criteria of cooperative behavior in ion channel kinetics, treating it on the same footing as multistate receptor-ligand binding in a compact theoretical framework. We have shown that the characterization of cooperativity of ion channels in terms of the Hill coefficient violates the standard Hill criteria defined for allosteric cooperativity of ligand binding. To resolve the issue, an alternative measure of cooperativity is proposed here in terms of a cooperativity index that sets unified criteria for both systems. More importantly, for ion channels this index can be very useful for describing cooperative kinetics, as it can be readily determined from the experimentally measured ionic current combined with theoretical modelling. We have analyzed the correlation between the voltage value and the slope of the voltage-activation curve at the half-activation point and consequently determined the standard free energy of activation of the ion channel using two well-established mechanisms of cooperativity, namely the Koshland-Nemethy-Filmer (KNF) and Monod-Wyman-Changeux (MWC) models. Comparison of the theoretical results for both models with appropriate experimental data on mutational perturbation of the Shaker K(+) channel supports the experimental finding that the KNF model is more suitable for describing the cooperative behavior of this class of ion channels, whereas the performance of the MWC model is unsatisfactory. We have also estimated the mechanistic performance through the standard free energy of channel activation for both models and propose a possible functional disadvantage in the MWC scheme.
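For reference, the standard Hill analysis the authors argue against is as follows (the paper's alternative cooperativity index is its own construction and is not reproduced here):

```latex
% Hill function for fractional saturation \theta at ligand concentration L,
% and the Hill coefficient as the slope of the Hill plot at half-saturation:
\theta(L) \;=\; \frac{L^{\,n_H}}{K^{\,n_H} + L^{\,n_H}},
\qquad
n_H \;=\; \left. \frac{d \ln\!\big(\theta/(1-\theta)\big)}{d \ln L} \right|_{\theta = 1/2}.
% For voltage-gated channels, the analogous slope is read from the
% open-probability-versus-voltage curve at the half-activation point.
```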
A UML profile for framework modeling.
Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong
2004-01-01
The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extensibility due to its lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling is presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams are defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns is also put forward, such that the profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. It is shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.
Provenzi, Livio; Scotto di Minico, Giunia; Giusti, Lorenzo; Guida, Elena; Müller, Mitho
2018-01-01
Background: During the last decades, the research on mother-infant dyad has produced a great amount of data, methods and theories, which largely contributed to set a revolution in the way we look at developmental changes during infancy and childhood. Very different constructs depict the different aspects of the "dyadic dance" occurring between a mother and her infant; nonetheless, a comprehensive and consistent systematization of these concepts in a coherent theoretical landscape is still lacking. Aim: In the present work, we aim at disentangling the different theoretical and methodological definitions of 9 dyadic constructs and we highlight their effects on infants' and children developmental outcomes. Methods: A literature search has been conducted on three databases-PubMed, Scopus, Web of Science. Three different reviews are reported here: (1) a review on the theoretical definitions of dyadic constructs; (2) a review of operational definitions, settings and methods of dyadic processes; (3) a systematic review of dyadic processes' outcomes for infants' and children developmental trajectories. Results: Two constructs emerged as wide meta-theoretical concepts (reciprocity and mutuality) and seven described specific processes (attunement, contingency, coordination, matching, mirroring, reparation, synchrony). A global model resuming the relationships among different processes is reported, which highlights the emergence of two specific cycles of dyadic functioning (i.e., matching-mismatching-reparation-synchrony; contingency, coordination, attunement, mirroring). A comprehensive review of the adopted measures is also provided. Finally, all the processes provided significant contributions to infants' behavioral, cognitive, and socio-emotional development during the first 3 years of age, but limited research has been conducted on specific processes (e.g. reparation and mirroring). Conclusion: The present study provides an original research-grounded framework to consider the different nature of mother-infant dyadic processes within a unified dyadic eco-system. Different levels of evidence emerged for the role of diverse mother-infant dyadic processes on infants' and children development. Open questions and future research directions are highlighted.
Provenzi, Livio; Scotto di Minico, Giunia; Giusti, Lorenzo; Guida, Elena; Müller, Mitho
2018-01-01
Background: During the last decades, the research on mother-infant dyad has produced a great amount of data, methods and theories, which largely contributed to set a revolution in the way we look at developmental changes during infancy and childhood. Very different constructs depict the different aspects of the “dyadic dance” occurring between a mother and her infant; nonetheless, a comprehensive and consistent systematization of these concepts in a coherent theoretical landscape is still lacking. Aim: In the present work, we aim at disentangling the different theoretical and methodological definitions of 9 dyadic constructs and we highlight their effects on infants' and children developmental outcomes. Methods: A literature search has been conducted on three databases—PubMed, Scopus, Web of Science. Three different reviews are reported here: (1) a review on the theoretical definitions of dyadic constructs; (2) a review of operational definitions, settings and methods of dyadic processes; (3) a systematic review of dyadic processes' outcomes for infants' and children developmental trajectories. Results: Two constructs emerged as wide meta-theoretical concepts (reciprocity and mutuality) and seven described specific processes (attunement, contingency, coordination, matching, mirroring, reparation, synchrony). A global model resuming the relationships among different processes is reported, which highlights the emergence of two specific cycles of dyadic functioning (i.e., matching-mismatching-reparation-synchrony; contingency, coordination, attunement, mirroring). A comprehensive review of the adopted measures is also provided. Finally, all the processes provided significant contributions to infants' behavioral, cognitive, and socio-emotional development during the first 3 years of age, but limited research has been conducted on specific processes (e.g. reparation and mirroring). Conclusion: The present study provides an original research-grounded framework to consider the different nature of mother-infant dyadic processes within a unified dyadic eco-system. Different levels of evidence emerged for the role of diverse mother-infant dyadic processes on infants' and children development. Open questions and future research directions are highlighted. PMID:29615947
Seven Basic Steps to Solving Ethical Dilemmas in Special Education: A Decision-Making Framework
ERIC Educational Resources Information Center
Stockall, Nancy; Dennis, Lindsay R.
2015-01-01
This article presents a seven-step framework for decision making to solve ethical issues in special education. The authors developed the framework from the existing literature and theoretical frameworks of justice, critique, care, and professionalism. The authors briefly discuss each theoretical framework and then describe the decision-making…
Unified beam splitter of fused silica grating under the second Bragg incidence.
Sun, Zhumei; Zhou, Changhe; Cao, Hongchao; Wu, Jun
2015-11-01
A unified design for a 1×2 beam splitter based on dielectric rectangular transmission gratings under the second Bragg incidence is theoretically investigated for TE- and TM-polarized light. Empirical equations for the relative grating parameters (the ratio of the absolute parameters to the incident wavelength) of this design are also obtained with the simplified modal method (SMM). The influences of the polarization of the incident light and the relative grating parameters on the performance of the beam splitter are thoroughly studied based on the SMM and rigorous coupled-wave analysis. Two specific gratings are demonstrated with an even split and high diffraction efficiency (>94% for TE polarization and >97% for the TM counterpart). The unified profiles of the 1×2 beam splitter are independent of the incidence wavelength, since the refractive index of fused silica is roughly constant over a wide range of wavelengths, which should be promising for future applications.
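The geometry behind "second Bragg incidence" can be summarized with the grating equation (a standard relation in one common sign convention; not the paper's derivation):

```latex
% Transmission grating of period d, incidence angle \theta_{in}, order m:
d\,\big(\sin\theta_m - \sin\theta_{in}\big) \;=\; m\lambda,
\qquad
\text{$m$-th Bragg incidence: } \sin\theta_{in} = \frac{m\lambda}{2d}.
% For m = 2, \sin\theta_{in} = \lambda/d, so the -2nd order exits
% symmetrically to the 0th order, enabling an even 1x2 split.
```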
A unified account of gloss and lightness perception in terms of gamut relativity.
Vladusich, Tony
2013-08-01
A recently introduced computational theory of visual surface representation, termed gamut relativity, overturns the classical assumption that brightness, lightness, and transparency constitute perceptual dimensions corresponding to the physical dimensions of luminance, diffuse reflectance, and transmittance, respectively. Here I extend the theory to show how surface gloss and lightness can be understood in a unified manner in terms of the vector computation of "layered representations" of surface and illumination properties, rather than as perceptual dimensions corresponding to diffuse and specular reflectance, respectively. The theory simulates the effects of image histogram skewness on surface gloss/lightness and lightness constancy as a function of specular highlight intensity. More generally, gamut relativity clarifies, unifies, and generalizes a wide body of previous theoretical and experimental work aimed at understanding how the visual system parses the retinal image into layered representations of surface and illumination properties.
2009-08-05
Socio-cultural data acquisition, extraction, and management. First the idea of a theoretical framework will be very briefly discussed as well as… SUBJECT TERMS: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power, human laboratory…
ERIC Educational Resources Information Center
Ornek, Funda
2008-01-01
One or more theoretical frameworks or orientations are used in qualitative education research. In this paper, the main tenets, the background and the appropriateness of phenomenography, which is one of the theoretical frameworks used in qualitative research, will be depicted. Further, the differences among phenomenography, phenomenology and…
Using a Theoretical Framework of Institutional Culture to Analyse an Institutional Strategy Document
ERIC Educational Resources Information Center
Jacobs, Anthea Hydi Maxine
2016-01-01
This paper builds on a conceptual analysis of institutional culture in higher education. A theoretical framework was proposed to analyse institutional documents of two higher education institutions in the Western Cape, for the period 2002 to 2012 (Jacobs 2012). The elements of this theoretical framework are "shared values and beliefs",…
ERIC Educational Resources Information Center
Asiri, Mohammed J. Sherbib; Mahmud, Rosnaini bt; Bakar, Kamariah Abu; Ayub, Ahmad Fauzi bin Mohd
2012-01-01
The purpose of this paper is to present the theoretical framework underlying a research on factors that influence utilization of the Jusur Learning Management System (Jusur LMS) in Saudi Arabian public universities. Development of the theoretical framework was done based on library research approach. Initially, the existing literature relevant to…
Hydrodynamics of soft active matter
NASA Astrophysics Data System (ADS)
Marchetti, M. C.; Joanny, J. F.; Ramaswamy, S.; Liverpool, T. B.; Prost, J.; Rao, Madan; Simha, R. Aditi
2013-07-01
This review summarizes theoretical progress in the field of active matter, placing it in the context of recent experiments. This approach offers a unified framework for the mechanical and statistical properties of living matter: biofilaments and molecular motors in vitro or in vivo, collections of motile microorganisms, animal flocks, and chemical or mechanical imitations. A major goal of this review is to integrate several approaches proposed in the literature, from semimicroscopic to phenomenological. In particular, first considered are “dry” systems, defined as those where momentum is not conserved due to friction with a substrate or an embedding porous medium. The differences and similarities between two types of orientationally ordered states, the nematic and the polar, are clarified. Next, the active hydrodynamics of suspensions or “wet” systems is discussed and the relation with and difference from the dry case, as well as various large-scale instabilities of these nonequilibrium states of matter, are highlighted. Various semimicroscopic derivations of the continuum theory are discussed and connected, highlighting the unifying and generic nature of the continuum model. Throughout the review, the experimental relevance of these theories for describing bacterial swarms and suspensions, the cytoskeleton of living cells, and vibrated granular material is discussed. Promising extensions toward greater realism in specific contexts from cell biology to animal behavior are suggested, and remarks are given on some exotic active-matter analogs. Last, the outlook for a quantitative understanding of active matter, through the interplay of detailed theory with controlled experiments on simplified systems, with living or artificial constituents, is summarized.
A Theoretical Foundation for Understanding Clergy-Perpetrated Sexual Abuse
ERIC Educational Resources Information Center
Fogler, Jason M.; Shipherd, Jillian C.; Rowe, Erin; Jensen, Jennifer; Clarke, Stephanie
2008-01-01
Incorporating elements from broadband theories of psychological adaptation to extreme adversity, including Summit's (1983) Child Sexual Abuse Accommodation Syndrome, Finkelhor and Browne's (1986) Traumagenic Dynamics Model of sexual abuse, and Pyszczynski and colleagues' (1997) Terror Management Theory, this paper proposes a unified theoretical…
Optimization in Bilingual Language Use
ERIC Educational Resources Information Center
Bhatt, Rakesh M.
2013-01-01
Pieter Muysken's keynote paper, "Language contact outcomes as a result of bilingual optimization strategies", undertakes an ambitious project to theoretically unify different empirical outcomes of language contact, for instance, SLA, pidgins and Creoles, and code-switching. Muysken has dedicated a life-time to researching, rather…
Psychophysical Laws and the Superorganism.
Reina, Andreagiovanni; Bose, Thomas; Trianni, Vito; Marshall, James A R
2018-03-12
Through theoretical analysis, we show how a superorganism may react to stimulus variations according to psychophysical laws observed in humans and other animals. We investigate an empirically motivated honeybee house-hunting model, which describes a value-sensitive decision process over potential nest sites at the level of the colony. In this study, we show how colony decision time increases with the number of available nests, in agreement with the Hick-Hyman law of psychophysics, and decreases with mean nest quality, in agreement with Piéron's law. We also show that colony error rate depends on mean nest quality and on the difference in quality, in agreement with Weber's law. Psychophysical laws, particularly Weber's law, have been found in diverse species, including unicellular organisms. Our theoretical results predict that superorganisms may also exhibit such behaviour, suggesting that these laws arise from fundamental mechanisms of information processing and decision-making. Finally, we propose a combined psychophysical law which unifies Hick-Hyman's law and Piéron's law, traditionally studied independently; this unified law makes predictions that can be empirically tested.
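The three laws referenced above have standard textbook forms, which may help connect the colony-level findings to their classical statements; the parameterization below is the conventional one, not necessarily the paper's:

```latex
\begin{align*}
  T &= a + b \log_2 n            && \text{(Hick--Hyman: decision time vs. number of options $n$)} \\
  T &= T_0 + k\, I^{-\beta}      && \text{(Pi\'eron: decision time vs. mean stimulus intensity $I$)} \\
  \frac{\Delta I}{I} &= c        && \text{(Weber: just-discriminable difference vs. baseline intensity)}
\end{align*}
```

In the house-hunting setting, $n$ plays the role of the number of available nests and $I$ the mean nest quality; the combined law proposed in the paper joins the first two dependencies in a single expression.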
Ravioli, Antonio Franco; Soárez, Patrícia Coelho De; Scheffer, Mário César
2018-01-01
The current study aimed to systematically analyze trends and priorities in the theoretical and conceptual approaches and empirical studies on specific health services management modalities in the Brazilian Unified National Health System. A narrative review of the literature identified, in 33 publications, the location and nature of services, management models, methodological procedures, and study outcomes. The research deals mainly with the models' conceptual and legal characteristics and management practices, in addition to addressing contracts, procurement, human resources, financing, and control mechanisms. In conclusion, the literature is limited and concentrated in the State of São Paulo, showing little theoretical diversity and methodological weaknesses, while it is nonconclusive as to the superiority of one management model over another. New evaluation studies are needed that are capable of comparing different models and assessing their performance and their effects on the quality of health services' provision, the population's health, and the health system's organization.
COMPLEMENTARITY OF ECOLOGICAL GOAL FUNCTIONS
This paper summarizes, in the framework of network environ analysis, a set of analyses of energy-matter flow and storage in steady state systems. The network perspective is used to codify and unify ten ecological orientors or external principles: maximum power (Lotka), maximum st...
Chimaera simulation of complex states of flowing matter
2016-01-01
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031
NASA Astrophysics Data System (ADS)
Fernandes, Geraldo W. Rocha; Rodrigues, António M.; Ferreira, Carlos Alberto
2018-03-01
This article aims to characterise the research on science teachers' professional development programs that support the use of Information and Communication Technologies (ICTs) and the main trends concerning the theoretical frameworks (theoretical foundation, literature review or background) that underpin these studies. Through a systematic review of the literature, 76 articles were found and organised into two axes, with their categories, concerning the training of science teachers and the use of digital technologies. The first axis (characterisation of articles) presents the key features of the selected articles (major subjects, training and actions for professional development, and major ICT tools and digital resources). The second axis (trends in theoretical frameworks) has three categories of theoretical frameworks, emphasising (a) digital technologies, (b) prospects of curricular renewal, and (c) cognitive processes. It also identifies a group of articles whose theoretical frameworks contain multiple elements without deepening any of them, or that even lack a theoretical framework supporting the study. In this review, we found that many professional development programs for teachers still use inadequate strategies for bringing about change in teacher practices. New professional development proposals are emerging with the objective of minimising such difficulties, and this analysis could be a helpful tool for restructuring those proposals.
A Review of Research on Driving Styles and Road Safety.
Sagberg, Fridulv; Selpi; Piccinini, Giulio Francesco Bianchi; Engström, Johan
2015-11-01
The aim of this study was to outline a conceptual framework for understanding driving style and, on this basis, review the state-of-the-art research on driving styles in relation to road safety. Previous research has indicated a relationship between the driving styles adopted by drivers and their crash involvement. However, a comprehensive literature review of driving style research is lacking. A systematic literature search was conducted, including empirical, theoretical, and methodological research, on driving styles related to road safety. A conceptual framework was proposed whereby driving styles are viewed in terms of driving habits established as a result of individual dispositions as well as social norms and cultural values. Moreover, a general scheme for categorizing and operationalizing driving styles was suggested. On this basis, existing literature on driving styles and indicators was reviewed. Links between driving styles and road safety were identified and individual and sociocultural factors influencing driving style were reviewed. Existing studies have addressed a wide variety of driving styles, and there is an acute need for a unifying conceptual framework in order to synthesize these results and make useful generalizations. There is a considerable potential for increasing road safety by means of behavior modification. Naturalistic driving observations represent particularly promising approaches to future research on driving styles. Knowledge about driving styles can be applied in programs for modifying driver behavior and in the context of usage-based insurance. It may also be used as a means for driver identification and for the development of driver assistance systems. © 2015, Human Factors and Ergonomics Society.
A theoretical framework to support research of health service innovation.
Fox, Amanda; Gardner, Glenn; Osborne, Sonya
2015-02-01
Health service managers and policy makers are increasingly concerned about the sustainability of innovations implemented in health care settings. The increasing demand on health services requires that innovations are both effective and sustainable; however, research in this field is limited, with multiple disciplines, approaches and paradigms influencing the field. These variations prevent a cohesive approach, and therefore the accumulation of research findings, in the development of a body of knowledge. The purpose of this paper is to examine the research findings thoroughly and to propose an appropriate theoretical framework for examining the sustainability of health service innovation. This paper presents an integrative review of the literature available in relation to the sustainability of health service innovation and develops a theoretical framework based on integration and synthesis of the literature. A theoretical framework serves to guide research, determine variables, influence data analysis and is central to the quest for ongoing knowledge development. This research outlines the sustainability-of-innovation framework, a theoretical framework suitable for examining the sustainability of health service innovation. If left unaddressed, health services research will continue in an ad hoc manner, preventing full utilisation of outcomes, recommendations and knowledge for effective provision of health services. The sustainability-of-innovation theoretical framework provides an operational basis upon which reliable future research can be conducted.
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified; thus, we designed a realist ontology based on Basic Formal Ontology to support our framework, in conjunction with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
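To make the mediation idea concrete, the sketch below shows how a single eligibility criterion stated against a global model might be rewritten per data source in a local-as-view fashion. All class, table, and field names here are invented for illustration; they are not CDIM's actual classes or TRANSFoRm's schema.

```python
# Hypothetical sketch of local-as-view mediation: an eligibility criterion is
# authored once against a global information model and rewritten per source.

# A criterion expressed against the (hypothetical) global model
criterion = {
    "concept": "diabetes mellitus type 2",
    "terminology": "ICD-10",
    "code": "E11",
    "constraint": {"age_min": 40},
}

# Each source registers a mapping from global-model terms to its local schema
SOURCE_VIEWS = {
    "ehr_a": {"table": "diagnoses", "code_col": "icd10", "age_col": "age_years"},
    "registry_b": {"table": "dx_events", "code_col": "dx_code", "age_col": "pat_age"},
}

def rewrite(criterion, source):
    """Rewrite the global criterion into a source-specific SQL string."""
    v = SOURCE_VIEWS[source]
    return (
        f"SELECT patient_id FROM {v['table']} "
        f"WHERE {v['code_col']} LIKE '{criterion['code']}%' "
        f"AND {v['age_col']} >= {criterion['constraint']['age_min']}"
    )

for src in SOURCE_VIEWS:
    print(src, "->", rewrite(criterion, src))
```

The design point of local-as-view mediation is that the criterion is written once, against the global model, while each source only maintains its own mapping.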
Pinto, Rogério M; da Silva, Sueli Bulhões; Soriano, Rafaela
2012-03-01
Community health workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis: how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed community-based participatory research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008-10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies (i.e., empathic communication and perseverance) to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. Copyright © 2012 Elsevier Ltd. All rights reserved.
Pinto, Rogério M.; da Silva, Sueli Bulhões; Soriano, Rafaela
2012-01-01
Community Health Workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis – how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed Community-Based Participatory Research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008–10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies – i.e., empathic communication and perseverance – to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. PMID:22305469
Wibral, Michael; Priesemann, Viola; Kay, Jim W; Lizier, Joseph T; Phillips, William A
2017-03-01
In many neural systems, anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neocortex, which is repeated across cortical areas, and is involved in a number of different tasks (e.g., sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition, such a goal function, if universal, cannot be cast in processing-domain-specific language (e.g., 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID allows one to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information-theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows one to compare these goal functions in a common framework, and also provides a versatile approach for designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which builds on combining external input and prior knowledge in a synergistic manner. We suggest that this novel goal function may be highly useful in neural information processing. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
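As a concrete illustration of the PID quantities, the sketch below computes the original Williams-Beer decomposition for a binary XOR, where the output is informed only synergistically by the two inputs. This uses the I_min redundancy measure for two sources; other PID measures exist and the paper's analyses may rely on them.

```python
import numpy as np
from itertools import product

# Joint distribution p[x1, x2, s] for binary XOR: s = x1 ^ x2, uniform inputs.
p = np.zeros((2, 2, 2))
for x1, x2 in product(range(2), range(2)):
    p[x1, x2, x1 ^ x2] = 0.25

def mi(pxs):
    """Mutual information I(X;S) from a joint table p[x, s], in bits."""
    px = pxs.sum(axis=1, keepdims=True)
    ps = pxs.sum(axis=0, keepdims=True)
    nz = pxs > 0
    return float((pxs[nz] * np.log2(pxs[nz] / (px * ps)[nz])).sum())

def specific_info(pxs, s):
    """Williams-Beer specific information I(S=s; X), in bits."""
    ps = pxs.sum(axis=0)   # p(s)
    px = pxs.sum(axis=1)   # p(x)
    val = 0.0
    for x in range(pxs.shape[0]):
        if pxs[x, s] > 0:
            p_x_given_s = pxs[x, s] / ps[s]
            p_s_given_x = pxs[x, s] / px[x]
            val += p_x_given_s * np.log2(p_s_given_x / ps[s])
    return val

p1 = p.sum(axis=1)      # marginal p[x1, s]
p2 = p.sum(axis=0)      # marginal p[x2, s]
p12 = p.reshape(4, 2)   # joint source p[(x1, x2), s]
ps = p.sum(axis=(0, 1))

redundancy = sum(ps[s] * min(specific_info(p1, s), specific_info(p2, s))
                 for s in range(2))
unique1 = mi(p1) - redundancy
unique2 = mi(p2) - redundancy
synergy = mi(p12) - unique1 - unique2 - redundancy
print(redundancy, unique1, unique2, synergy)   # XOR: 0, 0, 0, 1 bit
```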
Unifying diffusion and seepage for nonlinear gas transport in multiscale porous media
NASA Astrophysics Data System (ADS)
Song, Hongqing; Wang, Yuhe; Wang, Jiulong; Li, Zhengyi
2016-09-01
We unify the diffusion and seepage processes for nonlinear gas transport in multiscale porous media via a proposed new general transport equation. A coherent theoretical derivation indicates that wall-molecule and molecule-molecule collisions drive the Knudsen and collective diffusive fluxes, respectively, and constitute the system pressure across the porous media. A new term, the nominal diffusion coefficient, subsumes the Knudsen and collective diffusion coefficients. Physical and numerical experiments support the new formulation and provide approaches for obtaining the diffusion coefficient and permeability simultaneously. This work has important implications for natural gas extraction and greenhouse gas sequestration in geological formations.
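A minimal numerical sketch of how wall-molecule and molecule-molecule collisions can be folded into one effective coefficient: the Knudsen coefficient follows from kinetic theory, and a classical Bosanquet-type interpolation combines it with the bulk coefficient. The numbers are illustrative, and this interpolation is a textbook device, not the paper's new general transport equation.

```python
import numpy as np

R = 8.314       # J/(mol K), gas constant
T = 350.0       # K, temperature (illustrative)
M = 0.016       # kg/mol, methane molar mass
d = 50e-9       # m, representative pore diameter (illustrative)
D_bulk = 2e-5   # m^2/s, bulk (molecular) diffusion coefficient (illustrative)

# Knudsen diffusion coefficient from kinetic theory: wall-molecule collisions
D_K = (d / 3.0) * np.sqrt(8.0 * R * T / (np.pi * M))

# Bosanquet-type interpolation combining wall-molecule and molecule-molecule
# collisions into a single effective ("nominal") coefficient
D_nominal = 1.0 / (1.0 / D_K + 1.0 / D_bulk)

print(f"D_K = {D_K:.3e} m^2/s, D_nominal = {D_nominal:.3e} m^2/s")
```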
Surface electrical properties experiment, Part 3
NASA Technical Reports Server (NTRS)
1974-01-01
A complete unified discussion of the electromagnetic response of a plane stratified structure is reported. A detailed and comprehensive analysis of the theoretical parts of the electromagnetic problem is given. The numerical problem of computing values of the electromagnetic field strengths is discussed. It is shown that the analysis of the conductive media is not very far removed from the theoretical analysis and the numerical difficulties are not as acute as for the low-loss problem. For Vol. 1, see N75-15570; for Vol. 2 see N75-15571.
Large-scale Cross-modality Search via Collective Matrix Factorization Hashing.
Ding, Guiguang; Guo, Yuchen; Zhou, Jile; Gao, Yue
2016-09-08
By transforming data into binary representation, i.e., Hashing, we can perform high-speed search with low storage cost, and thus Hashing has attracted increasing research interest in recent years. Recently, how to generate Hashcodes for multimodal data (e.g., images with textual tags, documents with photos, etc.) for large-scale cross-modality search (e.g., searching semantically related images in a database for a document query) has become an important research issue because of the fast growth of multimodal data in the Web. To address this issue, a novel framework for multimodal Hashing is proposed, termed Collective Matrix Factorization Hashing (CMFH). The key idea of CMFH is to learn unified Hashcodes for different modalities of one multimodal instance in the shared latent semantic space in which different modalities can be effectively connected. Therefore, accurate cross-modality search is supported. Based on the general framework, we extend it in the unsupervised scenario where it tries to preserve the Euclidean structure, and in the supervised scenario where it fully exploits the label information of the data. The corresponding theoretical analysis and the optimization algorithms are given. We conducted comprehensive experiments on three benchmark datasets for cross-modality search. The experimental results demonstrate that CMFH can significantly outperform several state-of-the-art cross-modality Hashing methods, which validates the effectiveness of the proposed CMFH.
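A minimal unsupervised sketch of the collective-matrix-factorization idea behind CMFH: two modality matrices are factorized against one shared latent representation, which is then binarized into unified hash codes. The full method additionally learns per-modality projections for out-of-sample queries and a supervised variant; the dimensions and data here are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d1, d2, k = 200, 64, 32, 16          # samples, "image" dim, "text" dim, code bits

# Toy paired data: both modalities generated from a shared latent factor
Z = rng.normal(size=(k, n))
X1 = rng.normal(size=(d1, k)) @ Z + 0.1 * rng.normal(size=(d1, n))   # "image" view
X2 = rng.normal(size=(d2, k)) @ Z + 0.1 * rng.normal(size=(d2, n))   # "text" view

lam, I = 1e-2, np.eye(k)
U1 = rng.normal(size=(d1, k))
U2 = rng.normal(size=(d2, k))
V = rng.normal(size=(k, n))

# Alternating least squares on ||X1 - U1 V||^2 + ||X2 - U2 V||^2 + regularization
for _ in range(50):
    U1 = X1 @ V.T @ np.linalg.inv(V @ V.T + lam * I)
    U2 = X2 @ V.T @ np.linalg.inv(V @ V.T + lam * I)
    V = np.linalg.inv(U1.T @ U1 + U2.T @ U2 + lam * I) @ (U1.T @ X1 + U2.T @ X2)

# Unified hash codes: binarize the shared latent representation
B = np.sign(V)          # one k-bit code per instance, shared across modalities
print(B.shape)          # (16, 200)
```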
NASA Astrophysics Data System (ADS)
Ivancevic, Vladimir
2016-07-01
The topic of the review article [1] is the derivation of a multiscale paradigm for the modeling of fibrosis. First, the biological processes of physiological and pathological fibrosis, including therapeutic actions, are reviewed. Fibrosis can be a consequence of tissue damage, infections and autoimmune diseases, foreign material, or tumors. Some questions regarding the pathogenesis, progression and possible regression of fibrosis remain open. At each scale of observation, different theoretical tools coming from computational, mathematical and physical biology have been proposed. However, a complete framework that takes into account the different mechanisms occurring at different scales is still missing. Therefore, with the main aim of defining a multiscale approach for the modeling of fibrosis, the authors of [1] have presented different top-down and bottom-up approaches that have been developed in the literature. Specifically, their description refers to models for fibrosis diseases based on ordinary and partial differential equations, agents [2], thermostatted kinetic theory [3-5], coarse-grained structures [6-8] and constitutive laws for fibrous collagen networks [9]. A critical analysis is provided for all frameworks discussed in the paper. Open problems and future research directions referring to both the biology and the modeling of fibrosis are presented. The paper concludes with the ambitious aim of a multiscale model.
Griffiths, Kristi R.; Morris, Richard W.; Balleine, Bernard W.
2014-01-01
The ability to learn contingencies between actions and outcomes in a dynamic environment is critical for flexible, adaptive behavior. Goal-directed actions adapt to changes in action-outcome contingencies as well as to changes in the reward-value of the outcome. When networks involved in reward processing and contingency learning are maladaptive, this fundamental ability can be lost, with detrimental consequences for decision-making. Impaired decision-making is a core feature in a number of psychiatric disorders, ranging from depression to schizophrenia. The argument can be developed, therefore, that seemingly disparate symptoms across psychiatric disorders can be explained by dysfunction within common decision-making circuitry. From this perspective, gaining a better understanding of the neural processes involved in goal-directed action, will allow a comparison of deficits observed across traditional diagnostic boundaries within a unified theoretical framework. This review describes the key processes and neural circuits involved in goal-directed decision-making using evidence from animal studies and human neuroimaging. Select studies are discussed to outline what we currently know about causal judgments regarding actions and their consequences, action-related reward evaluation, and, most importantly, how these processes are integrated in goal-directed learning and performance. Finally, we look at how adaptive decision-making is impaired across a range of psychiatric disorders and how deepening our understanding of this circuitry may offer insights into phenotypes and more targeted interventions. PMID:24904322
Charras, Guillaume T; Mitchison, Timothy J; Mahadevan, L
2009-09-15
Water is the dominant ingredient of cells and its dynamics are crucial to life. We and others have suggested a physical picture of the cell as a soft, fluid-infiltrated sponge, surrounded by a water-permeable barrier. To understand water movements in an animal cell, we imposed an external, inhomogeneous osmotic stress on cultured cancer cells. This forced water through the membrane on one side, and out on the other. Inside the cell, it created a gradient in hydration, which we visualized by tracking cellular responses using natural organelles and artificially introduced quantum dots. The dynamics of these markers at short times were the same for normal and metabolically poisoned cells, indicating that the cellular responses are primarily physical rather than chemical. Our finding of an internal gradient in hydration is inconsistent with a continuum model for cytoplasm, but consistent with the sponge model, and implies that the effective pore size of the sponge is small enough to retard water flow significantly on time scales (approximately 10-100 seconds) relevant to cell physiology. We interpret these data in terms of a theoretical framework that combines mechanics and hydraulics in a multiphase poroelastic description of the cytoplasm and explains the experimentally observed dynamics quantitatively in terms of a few coarse-grained parameters that are based on microscopically measurable structural, hydraulic and mechanical properties. Our fluid-filled sponge model could provide a unified framework to understand a number of disparate observations in cell morphology and motility.
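The poroelastic picture implies a concrete scaling for how fast a hydration gradient relaxes; the standard form is given below in our notation, which is not necessarily the authors' exact parameterization.

```latex
% Effective poroelastic diffusivity and equilibration time over a length L:
% k = sponge permeability (~ pore size squared), E = drained elastic modulus,
% \mu = fluid viscosity.
D_p \sim \frac{k E}{\mu}, \qquad
\tau \sim \frac{L^2}{D_p} = \frac{\mu L^2}{k E}
```

A small effective pore size drives the permeability k down and pushes the equilibration time toward the 10-100 s range quoted above.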
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machnes, S.; Institute for Theoretical Physics, University of Ulm, D-89069 Ulm; Sander, U.
2011-08-15
To pave the way toward novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods which update all controls concurrently, and Krotov-type methods which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
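For readers unfamiliar with the pulse parameterization, the sketch below runs a minimal GRAPE-style loop for a single qubit: piecewise-constant amplitudes are iteratively reshaped to maximize gate fidelity. It uses finite-difference gradients for brevity (GRAPE proper uses analytic gradients, and DYNAMO supplies many gradient and update variants); the target gate and all parameters are illustrative.

```python
import numpy as np
from scipy.linalg import expm

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
H0, Hc = sz, sx                        # drift and control Hamiltonians
N, dt = 20, 0.1                        # number and duration of pulse segments
U_target = expm(-1j * (np.pi / 2) * sx)  # X gate, up to global phase

def propagator(u):
    U = np.eye(2, dtype=complex)
    for uk in u:                       # piecewise-constant control amplitudes
        U = expm(-1j * (H0 + uk * Hc) * dt) @ U
    return U

def fidelity(u):
    return abs(np.trace(U_target.conj().T @ propagator(u)) / 2) ** 2

rng = np.random.default_rng(0)
u = 0.5 * rng.normal(size=N)           # random initial pulse shape
eps, lr = 1e-6, 2.0
for _ in range(200):                   # gradient ascent on the gate fidelity
    base = fidelity(u)
    grad = np.array([(fidelity(u + eps * e) - base) / eps for e in np.eye(N)])
    u += lr * grad
print(f"final fidelity: {fidelity(u):.4f}")
```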
NASA Technical Reports Server (NTRS)
2005-01-01
A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature air frame and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials is assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.
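Schematically, a fully associative potential-based model has the following skeleton, in generic notation (sign conventions and the specific GVIPS potentials differ in detail):

```latex
% Total strain splits into elastic and inelastic parts; the elastic response
% derives from the Gibbs complementary free energy \Phi, and the inelastic
% flow follows normality on the complementary dissipation potential \Omega,
% driven by the effective stress \Sigma = \sigma - \alpha (back stress \alpha):
\varepsilon = \varepsilon^{e} + \varepsilon^{I}, \qquad
\varepsilon^{e} = \frac{\partial(-\Phi)}{\partial \sigma}, \qquad
\dot{\varepsilon}^{I} = \frac{\partial \Omega}{\partial \Sigma}
```

Full associativity means the flow rule and the internal-state evolution both derive from the same pair of potentials, which makes the model thermodynamically consistent by construction.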
The "New" Economics of Education: Towards a "Unified" Macro/Micro-Educational Planning Policy.
ERIC Educational Resources Information Center
Kraft, Richard H.; Nakib, Yasser
1991-01-01
Takes issue with conventional human capital theory, questioning assumptions regarding external benefits, internal efficiency, educational purposes, and returns-to-education and manpower needs approaches. Reviews new theoretical directions regarding supply and demand, socialization, labor market segmentation, and overeducation and undereducation,…
ERIC Educational Resources Information Center
Atwood, Margaret
1976-01-01
Basic to Library-College thought is the Communication Way. Such a construct is theoretical in the sense it combines the structure of a discipline and the structure of a literature into a system which enables the learner to see that finding and thinking about given subject matter is a unified process. (Author)
Classical Markov Chains: A Unifying Framework for Understanding Avian Reproductive Success
Traditional methods for monitoring and analysis of avian nesting success have several important shortcomings, including 1) inability to handle multiple classes of nest failure, and 2) inability to provide estimates of annual reproductive success (because birds can, and typically ...
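A minimal sketch of the Markov-chain view of nest fates: daily transitions among an active state, several absorbing failure classes, and fledging. The transition probabilities are illustrative, not field estimates; annual reproductive success would chain several such attempts.

```python
import numpy as np

# Daily-fate Markov chain for one nest. States:
# 0 = active, 1 = failed (predation), 2 = failed (other), 3 = fledged
P = np.array([
    [0.95, 0.03, 0.01, 0.01],
    [0.00, 1.00, 0.00, 0.00],
    [0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 1.00],
])

days = 28                                    # length of the nesting period
fate = np.linalg.matrix_power(P, days)[0]    # fate distribution of a new nest
print(f"P(predation) = {fate[1]:.3f}, P(other failure) = {fate[2]:.3f}, "
      f"P(fledged) = {fate[3]:.3f}")
```

Absorbing states give the multiple failure classes directly, which is the first shortcoming of traditional nest-success methods that the abstract mentions.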
Do changes in connectivity explain desertification?
USDA-ARS?s Scientific Manuscript database
Desertification, broad-scale land degradation in drylands, is a major environmental hazard facing inhabitants of the world’s deserts as well as an important component of global change. There is no unifying framework that simply and effectively explains different forms of desertification. Here we arg...
Cho, Kwang-Hyun; Choo, Sang-Mok; Wellstead, Peter; Wolkenhauer, Olaf
2005-08-15
We propose a unified framework for the identification of functional interaction structures of biomolecular networks in a way that leads to a new experimental design procedure. In developing our approach, we have built upon previous work. Thus we begin by pointing out some of the restrictions associated with existing structure identification methods and point out how these restrictions may be eased. In particular, existing methods use specific forms of experimental algebraic equations with which to identify the functional interaction structure of a biomolecular network. In our work, we employ an extended form of these experimental algebraic equations which, while retaining their merits, also overcomes some of their disadvantages. Experimental data are required in order to estimate the coefficients of the experimental algebraic equation set associated with the structure identification task. However, experimentalists are rarely provided with guidance on which parameters to perturb and to what extent to perturb them. When a model of network dynamics is required, there is also the vexed question of sample-rate and sample-time selection to be resolved. Supplying some answers to these questions is the main motivation of this paper. The approach is based on stationary and/or temporal data obtained from parameter perturbations, and unifies the previous approaches of Kholodenko et al. (PNAS 99 (2002) 12841-12846) and Sontag et al. (Bioinformatics 20 (2004) 1877-1886). By way of demonstration, we apply our unified approach to a network model which cannot be properly identified by existing methods. Finally, we propose an experiment design methodology, which is not limited by the amount of parameter perturbations, and illustrate its use with an in numero example.
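One classical ingredient the paper builds upon (Kholodenko et al., 2002) can be sketched numerically: from steady-state global responses to perturbations of unknown strength, the local interaction map is recovered by inverting and row-normalizing the response matrix. The toy network below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth local response matrix of a 3-module toy network (diagonal -1)
r_true = np.array([
    [-1.0,  0.0,  0.8],
    [ 0.5, -1.0,  0.0],
    [ 0.0,  0.7, -1.0],
])

# Each module is perturbed by one parameter of unknown strength, so the
# measured global response is R = -r^{-1} D with D an unknown diagonal.
D = np.diag(rng.uniform(0.5, 1.5, 3))
R = -np.linalg.inv(r_true) @ D

# Invert the relation: normalize the rows of R^{-1} by its diagonal entries.
Rinv = np.linalg.inv(R)
r_est = -Rinv / np.diag(Rinv)[:, None]
print(np.round(r_est, 3))        # matches r_true; the unknown D drops out
```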
Wolfrum, Ed (ORCID:0000000273618931); Knoshug, Eric (ORCID:000000025709914X); Laurens, Lieve (ORCID:0000000349303267); Harmon, Valerie; Dempster, Thomas (ORCID:000000029550488X); McGowan, John (ORCID:0000000266920518); Rosov, Theresa; Cardello, David; Arrowsmith, Sarah; Kempkes, Sarah; Bautista, Maria; Lundquist, Tryg; Crowe, Brandon; Murawsky, Garrett; Nicolai, Eric; Rowe, Egan; Knurek, Emily; Javar, Reyna; Saracco Alvarez, Marcela; Schlosser, Steve; Riddle, Mary; Withstandley, Chris; Chen, Yongsheng; Van Ginkel, Steven; Igou, Thomas; Xu, Chunyan; Hu, Zixuan
2017-10-20
ATP3 Unified Field Study Data. The Algae Testbed Public-Private Partnership (ATP3) was established with the goal of investigating open pond algae cultivation across different geographic, climatic, seasonal, and operational conditions while setting the benchmark for quality data collection, analysis, and dissemination. Identical algae cultivation systems and data analysis methodologies were established at testbed sites across the continental United States and Hawaii. Within this framework, the Unified Field Studies (UFS) were designed to characterize the cultivation of different algal strains during all 4 seasons across this testbed network. The dataset presented here is the complete, curated climatic, cultivation, harvest, and biomass composition data for each season at each site. These data enable others to do in-depth cultivation, harvest, techno-economic, life cycle, resource, and predictive growth modeling analysis, as well as develop crop protection strategies for the nascent algae industry. NREL Subaward Number: DE-AC36-08-GO28308
A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.
Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D
2014-02-01
In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose from studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the use of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of color moment invariants.
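A minimal sketch of moment computation with one orthogonal-polynomial kernel (Legendre) on a single-channel image; the quaternion color moments of the paper replace the scalar pixel value with a quaternion encoding the RGB triple, but the kernel machinery is the same. The normalization follows the standard continuous Legendre-moment definition.

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

def legendre_moment(img, p, q):
    """Order-(p, q) continuous Legendre moment of a grayscale image."""
    h, w = img.shape
    x = np.linspace(-1, 1, w)             # map the pixel grid to [-1, 1]
    y = np.linspace(-1, 1, h)
    Pp = Legendre.basis(p)(x)             # P_p evaluated along columns
    Pq = Legendre.basis(q)(y)             # P_q evaluated along rows
    norm = (2 * p + 1) * (2 * q + 1) / 4  # orthonormalization factor
    return norm * (Pq[:, None] * Pp[None, :] * img).sum() * (2 / w) * (2 / h)

img = np.random.default_rng(0).random((64, 64))
print(legendre_moment(img, 2, 3))
```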
NASA Astrophysics Data System (ADS)
Codello, Alessandro; Jain, Rajeev Kumar
2018-05-01
We present a unified evolution of the universe from very early times until the present epoch by including both the leading local correction R^2 and the leading non-local term R(1/□^2)R in the classical gravitational action. We find that the inflationary phase driven by the R^2 term gracefully exits into a transitory regime characterized by coherent oscillations of the Hubble parameter. The universe then naturally enters a radiation-dominated epoch followed by a matter-dominated era. At sufficiently late times after radiation-matter equality, the non-local term starts to dominate, inducing an accelerated expansion of the universe at the present epoch. We further exhibit the fact that both the leading local and non-local terms can be obtained within the covariant effective field theory of gravity. This scenario thus provides a unified picture of inflation and dark energy in a single framework by means of a purely gravitational action, without the usual need for a scalar field.
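Schematically, the action under consideration combines the two corrections as follows, with the coefficients left generic (the paper ties them to the inflationary and dark-energy scales, respectively):

```latex
S = \frac{M_{\mathrm{P}}^{2}}{2}\int d^{4}x \,\sqrt{-g}\,
    \left[\, R \;+\; c_{1}\, R^{2} \;+\; c_{2}\, R\, \frac{1}{\Box^{2}}\, R \,\right]
```

The $R^2$ term dominates at large curvature (early times) and the non-local term at small curvature (late times), which is what yields the unified history.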
Multilayer network of language: A unified framework for structural analysis of linguistic subsystems
NASA Astrophysics Data System (ADS)
Martinčić-Ipšić, Sanda; Margan, Domagoj; Meštrović, Ana
2016-09-01
Recently, the focus of complex networks research has shifted from the analysis of isolated properties of a system toward a more realistic modeling of multiple phenomena: multilayer networks. Motivated by the success of the multilayer approach in social, transport or trade systems, we introduce multilayer networks for language. The multilayer network of language is a unified framework for modeling linguistic subsystems and their structural properties, enabling the exploration of their mutual interactions. Various aspects of natural language systems can be represented as complex networks, whose vertices depict linguistic units, while links model their relations. The multilayer network of language is defined by three aspects: the network construction principle, the linguistic subsystem and the language of interest. More precisely, we construct word-level (syntax and co-occurrence) and subword-level (syllables and graphemes) network layers from four variations of the original text (in the modeled language). The analysis and comparison of layers at the word and subword levels are employed in order to determine the mechanism of the structural influences between linguistic units and subsystems. The obtained results suggest that there are substantial differences between the network structures of different language subsystems, which are hidden during the exploration of an isolated layer. The word-level layers share structural properties regardless of the language (e.g., Croatian or English), while the syllabic subword level expresses more language-dependent structural properties. The preserved weighted overlap quantifies the similarity of word-level layers in weighted and directed networks. Moreover, the analysis of motifs reveals a close topological structure of the syntactic and syllabic layers for both languages. The findings corroborate that the multilayer network framework is a powerful, consistent and systematic approach for modeling several linguistic subsystems simultaneously and hence for providing a more unified view of language.
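A toy construction of two of the layers, to make the framework concrete: a word-level co-occurrence layer and a subword-level grapheme layer built from the same text, with a cross-layer composition map. The paper additionally builds syntax and syllable layers from full corpora; this sketch uses networkx and a single sentence.

```python
import networkx as nx

text = "the quick brown fox jumps over the lazy dog"
words = text.split()

# Word-level co-occurrence layer: link adjacent words
word_layer = nx.DiGraph()
word_layer.add_edges_from(zip(words, words[1:]))

# Subword-level grapheme layer: link adjacent characters within each word
grapheme_layer = nx.DiGraph()
for w in set(words):
    grapheme_layer.add_edges_from(zip(w, w[1:]))

# Cross-layer structure: each word node is composed of its grapheme nodes
composition = {w: list(w) for w in set(words)}

print(word_layer.number_of_nodes(), grapheme_layer.number_of_nodes())
```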
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example. PMID:23515190
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example.
2012-01-01
Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. Conclusions The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development. PMID:22530986
NASA Technical Reports Server (NTRS)
1978-01-01
A unified framework for comparing intercity passenger and freight transportation systems is presented. Composite measures for cost, service/demand, energy, and environmental impact were determined. A set of 14 basic measures was articulated to form the foundation for computing the composite measures. A parameter dependency diagram, constructed to explicitly interrelate the composite and basic measures, is discussed. Ground rules and methodology for developing the values of the basic measures are provided, and the use of the framework with existing cost and service data is illustrated for various freight systems.
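A minimal sketch of how basic measures might roll up into composite measures along a parameter dependency diagram; the measure names and weights below are invented for illustration, not the report's actual set of 14.

```python
# Hypothetical basic measures for one freight system (invented values/units)
BASIC = {
    "cost_per_ton_mile": 0.08, "terminal_cost": 0.02,      # cost-related
    "trip_time_h": 36.0, "reliability": 0.9,               # service-related
    "btu_per_ton_mile": 550.0,                             # energy-related
}

# Composite measures as weighted combinations of basic measures
COMPOSITES = {
    "cost": (("cost_per_ton_mile", 1.0), ("terminal_cost", 1.0)),
    "service": (("trip_time_h", -0.01), ("reliability", 1.0)),
    "energy": (("btu_per_ton_mile", -0.001),),
}

def composite(name):
    """Weighted sum of basic measures, mirroring a parameter dependency diagram."""
    return sum(w * BASIC[m] for m, w in COMPOSITES[name])

for c in COMPOSITES:
    print(c, round(composite(c), 3))
```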
NASA Astrophysics Data System (ADS)
Perfors, Amy
2014-09-01
There is much to approve of in this provocative and interesting paper. I strongly agree with many of its points, especially that dichotomies like nature/nurture are actively detrimental to the field. I also appreciate the idea that cognitive scientists should take the "biological wetware" of the cell (rather than the network) more seriously.
Chimaera simulation of complex states of flowing matter.
Succi, S
2016-11-13
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro-meso-micro levels through suitable 'mutations' of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
A unified framework for gesture recognition and spatiotemporal gesture segmentation.
Alon, Jonathan; Athitsos, Vassilis; Yuan, Quan; Sclaroff, Stan
2009-09-01
Within the context of hand gesture recognition, spatiotemporal gesture segmentation is the task of determining, in a video sequence, where the gesturing hand is located and when the gesture starts and ends. Existing gesture recognition methods typically assume either known spatial segmentation or known temporal segmentation, or both. This paper introduces a unified framework for simultaneously performing spatial segmentation, temporal segmentation, and recognition. In the proposed framework, information flows both bottom-up and top-down. A gesture can be recognized even when the hand location is highly ambiguous and when information about when the gesture begins and ends is unavailable. Thus, the method can be applied to continuous image streams where gestures are performed in front of moving, cluttered backgrounds. The proposed method consists of three novel contributions: a spatiotemporal matching algorithm that can accommodate multiple candidate hand detections in every frame, a classifier-based pruning framework that enables accurate and early rejection of poor matches to gesture models, and a subgesture reasoning algorithm that learns which gesture models can falsely match parts of other longer gestures. The performance of the approach is evaluated on two challenging applications: recognition of hand-signed digits gestured by users wearing short-sleeved shirts, in front of a cluttered background, and retrieval of occurrences of signs of interest in a video database containing continuous, unsegmented signing in American Sign Language (ASL).
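A toy version of the core dynamic-programming idea, in which several candidate hand detections per frame are carried along and the matcher chooses among them; the gesture model, candidate generation, pruning classifier, and subgesture reasoning of the actual system are all omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

model = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])    # gesture model states
T, C = 10, 3                                              # frames, candidates per frame
cands = rng.normal(size=(T, C, 2)) + 1.0                  # candidate hand detections

INF = np.inf
D = np.full((T, C, len(model)), INF)                      # DP cost table
for c in range(C):                                        # init: first model state
    D[0, c, 0] = np.linalg.norm(cands[0, c] - model[0])

for t in range(1, T):
    for c in range(C):
        for s in range(len(model)):
            prev = min(D[t - 1, :, s].min(),              # stay in state s
                       D[t - 1, :, s - 1].min() if s > 0 else INF)  # advance
            D[t, c, s] = prev + np.linalg.norm(cands[t, c] - model[s])

score = D[-1, :, -1].min()    # best full match ending in the last model state
print(f"matching cost: {score:.2f}")
```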
Kernel-imbedded Gaussian processes for disease classification using microarray gene expression data
Zhao, Xin; Cheung, Leo Wang-Kit
2007-01-01
Background Designing appropriate machine learning methods for identifying genes that have a significant discriminating power for disease outcomes has become more and more important for our understanding of diseases at the genomic level. Although many machine learning methods have been developed and applied to the area of microarray gene expression data analysis, the majority of them are based on linear models, which however are not necessarily appropriate for the underlying connection between the target disease and its associated explanatory genes. Linear-model-based methods also tend to bring in false-positive significant features more easily. Furthermore, linear-model-based algorithms often involve calculating the inverse of a matrix that is possibly singular when the number of potentially important genes is relatively large. This leads to problems of numerical instability. To overcome these limitations, a few non-linear methods have recently been introduced to the area. Many of the existing non-linear methods have a couple of critical problems, the model selection problem and the model parameter tuning problem, that remain unsolved or even untouched. In general, a unified framework that allows model parameters of both linear and non-linear models to be easily tuned is always preferred in real-world applications. Kernel-induced learning methods form a class of approaches that show promising potential to achieve this goal. Results A hierarchical statistical model named kernel-imbedded Gaussian process (KIGP) is developed under a unified Bayesian framework for binary disease classification problems using microarray gene expression data. In particular, based on a probit regression setting, an adaptive algorithm with a cascading structure is designed to find the appropriate kernel, to discover the potentially significant genes, and to make the optimal class prediction accordingly. A Gibbs sampler is built as the core of the algorithm to make Bayesian inferences. Simulation studies showed that, even without any knowledge of the underlying generative model, the KIGP performed very close to the theoretical Bayesian bound not only in the case with a linear Bayesian classifier but also in the case with a very non-linear Bayesian classifier. This sheds light on its broader usability for microarray data analysis problems, especially those for which linear methods work awkwardly. The KIGP was also applied to four published microarray datasets, and the results showed that the KIGP performed better than or at least as well as any of the referred state-of-the-art methods did in all of these cases. Conclusion Mathematically built on the kernel-induced feature space concept under a Bayesian framework, the KIGP method presented in this paper provides a unified machine learning approach to explore both the linear and the possibly non-linear underlying relationship between the target features of a given binary disease classification problem and the related explanatory gene expression data. More importantly, it incorporates the model parameter tuning into the framework. The model selection problem is addressed in the form of selecting a proper kernel type. The KIGP method also gives Bayesian probabilistic predictions for disease classification. These properties and features are beneficial to most real-world applications. The algorithm is naturally robust in numerical computation. The simulation studies and the published data studies demonstrated that the proposed KIGP performs satisfactorily and consistently.
PMID:17328811
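For orientation, the sketch below runs a kernel-based GP classifier on synthetic high-dimensional data and compares a linear and a non-linear kernel by cross-validation. This stands in for KIGP's kernel-type selection only loosely: the actual KIGP is a hierarchical Bayesian model fit with a Gibbs sampler, not sklearn's marginal-likelihood optimizer.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF, DotProduct
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for expression data: many features, few informative genes
X, y = make_classification(n_samples=100, n_features=50, n_informative=5,
                           random_state=0)

for name, kernel in [("linear", DotProduct()), ("RBF", 1.0 * RBF(1.0))]:
    gpc = GaussianProcessClassifier(kernel=kernel, random_state=0)
    acc = cross_val_score(gpc, X, y, cv=5).mean()
    print(f"{name} kernel: CV accuracy = {acc:.3f}")
```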
Pinchevsky, Gillian M
2016-05-22
This study fills a gap in the literature by exploring the utility of contemporary courtroom theoretical frameworks (uncertainty avoidance, causal attribution, and focal concerns) for explaining decision-making in specialized domestic violence courts. Using data from two specialized domestic violence courts, this study explores the predictors of prosecutorial and judicial decision-making and the extent to which these factors are congruent with theoretical frameworks often used in studies of court processing. Findings suggest that these theoretical frameworks only partially help explain decision-making in the courts under study. A discussion of the findings and implications for future research is provided. © The Author(s) 2016.
The Philosophy of Information as an Underlying and Unifying Theory of Information Science
ERIC Educational Resources Information Center
Tomic, Taeda
2010-01-01
Introduction: Philosophical analyses of theoretical principles underlying its sub-domains reveal the philosophy of information as the underlying meta-theory of information science. Method: Conceptual research on the knowledge sub-domains in information science and philosophy and analysis of their mutual connection. Analysis: Similarities between…
Bayesian Estimation of Multi-Unidimensional Graded Response IRT Models
ERIC Educational Resources Information Center
Kuo, Tzu-Chun
2015-01-01
Item response theory (IRT) has gained an increasing popularity in large-scale educational and psychological testing situations because of its theoretical advantages over classical test theory. Unidimensional graded response models (GRMs) are useful when polytomous response items are designed to measure a unified latent trait. They are limited in…
Chiu, Chia-Yi; Köhn, Hans-Friedrich
2016-09-01
The asymptotic classification theory of cognitive diagnosis (ACTCD) provided the theoretical foundation for using clustering methods that do not rely on a parametric statistical model for assigning examinees to proficiency classes. Like general diagnostic classification models, clustering methods can be useful in situations where the true diagnostic classification model (DCM) underlying the data is unknown and possibly misspecified, or the items of a test conform to a mix of multiple DCMs. Clustering methods can also be an option when fitting advanced and complex DCMs encounters computational difficulties. These can range from the use of excessive CPU times to plain computational infeasibility. However, the propositions of the ACTCD have only been proven for the Deterministic Input Noisy Output "AND" gate (DINA) model and the Deterministic Input Noisy Output "OR" gate (DINO) model. For other DCMs, there does not exist a theoretical justification to use clustering for assigning examinees to proficiency classes. But if clustering is to be used legitimately, then the ACTCD must cover a larger number of DCMs than just the DINA model and the DINO model. Thus, the purpose of this article is to prove the theoretical propositions of the ACTCD for two other important DCMs, the Reduced Reparameterized Unified Model and the General Diagnostic Model.
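For concreteness, the DINA model mentioned above can be written in a few lines: the ideal response eta_ij equals 1 exactly when examinee i possesses every attribute that item j requires (per the Q-matrix), and slip/guess parameters perturb that ideal response. A minimal simulation sketch follows; the attribute patterns, Q-matrix, and parameter values are arbitrary illustrative assumptions.

    import numpy as np

    def dina_response_probs(alpha, Q, slip, guess):
        """P(X_ij = 1) = (1 - s_j)^eta_ij * g_j^(1 - eta_ij),
        where eta_ij = 1 iff examinee i has all attributes required by item j."""
        eta = np.all(alpha[:, None, :] >= Q[None, :, :], axis=2)   # ideal responses
        return np.where(eta, 1.0 - slip, guess)

    rng = np.random.default_rng(1)
    alpha = rng.integers(0, 2, size=(500, 3))        # 500 examinees, 3 binary attributes
    Q = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],   # single-attribute items
                  [1, 1, 0], [1, 1, 1]])             # conjunctive items
    slip = np.full(Q.shape[0], 0.1)                  # assumed slip probabilities
    guess = np.full(Q.shape[0], 0.2)                 # assumed guessing probabilities
    X = (rng.random((500, Q.shape[0]))
         < dina_response_probs(alpha, Q, slip, guess)).astype(int)

Under the ACTCD, clustering response data such as X assigns examinees to proficiency classes without fitting a parametric DCM; the article's contribution is proving that this remains theoretically justified when the data follow the Reduced Reparameterized Unified Model or the General Diagnostic Model rather than the DINA or DINO model.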
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
The semiotics of medical image segmentation.
Baxter, John S H; Gibson, Eli; Eagleson, Roy; Peters, Terry M
2018-02-01
As the interaction between clinicians and computational processes increases in complexity, more nuanced mechanisms are required to describe how their communication is mediated. Medical image segmentation in particular affords a large number of distinct loci for interaction which can act on a deep, knowledge-driven level which complicates the naive interpretation of the computer as a symbol processing machine. Using the perspective of the computer as dialogue partner, we can motivate the semiotic understanding of medical image segmentation. Taking advantage of Peircean semiotic traditions and new philosophical inquiry into the structure and quality of metaphors, we can construct a unified framework for the interpretation of medical image segmentation as a sign exchange in which each sign acts as an interface metaphor. This allows for a notion of finite semiosis, described through a schematic medium, that can rigorously describe how clinicians and computers interpret the signs mediating their interaction. Altogether, this framework provides a unified approach to the understanding and development of medical image segmentation interfaces. Copyright © 2017 Elsevier B.V. All rights reserved.
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Furthermore, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as the data representation thus enables a unique opportunity for descriptive and predictive modeling to inform each other.
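A hedged sketch of the pipeline this describes, on synthetic data: correlate grid-cell time series, keep edges above a threshold, extract communities, and use each community's mean series as a candidate climate index. The threshold, the community-detection method, and the index definition are illustrative assumptions, not the authors' exact choices.

    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    rng = np.random.default_rng(0)
    series = rng.standard_normal((100, 240))   # 100 grid cells x 240 months (synthetic)
    corr = np.corrcoef(series)                 # pairwise correlations between cells

    G = nx.Graph()
    G.add_nodes_from(range(corr.shape[0]))
    rows, cols = np.where(np.abs(corr) > 0.5)  # assumed correlation threshold
    G.add_edges_from((i, j) for i, j in zip(rows, cols) if i < j)

    communities = greedy_modularity_communities(G)
    # each cluster's mean series serves as a candidate climate index
    indices = [series[list(c)].mean(axis=0) for c in communities]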
Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve
2017-12-01
In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure, so that the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag; (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification; and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of the M²IL. Experiments and analyses in many practical applications prove the effectiveness of the M²IL.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is that if intrinsic robustness + environmental robustness ≤ network robustness, then phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results for robust ecological networks are similar to those for robust gene regulatory networks and evolutionary networks, even though they have different spatio-temporal scales. PMID:23515112
Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data
Yang, Yan; Simpson, Douglas
2010-01-01
Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
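As one concrete member of this model class, here is a minimal EM fit of a zero-inflated Poisson (ZIP) model, the simplest count model with point inflation at the lower boundary. It is a sketch under standard textbook assumptions (no covariates, independent observations) and omits the article's quasi-Newton alternative, standard-error machinery, and GEE extension.

    import numpy as np

    def fit_zip_em(y, n_iter=500, tol=1e-10):
        """EM for the ZIP model: P(Y=0) = pi + (1-pi) e^{-lam},
        P(Y=k) = (1-pi) * Poisson(k; lam) for k >= 1."""
        y = np.asarray(y, dtype=float)
        pi, lam = 0.5, max(y.mean(), 1e-6)            # crude starting values
        for _ in range(n_iter):
            # E-step: posterior probability that each observed zero is structural
            w = np.where(y == 0, pi / (pi + (1.0 - pi) * np.exp(-lam)), 0.0)
            # M-step: closed-form updates for the mixing weight and Poisson mean
            pi_new = w.mean()
            lam_new = ((1.0 - w) * y).sum() / (1.0 - w).sum()
            done = abs(pi_new - pi) + abs(lam_new - lam) < tol
            pi, lam = pi_new, lam_new
            if done:
                break
        return pi, lam

For instance, feeding it a sample built from 30% structural zeros plus Poisson(2.5) counts should recover pi near 0.3 and lam near 2.5.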
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is applied here to the problem of anomaly detection for a video annotation system.
A unifying retinex model based on non-local differential operators
NASA Astrophysics Data System (ADS)
Zosso, Dominique; Tran, Giang; Osher, Stanley
2013-02-01
In this paper, we present a unifying framework for retinex that is able to reproduce many of the existing retinex implementations within a single model. The fundamental assumption, as shared with many retinex models, is that the observed image is a multiplication between the illumination and the true underlying reflectance of the object. Starting from Morel's 2010 PDE model for retinex, where illumination is supposed to vary smoothly and where the reflectance is thus recovered from a hard-thresholded Laplacian of the observed image in a Poisson equation, we define our retinex model in two similar but more general steps. First, we look for a filtered gradient that is the solution of an optimization problem consisting of two terms: the first term is a sparsity prior of the reflectance, such as the TV or H1 norm, while the second term is a quadratic fidelity prior of the reflectance gradient with respect to the observed image gradients. In a second step, since this filtered gradient is almost certainly not a consistent image gradient, we look for a reflectance whose actual gradient comes as close to it as possible. Beyond unifying existing models, we are able to derive entirely novel retinex formulations by using more interesting non-local versions of the sparsity and fidelity priors. Hence we define within a single framework new retinex instances particularly suited for texture-preserving shadow removal, cartoon-texture decomposition, and color and hyperspectral image enhancement.
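In symbols, one plausible instantiation of the two steps reads as follows, with observed log-image f, filtered gradient field d, reflectance R, and free weights alpha and lambda (the exact priors and weights vary across the retinex instances the paper unifies):

    d^{*} = \arg\min_{d} \; \alpha \, \lVert d \rVert_{1}
            + \frac{\lambda}{2} \, \lVert d - \nabla f \rVert_{2}^{2},
    \qquad
    R^{*} = \arg\min_{R} \; \lVert \nabla R - d^{*} \rVert_{2}^{2} .

Replacing the l1 sparsity term by an H1 norm, or replacing the gradient by a non-local differential operator, yields the novel non-local formulations described above.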
Fallah, Parisa Nicole; Bernstein, Mark
2017-09-07
Access to adequate surgical care is limited globally, particularly in low- and middle-income countries (LMICs). To address this issue, surgeons are becoming increasingly involved in international surgical teaching collaborations (ISTCs), which include educational partnerships between surgical teams in high-income countries and those in LMICs. The purpose of this study is to determine a framework for unifying, systematizing, and improving the quality of ISTCs so that they can better address the global surgical need. A convenience sample of 68 surgeons, anesthesiologists, physicians, residents, nurses, academics, and administrators from the U.S., Canada, and Norway was used for the study. Participants all had some involvement in ISTCs and came from multiple specialties and institutions. Qualitative methodology was used, and participants were interviewed using a pre-determined set of open-ended questions. Data was gathered over two months either in-person, over the phone, or on Skype. Data was evaluated using thematic content analysis. To organize and systematize ISTCs, participants reported a need for a centralized/systematized process with designated leaders, a universal data bank of current efforts/progress, communication amongst involved parties, full-time administrative staff, dedicated funds, a scholarly approach, increased use of technology, and more research on needs and outcomes. By taking steps towards unifying and systematizing ISTCs, the quality of ISTCs can be improved. This could lead to an advancement in efforts to increase access to surgical care worldwide.
Usharani, Dandamudi; Janardanan, Deepa; Li, Chunsen; Shaik, Sason
2013-02-19
Over the past decades metalloenzymes and their synthetic models have emerged as an area of increasing research interest. The metalloenzymes and their synthetic models oxidize organic molecules using oxometal complexes (OMCs), especially oxoiron(IV)-based ones. Theoretical studies have helped researchers to characterize the active species and to resolve mechanistic issues. This activity has generated massive amounts of data on the relationship between the reactivity of OMCs and the transition metal's identity, oxidation state, ligand sphere, and spin state. Theoretical studies have also produced information on transition state (TS) structures, reaction intermediates, barriers, and rate-equilibrium relationships. For example, the experimental-theoretical interplay has revealed that nonheme enzymes carry out H-abstraction from strong C-H bonds using high-spin (S = 2) oxoiron(IV) species with four unpaired electrons on the iron center. However, other reagents with higher spin states and more unpaired electrons on the metal are not as reactive. Still other reagents carry out these transformations using lower spin states with fewer unpaired electrons on the metal. The TS structures for these reactions exhibit structural selectivity depending on the reactive spin states. The barriers and thermodynamic driving forces of the reactions also depend on the spin state. H-Abstraction is preferred over the thermodynamically more favorable concerted insertion into C-H bonds. Currently, there is no unified theoretical framework that explains the totality of these fascinating trends. This Account aims to unify this rich chemistry and understand the role of unpaired electrons on chemical reactivity. We show that during an oxidative step the d-orbital block of the transition metal is enriched by one electron through proton-coupled electron transfer (PCET). That single electron elicits variable exchange interactions on the metal, which in turn depend critically on the number of unpaired electrons on the metal center. Thus, we introduce the exchange-enhanced reactivity (EER) principle, which predicts the preferred spin state during oxidation reactions, the dependence of the barrier on the number of unpaired electrons in the TS, and the dependence of the deformation energy of the reactants on the spin state. We complement EER with orbital-selection rules, which predict the structure of the preferred TS and provide a handy theory of bioinorganic oxidative reactions. These rules show how EER provides a Hund's Rule for chemical reactivity: EER controls the reactivity landscape for a great variety of transition-metal complexes and substrates. Among many reactivity patterns explained, EER rationalizes the abundance of high-spin oxoiron(IV) complexes in enzymes that carry out bond activation of the strongest bonds. The concepts used in this Account might also be applicable in other areas such as in f-block chemistry and excited-state reactivity of 4d and 5d OMCs.
Contextuality supplies the 'magic' for quantum computation.
Howard, Mark; Wallman, Joel; Veitch, Victor; Emerson, Joseph
2014-06-19
Quantum computers promise dramatic advantages over their classical counterparts, but the source of the power in quantum computing has remained elusive. Here we prove a remarkable equivalence between the onset of contextuality and the possibility of universal quantum computation via 'magic state' distillation, which is the leading model for experimentally realizing a fault-tolerant quantum computer. This is a conceptually satisfying link, because contextuality, which precludes a simple 'hidden variable' model of quantum mechanics, provides one of the fundamental characterizations of uniquely quantum phenomena. Furthermore, this connection suggests a unifying paradigm for the resources of quantum information: the non-locality of quantum theory is a particular kind of contextuality, and non-locality is already known to be a critical resource for achieving advantages with quantum communication. In addition to clarifying these fundamental issues, this work advances the resource framework for quantum computation, which has a number of practical applications, such as characterizing the efficiency and trade-offs between distinct theoretical and experimental schemes for achieving robust quantum computation, and putting bounds on the overhead cost for the classical simulation of quantum algorithms.
A Multiscale Model for Virus Capsid Dynamics
Chen, Changjun; Saxena, Rishu; Wei, Guo-Wei
2010-01-01
Viruses are infectious agents that can cause epidemics and pandemics. The understanding of virus formation, evolution, stability, and interaction with host cells is of great importance to the scientific community and public health. Typically, a virus complex in association with its aquatic environment poses a formidable challenge to theoretical description and prediction. In this work, we propose a differential geometry-based multiscale paradigm to model complex biomolecule systems. In our approach, the differential geometry theory of surfaces and geometric measure theory are employed as a natural means to couple the macroscopic continuum domain of the fluid mechanical description of the aquatic environment with the microscopic discrete domain of the atomistic description of the biomolecule. A multiscale action functional is constructed as a unified framework to derive the governing equations for the dynamics of different scales. We show that the classical Navier-Stokes equation for the fluid dynamics and Newton's equation for the molecular dynamics can be derived from the least action principle. These equations are coupled through the continuum-discrete interface whose dynamics is governed by potential-driven geometric flows. PMID:20224756
A theoretical and experimental study of neuromorphic atomic switch networks for reservoir computing.
Sillin, Henry O; Aguilera, Renato; Shieh, Hsien-Hang; Avizienis, Audrius V; Aono, Masakazu; Stieg, Adam Z; Gimzewski, James K
2013-09-27
Atomic switch networks (ASNs) have been shown to generate network level dynamics that resemble those observed in biological neural networks. To facilitate understanding and control of these behaviors, we developed a numerical model based on the synapse-like properties of individual atomic switches and the random nature of the network wiring. We validated the model against various experimental results highlighting the possibility to functionalize the network plasticity and the differences between an atomic switch in isolation and its behaviors in a network. The effects of changing connectivity density on the nonlinear dynamics were examined as characterized by higher harmonic generation in response to AC inputs. To demonstrate their utility for computation, we subjected the simulated network to training within the framework of reservoir computing and showed initial evidence of the ASN acting as a reservoir which may be optimized for specific tasks by adjusting the input gain. The work presented represents steps in a unified approach to experimentation and theory of complex systems to make ASNs a uniquely scalable platform for neuromorphic computing.
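The reservoir-computing training mentioned above typically amounts to driving a fixed nonlinear dynamical system with the input and fitting only a linear readout on its state trajectory. A minimal echo-state-style sketch follows; a random recurrent network stands in for the simulated ASN, and the sizes, spectral radius, input gain, and ridge penalty are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 200, 2000                                   # reservoir size, time steps
    u = np.sin(np.linspace(0.0, 40.0 * np.pi, T))      # example driving signal
    target = u ** 3                                    # example task: nonlinear map of input

    W = rng.standard_normal((N, N)) / np.sqrt(N)       # fixed random recurrent weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # scale spectral radius below 1
    W_in = rng.standard_normal(N)                      # fixed random input weights

    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):                                 # drive the reservoir, record states
        x = np.tanh(W @ x + W_in * u[t])
        states[t] = x

    ridge = 1e-6                                       # only the linear readout is trained
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N), states.T @ target)
    prediction = states @ W_out

Only W_out is trained; the input gain (here the scale of W_in) is the kind of knob the authors adjust to optimize the reservoir for specific tasks.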
Electromagnetic mixing laws: A supersymmetric approach
NASA Astrophysics Data System (ADS)
Niez, J. J.
2010-02-01
In this article we address the old problem of finding the effective dielectric constant of materials described either by a local random dielectric constant, or by a set of non-overlapping spherical inclusions randomly dispersed in a host. We use a unified theoretical framework in which all the most important Electromagnetic Mixing Laws (EML) can be recovered as the first iterative step of a family of results, thus opening the way to future improvements through refinements of the approximation schemes. When the material is described by a set of immersed inclusions characterized by their spatial correlation functions, we exhibit an EML which, featuring a minimal approximation scheme, does not come from the multiple scattering paradigm. It consists of the pure Hori-Yonezawa formula, corrected by a power series in the inclusion density. The coefficients of the latter, which are given as sums of standard diagrams, are recast into electromagnetic quantities whose calculation is numerically tractable thanks to codes available on the web. The methods used and developed in this work are generic and can be used in a large variety of areas ranging from mechanics to thermodynamics.
Simple effective rule to estimate the jamming packing fraction of polydisperse hard spheres.
Santos, Andrés; Yuste, Santos B; López de Haro, Mariano; Odriozola, Gerardo; Ogarko, Vitaliy
2014-04-01
A recent proposal in which the equation of state of a polydisperse hard-sphere mixture is mapped onto that of the one-component fluid is extrapolated beyond the freezing point to estimate the jamming packing fraction ϕ_J of the polydisperse system as a simple function of M_1M_3/M_2^2, where M_k is the k-th moment of the size distribution. An analysis of experimental and simulation data of ϕ_J for a large number of different mixtures shows a remarkable general agreement with the theoretical estimate. To give extra support to the procedure, simulation data for seventeen mixtures in the high-density region are used to infer the equation of state of the pure hard-sphere system in the metastable region. An excellent collapse of the inferred curves up to the glass transition and a significant narrowing of the different out-of-equilibrium glass branches all the way to jamming are observed. Thus, the present approach provides an extremely simple criterion to unify, within a common framework, data coming from very different polydisperse hard-sphere mixtures.
Unfavorable Individuals in Social Gaming Networks.
Zhang, Yichao; Chen, Guanrong; Guan, Jihong; Zhang, Zhongzhi; Zhou, Shuigeng
2015-12-09
In social gaming networks, the current research focus has been on the origin of widespread reciprocal behaviors when individuals play non-cooperative games. In this paper, we investigate the topological properties of unfavorable individuals in evolutionary games. The unfavorable individuals are defined as the individuals gaining the lowest average payoff in a round of the game. Since the average payoff is normally considered as a measure of fitness, the unfavorable individuals are very likely to be eliminated or change their strategy updating rules from a Darwinian perspective. Considering that humans can hardly adopt a unified strategy to play with their neighbors, we propose a divide-and-conquer game model, where individuals can interact with their neighbors in the network with appropriate strategies. We test and compare a series of highly rational strategy updating rules. In the tested scenarios, our analytical and simulation results surprisingly reveal that the less-connected individuals in degree-heterogeneous networks are more likely to become the unfavorable individuals. Our finding suggests that the connectivity of individuals as a social capital fundamentally changes the gaming environment. Our model, therefore, provides a theoretical framework for further understanding the social gaming networks.
NASA Astrophysics Data System (ADS)
Le, Jia-Liang; Bažant, Zdeněk P.
2011-07-01
This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant-amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of a subcritical crack. This law is then used to relate the probability distribution of the critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, a physical explanation is provided for the experimentally observed systematic deviations of the lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.
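For reference, the two empirical laws invoked here take the forms below, where a is the subcritical crack length, N the cycle number, Delta K the stress-intensity-factor amplitude, and N_f the number of cycles to failure; C, m, A, and b are empirical constants (a standard rendering of both laws, not quoted from the paper):

    \frac{\mathrm{d}a}{\mathrm{d}N} = C \,(\Delta K)^{m},
    \qquad
    \Delta\sigma = A \, N_f^{-b} .

On this form, the size dependence predicted for quasibrittle structures enters through the constants of the S-N relation.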
Ocampo, Cesar
2004-05-01
The modeling, design, and optimization of finite burn maneuvers for a generalized trajectory design and optimization system is presented. A generalized trajectory design and optimization system is one that uses a single unified framework to facilitate the modeling and optimization of complex spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The modeling and optimization issues associated with controlled engine burn maneuvers of finite thrust magnitude and duration are presented in the context of designing and optimizing a wide class of finite-thrust trajectories. Optimal control theory is used to examine the optimization of these maneuvers in arbitrary force fields that are generally position-, velocity-, mass-, and time-dependent. The associated numerical methods used to obtain these solutions involve either the solution of a system of nonlinear equations, an explicit parameter optimization method, or a hybrid method that combines aspects of both. The theoretical and numerical methods presented here have been implemented in Copernicus, a prototype trajectory design and optimization system under development at the University of Texas at Austin.
Generalized Centroid Estimators in Bioinformatics
Hamada, Michiaki; Kiryu, Hisanori; Iwasaki, Wataru; Asai, Kiyoshi
2011-01-01
In a number of estimation problems in bioinformatics, accuracy measures of the target problem are usually given, and it is important to design estimators that are suitable for those accuracy measures. However, there is often a discrepancy between an employed estimator and a given accuracy measure of the problem. In this study, we introduce a general class of efficient estimators for estimation problems on high-dimensional binary spaces, which represent many fundamental problems in bioinformatics. Theoretical analysis reveals that the proposed estimators generally fit commonly used accuracy measures (e.g., sensitivity, PPV, MCC, and F-score), can be computed efficiently in many cases, and cover a wide range of problems in bioinformatics from the viewpoint of the principle of maximum expected accuracy (MEA). It is also shown that some important algorithms in bioinformatics can be interpreted in a unified manner. The concept presented in this paper not only gives a useful framework for designing MEA-based estimators but is also highly extendable and sheds new light on many problems in bioinformatics. PMID:21365017
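The flavor of these estimators is easy to convey in the simplest setting: when the accuracy measure weighs predicted positives by a factor gamma, the gamma-centroid estimator marks exactly those binary variables whose marginal posterior probability exceeds 1/(gamma + 1). A sketch of that thresholding rule, assuming the marginals are available and ignoring the structural constraints (e.g., valid RNA secondary structures) that real bioinformatics problems add:

    import numpy as np

    def gamma_centroid(posterior_marginals, gamma=1.0):
        """Predict y_i = 1 iff P(y_i = 1 | data) > 1 / (gamma + 1).
        gamma = 1 is the ordinary centroid (0.5 threshold); larger gamma
        lowers the threshold, trading PPV for sensitivity."""
        p = np.asarray(posterior_marginals)
        return (p > 1.0 / (gamma + 1.0)).astype(int)

    p = np.array([0.9, 0.55, 0.4, 0.2])   # illustrative marginal probabilities
    print(gamma_centroid(p, gamma=1.0))   # [1 1 0 0]
    print(gamma_centroid(p, gamma=4.0))   # threshold 0.2 -> [1 1 1 0]

Raising gamma lowers the threshold, which is how such estimators are matched to the accuracy measure of interest.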
NASA Astrophysics Data System (ADS)
Schinckus, C.
2016-12-01
This article aimed at presenting the scattered econophysics literature as a unified and coherent field through a specific lens imported from the philosophy of science. More precisely, I used the methodology developed by Imre Lakatos to cover the methodological evolution of econophysics over the last two decades. In this perspective, three co-existing approaches have been identified: statistical econophysics, bottom-up agent-based econophysics, and top-down agent-based econophysics. Although the last is presented here as the latest step in the methodological evolution of econophysics, it is worth mentioning that this tradition is still very new. A quick look at the econophysics literature shows that the vast majority of works in this field deal with a strictly statistical approach or classical bottom-up agent-based modelling. In this context of diversification, the objective (and contribution) of this article is to emphasize the conceptual coherence of econophysics as a unique field of research. For this purpose, I used a theoretical framework from the philosophy of science to characterize how econophysics evolved by combining a methodological enrichment with the preservation of its core conceptual statements.
The phenotypic equilibrium of cancer cells: From average-level stability to path-wise convergence.
Niu, Yuanling; Wang, Yue; Zhou, Da
2015-12-07
The phenotypic equilibrium, i.e., a heterogeneous population of cancer cells tending to a fixed equilibrium of phenotypic proportions, has received much attention in cancer biology very recently. In the previous literature, theoretical models were used to predict the experimental phenomena of the phenotypic equilibrium, which were often explained by different concepts of stability of the models. Here we present a stochastic multi-phenotype branching model by integrating the conventional cellular hierarchy with phenotypic plasticity mechanisms of cancer cells. Based on our model, it is shown that: (i) our model can serve as a framework to unify the previous models for the phenotypic equilibrium, and it thereby harmonizes the different kinds of average-level stability proposed in these models; and (ii) path-wise convergence of our model provides a deeper understanding of the phenotypic equilibrium from a stochastic point of view. That is, the emergence of the phenotypic equilibrium is rooted in the stochastic nature of (almost) every sample path; the average-level stability merely follows from it by averaging over stochastic samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
Aftershock identification problem via the nearest-neighbor analysis for marked point processes
NASA Astrophysics Data System (ADS)
Gabrielov, A.; Zaliapin, I.; Wong, H.; Keilis-Borok, V.
2007-12-01
A century of observations of world seismicity has revealed a wide variety of clustering phenomena that unfold in the space-time-energy domain and provide the most reliable information about earthquake dynamics. However, there is neither a unifying theory nor a convenient statistical apparatus that would naturally account for the different types of seismic clustering. In this talk we present a theoretical framework for nearest-neighbor analysis of marked processes and obtain new results on the hierarchical approach to studying seismic clustering introduced by Baiesi and Paczuski (2004). Recall that under this approach one defines an asymmetric distance D in the space-time-energy domain such that the nearest-neighbor spanning graph with respect to D becomes a time-oriented tree. We demonstrate how this approach can be used to detect earthquake clustering. We apply our analysis to the observed seismicity of California and synthetic catalogs from the ETAS model and show that the earthquake clustering part is statistically different from the homogeneous part. This finding may serve as a basis for an objective aftershock identification procedure.
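A sketch of the Baiesi-Paczuski proximity at the heart of this approach: for an event j and any earlier event i, eta_ij = dt * r^{d_f} * 10^{-b m_i}, and j's nearest neighbor (parent) is the earlier event minimizing eta. The b-value and fractal dimension d_f below are typical assumed values, and the O(n^2) loop is written for clarity, not efficiency.

    import numpy as np

    def nearest_neighbor_parents(t, x, y, m, b=1.0, d_f=1.6):
        """Baiesi-Paczuski proximity eta = dt * r**d_f * 10**(-b * m_parent),
        computed over all earlier events; returns each event's parent (or -1)."""
        n = len(t)
        parents = np.full(n, -1)
        eta_min = np.full(n, np.inf)
        for j in range(n):
            for i in range(n):
                dt = t[j] - t[i]
                if dt <= 0.0:
                    continue                     # a parent must precede its child
                r = np.hypot(x[j] - x[i], y[j] - y[i])
                eta = dt * max(r, 1e-6) ** d_f * 10.0 ** (-b * m[i])
                if eta < eta_min[j]:
                    eta_min[j], parents[j] = eta, i
        return parents, eta_min

Thresholding the resulting eta_min values separates clustered events (anomalously small eta) from the homogeneous background, and the parent links form the time-oriented spanning tree described above.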
Missing Modality Transfer Learning via Latent Low-Rank Constraint.
Ding, Zhengming; Shao, Ming; Fu, Yun
2015-11-01
Transfer learning is usually exploited to leverage previously well-learned source domain for evaluating the unknown target domain; however, it may fail if no target data are available in the training stage. This problem arises when the data are multi-modal. For example, the target domain is in one modality, while the source domain is in another. To overcome this, we first borrow an auxiliary database with complete modalities, then consider knowledge transfer across databases and across modalities within databases simultaneously in a unified framework. The contributions are threefold: 1) a latent factor is introduced to uncover the underlying structure of the missing modality from the known data; 2) transfer learning in two directions allows the data alignment between both modalities and databases, giving rise to a very promising recovery; and 3) an efficient solution with theoretical guarantees to the proposed latent low-rank transfer learning algorithm. Comprehensive experiments on multi-modal knowledge transfer with missing target modality verify that our method can successfully inherit knowledge from both auxiliary database and source modality, and therefore significantly improve the recognition performance even when test modality is inaccessible in the training stage.
NASA Astrophysics Data System (ADS)
Benettin, Paolo; Soulsby, Chris; Birkel, Christian; Tetzlaff, Doerthe; Botter, Gianluca; Rinaldo, Andrea
2017-04-01
We use high resolution tracer data from the Bruntland Burn catchment (UK) to test theoretical approaches that integrate catchment-scale flow and transport processes in a unified framework centered on selective age sampling by streamflow and evapotranspiration fluxes. Hydrologic transport is here described through StorAge Selection (SAS) functions, parametrized as simple power laws. By representing the way in which catchment storage generates outflows composed by water of different ages, the main mechanism regulating the tracer composition of runoff is clearly identified. The calibrated numerical model provides simulations that convincingly reproduce complex measured signals of daily deuterium content in stream waters during wet and dry periods. The results for the catchment under consideration are consistent with other recent studies indicating a tendency for natural catchments to preferentially release younger available water. The model allows estimating transient water age and its related uncertainty, as well as the total catchment storage. This study shows that power-law SAS functions prove a powerful tool to explain catchment-scale transport processes that also has potential in less intensively monitored sites.
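In the notation common to the SAS literature (a plausible rendering, not quoted from the paper), a power-law SAS function expresses the cumulative fraction of discharge drawn from the age-ranked storage as

    \Omega_Q\left(P_S\right) = P_S^{\,k}, \qquad P_S \in [0, 1],

where P_S is the normalized age-ranked storage (youngest water at P_S = 0) and k is the calibrated exponent; k < 1 encodes the preferential release of younger water reported above, and k = 1 recovers uniform, age-blind sampling.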
Hilltop supernatural inflation and SUSY unified models
NASA Astrophysics Data System (ADS)
Kohri, Kazunori; Lim, C. S.; Lin, Chia-Min; Mimura, Yukihiro
2014-01-01
In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including a right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is n_s = 0.96, which fits very well with recent data from the PLANCK satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.
Mean Comparison: Manifest Variable versus Latent Variable
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Bentler, Peter M.
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
Unified Framework for Deriving Simultaneous Equation Algorithms for Water Distribution Networks
The known formulations for steady state hydraulics within looped water distribution networks are re-derived in terms of linear and non-linear transformations of the original set of partly linear and partly non-linear equations that express conservation of mass and energy. All of ...
Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension
ERIC Educational Resources Information Center
Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias
2013-01-01
We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…
NASA Astrophysics Data System (ADS)
Honing, Henkjan; Zuidema, Willem
2014-09-01
The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.
RosettaRemodel: A Generalized Framework for Flexible Backbone Protein Design
Huang, Po-Ssu; Ban, Yih-En Andrew; Richter, Florian; Andre, Ingemar; Vernon, Robert; Schief, William R.; Baker, David
2011-01-01
We describe RosettaRemodel, a generalized framework for flexible protein design that provides a versatile and convenient interface to the Rosetta modeling suite. RosettaRemodel employs a unified interface, called a blueprint, which allows detailed control over many aspects of flexible backbone protein design calculations. RosettaRemodel allows the construction and elaboration of customized protocols for a wide range of design problems ranging from loop insertion and deletion, disulfide engineering, domain assembly, loop remodeling, motif grafting, symmetrical units, to de novo structure modeling. PMID:21909381
Pricing foreign equity option with stochastic volatility
NASA Astrophysics Data System (ADS)
Sun, Qi; Xu, Weidong
2015-11-01
In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of a foreign equity option, and a closed-form pricing formula is obtained through the characteristic-function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on foreign equity option prices.
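The characteristic-function methodology referenced above prices a European call from the CF of the log-price alone, so any time-changed Lévy or stochastic-volatility model can be plugged in by swapping the CF. A minimal Gil-Pelaez-style sketch, sanity-checked with the Black-Scholes CF; the grid sizes and flat quadrature are simplifying assumptions, and the paper's foreign-equity CFs would replace phi_bs.

    import numpy as np

    def call_price_from_cf(phi, S0, K, r, T, q=0.0, u_max=200.0, n=20000):
        """European call via Gil-Pelaez inversion: C = S0 e^{-qT} P1 - K e^{-rT} P2,
        where phi is the characteristic function of ln S_T under the pricing measure."""
        u = np.linspace(1e-8, u_max, n)
        du = u[1] - u[0]
        twist = np.exp(-1j * u * np.log(K))
        P2 = 0.5 + du * np.sum(np.real(twist * phi(u) / (1j * u))) / np.pi
        P1 = 0.5 + du * np.sum(np.real(twist * phi(u - 1j) / (1j * u * phi(-1j)))) / np.pi
        return S0 * np.exp(-q * T) * P1 - K * np.exp(-r * T) * P2

    # sanity check against Black-Scholes: S0=K=100, r=5%, sigma=20%, T=1 -> about 10.45
    S0, K, r, q, sigma, T = 100.0, 100.0, 0.05, 0.0, 0.2, 1.0
    mu = np.log(S0) + (r - q - 0.5 * sigma ** 2) * T
    phi_bs = lambda u: np.exp(1j * u * mu - 0.5 * sigma ** 2 * T * u ** 2)
    print(call_price_from_cf(phi_bs, S0, K, r, T, q))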
Will the Meikirch Model, a New Framework for Health, Induce a Paradigm Shift in Healthcare?
Bircher, Johannes; Hahn, Eckhart G
2017-03-06
Over the past decades, scientific medicine has realized tremendous advances. Yet, it is felt that the quality, costs, and equity of medicine and public health have not improved correspondingly and, both inside and outside the USA, may even have changed for the worse. An initiative for improving this situation is value-based healthcare, in which value is defined as health outcomes relative to the cost of achieving them. Value-based healthcare was advocated in order to stimulate competition among healthcare providers and thereby reduce costs. The approach may be well grounded economically, but in the care of patients, "value" has ethical and philosophical connotations. The restriction of value to an economic meaning ignores the importance of health and, thus, leads to misunderstandings. We postulate that a new understanding of the nature of health is necessary. We present the Meikirch model, a conceptual framework for health and disease that views health as a complex adaptive system. We describe this model and analyze some important consequences of its application to healthcare. The resources each person needs to meet the demands of life are both biological and personal, and both function together. While scientific advances in healthcare are hailed, these advances focus mainly on the biologically given potential (BGP) and tend to neglect the personally acquired potential (PAP) of an individual person. Personal growth to improve the PAP strongly contributes to meeting the demands of life. Therefore, in individual and public health care, personal growth deserves as much attention as the BGP. The conceptual framework of the Meikirch model supports a unified understanding of healthcare and serves to develop common goals, thereby rendering interprofessional and intersectoral cooperation more successful. The Meikirch model can be used as an effective tool to stimulate health literacy and improve health-supporting behavior. If individuals and groups of people involved in healthcare interact based on the model, mutual understanding of and adherence to treatments and preventive measures will improve. In healthcare, the Meikirch model also makes it plain that neither pay-for-performance nor value-based payment is an adequate response to improve person-centered healthcare. The Meikirch model is not only a unifying theoretical framework for health and disease but also a scaffold for the practice of medicine and public health. It is fully in line with the theory and practice of evidence-based medicine, person-centered healthcare, and integrative medicine. The model offers opportunities to self-motivate people to improve their health-supporting behavior, thereby making preventive approaches and overall healthcare more effective. We believe that the Meikirch model could induce a paradigm shift in healthcare. The healthcare community is hereby invited to acquaint themselves with this model and to consider its potential ramifications.
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2014-05-01
To achieve rapid, simple, and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, which unifies and simplifies the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program monitors, removes, adds, starts, and stops any machine and/or its tasks when necessary. For every machine, exactly one dedicated zookeeper program starts the functions or tasks (stompShell programs) needed to execute the user's workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built in Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions, and future work of the proposed framework will be presented.
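A hedged sketch of the messaging pattern just described, using the third-party stomp.py client library (the listener signature follows stomp.py 8.x; the broker address, credentials, and topic-naming pattern are illustrative assumptions, not the framework's actual conventions):

    import json
    import stomp

    class JobListener(stomp.ConnectionListener):
        def on_message(self, frame):
            job = json.loads(frame.body)            # JSON message describes the work
            print("run task:", job["task"], "on", job["machine"])

    conn = stomp.Connection([("localhost", 61613)])  # ActiveMQ's default STOMP port
    conn.set_listener("", JobListener())
    conn.connect("admin", "admin", wait=True)

    # a stompShell-like consumer subscribes to its machine's job topic ...
    conn.subscribe(destination="/topic/cluster.node01.jobs", id=1, ack="auto")

    # ... while a zookeeper/watchdog-like producer publishes JSON-described jobs
    conn.send(destination="/topic/cluster.node01.jobs",
              body=json.dumps({"machine": "node01",
                               "task": "process_waveforms",
                               "args": ["run42.dat"]}),
              headers={"content-type": "application/json"})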
A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies
NASA Astrophysics Data System (ADS)
Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.
2018-06-01
We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition), and the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
NASA Astrophysics Data System (ADS)
Herrera, Jorge M.; Chapanoff, Miguel
2017-12-01
In the field of maritime archaeology, the use of maritime, coastal, riverine, and lacustrine spaces by past societies has been perceived from different and changing viewpoints. These perspectives have flourished in dynamic and varying ways in many countries, and under different theoretical constructs. If in the 1970s the subject was perhaps not recognized as a central research subject by much of our community, it is now not only accepted but has become a robust area of interest in maritime research. Two concepts that have had widespread application and influence have been accepted in Latin America, namely the regional maritime context and the maritorio. The points of contact between both are so intense that it is possible to speak of a single alternative with two possible names. In this article, their origins, applications, and theoretical influences are presented in a way that unifies these two concepts into a single approach (the maritorium), and it is examined how these ideas have been applied to research carried out in Mexico, Chile, and Uruguay. These applications are wide ranging, as they include the interconnected complexity between land and sea as used and inhabited by past societies. They have been applied in the study of ship traps, whole fleets, sites of maritime conflict and warfare, exploration activities, and ethnographic research. These will also be presented in light of other concepts of similar interest in the international sphere, such as the widespread concept of the Maritime Cultural Landscape, and in view of other theoretical frameworks coming from the wider sphere of the profession, such as Landscape Archaeology and Phenomenological Archaeology.
RANZCR Body Systems Framework of diagnostic imaging examination descriptors.
Pitman, Alexander G; Penlington, Lisa; Doromal, Darren; Slater, Gregory; Vukolova, Natalia
2014-08-01
A unified and logical system of descriptors for diagnostic imaging examinations and procedures is a desirable resource for radiology in Australia and New Zealand and is needed to support core activities of RANZCR. Existing descriptor systems available in Australia and New Zealand (including the Medicare DIST and the ACC Schedule) have significant limitations and are inappropriate for broader clinical application. An anatomically based grid was constructed, with anatomical structures arranged in rows and diagnostic imaging modalities arranged in columns (including nuclear medicine and positron emission tomography). The grid was segregated into five body systems. The cells at the intersection of an anatomical structure row and an imaging modality column were populated with short, formulaic descriptors of the applicable diagnostic imaging examinations. Clinically illogical or physically impossible combinations were 'greyed out'. Where the same examination applied to different anatomical structures, the descriptor was kept identical for the purposes of streamlining. The resulting Body Systems Framework of diagnostic imaging examination descriptors lists all the reasonably common diagnostic imaging examinations currently performed in Australia and New Zealand using a unified grid structure allowing navigation by both referrers and radiologists. The Framework has been placed on the RANZCR website and is available for access free of charge by registered users. The Body Systems Framework of diagnostic imaging examination descriptors is a system of descriptors based on relationships between anatomical structures and imaging modalities. The Framework is now available as a resource and reference point for the radiology profession and to support core College activities. © 2014 The Royal Australian and New Zealand College of Radiologists.
Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin
2014-10-06
We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.
PMID:25142524
A stochastically fully connected conditional random field framework for super resolution OCT
NASA Astrophysics Data System (ADS)
Boroomand, A.; Tan, B.; Wong, A.; Bizheva, K.
2017-02-01
A number of factors can degrade the resolution and contrast of OCT images, such as: (1) changes of the OCT point-spread function (PSF) resulting from wavelength-dependent scattering and absorption of light along the imaging depth; (2) speckle noise; and (3) motion artifacts. We propose a new Super Resolution OCT (SR OCT) imaging framework that takes advantage of a Stochastically Fully Connected Conditional Random Field (SF-CRF) model to generate a super-resolved OCT image of higher quality from a set of Low-Resolution OCT (LR OCT) images. The proposed SF-CRF SR OCT imaging is able to simultaneously compensate for all of the image-degrading factors mentioned above using a unified computational framework. The proposed SF-CRF SR OCT imaging framework was tested on a set of simulated LR human retinal OCT images generated from a high-resolution, high-contrast retinal image, and on a set of in-vivo, high-resolution, high-contrast rat retinal OCT images. The reconstructed SR OCT images show considerably higher spatial resolution, less speckle noise and higher contrast compared to other tested methods. Visual assessment of the results demonstrated the usefulness of the proposed approach in better preserving fine details and structures of the imaged sample, retaining biological tissue boundaries while reducing speckle noise. Quantitative evaluation using both the Contrast-to-Noise Ratio (CNR) and the Edge Preservation (EP) parameter also showed superior performance of the proposed SF-CRF SR OCT approach compared to other image processing approaches.
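Since the evaluation above relies on the Contrast-to-Noise Ratio, a minimal sketch of one common CNR definition follows; conventions vary across the OCT literature, so this is an assumption rather than the authors' exact metric:

```python
import numpy as np

def cnr(image, roi_mask, bg_mask):
    """Contrast-to-Noise Ratio between a region of interest and background.

    Uses one common convention, CNR = |mu_roi - mu_bg| / sqrt(var_roi + var_bg);
    OCT papers differ in the exact definition, so treat this as illustrative.
    """
    roi = image[roi_mask]   # boolean mask selecting the region of interest
    bg = image[bg_mask]     # boolean mask selecting a background region
    return abs(roi.mean() - bg.mean()) / np.sqrt(roi.var() + bg.var())
```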
Buetow, S; Adair, V; Coster, G; Hight, M; Gribben, B; Mitchell, E
2002-12-01
Different sets of literature suggest how aspects of practice time management can limit access to general practitioner (GP) care. Researchers have not organised this knowledge into a unified framework that can enhance understanding of barriers to, and opportunities for, improved access. To suggest a framework conceptualising how differences in professional and cultural understanding of practice time management in Auckland, New Zealand, influence access to GP care for children with chronic asthma. A qualitative study involving selective sampling, semi-structured interviews on barriers to access, and a general inductive approach. Twenty-nine key informants and ten mothers of children with chronic, moderate to severe asthma and poor access to GP care in Auckland. Development of a framework from themes describing barriers associated with, and needs for, practice time management. The themes were independently identified by two authors from transcribed interviews and confirmed through informant checking. Themes from key informant and patient interviews were triangulated with each other and with published literature. The framework distinguishes 'practice-centred time' from 'patient-centred time.' A predominance of 'practice-centred time' and an unmet opportunity for 'patient-centred time' are suggested by the persistence of five barriers to accessing GP care: limited hours of opening; traditional appointment systems; practice intolerance of missed appointments; long waiting times in the practice; and inadequate consultation lengths. None of the barriers is specific to asthmatic children. A unified framework was suggested for understanding how the organisation of practice work time can influence access to GP care by groups including asthmatic children.
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Nurses' tending instinct as a conduit for men's access to mental health counseling.
Smith, Jeffrey M; Robertson, Steve
2006-06-01
No published work has been found that melds the phenomenon of nurses' tending instinct with men's access to mental health counseling. This theoretical article presents nurses' tending instinct as a viable rationale for supporting men in utilizing mental health counseling services. Nurses can be the conduit that assists men in accessing mental health counseling when the need arises. An amalgamation of related topics, including nurses' tending instinct, men's illness/injury/disease profile, psychological medicine, and counselor skills, was forged to unify this innovative theoretical consideration. Implications for nursing practice were also explored.
On Analysis of Electrical Engineering Programme in GCC Countries
ERIC Educational Resources Information Center
Memon, Qurban A.
2007-01-01
Electrical engineering (EE) curricula in the Gulf Cooperation Council (GCC) region have gone through an evolutionary process, and are now approaching a maturity level. In order to address academic and local industrial needs in a unified way, a need has been felt to investigate EE curricula in a way that highlights theoretical understanding, design…
ERIC Educational Resources Information Center
Stocker, Kurt
2012-01-01
This article provides the first comprehensive conceptual account for the imagistic mental machinery that allows us to travel through time--for the time machine in our mind. It is argued that language reveals this imagistic machine and how we use it. Findings from a range of cognitive fields are theoretically unified and a recent proposal about…
ERIC Educational Resources Information Center
Dackermann, Tanja; Fischer, Ursula; Nuerk, Hans-Christoph; Cress, Ulrike; Moeller, Korbinian
2017-01-01
"Embodied trainings" allowing children to move their whole body in space have recently been shown to foster the acquisition of basic numerical competencies (e.g. magnitude understanding, addition performance). Following a brief summary of recent embodied training studies, we integrate the different results into a unified model framework…
Social Learning Theory: Toward a Unified Approach of Pediatric Procedural Pain
ERIC Educational Resources Information Center
Page, Lynn Olson; Blanchette, Jennifer A.
2009-01-01
Undermanaged procedural pain has been shown to have short and long term effects on children. While significant progress regarding empirically supported treatments has been made, theoretical bases for the development and management of procedural pain are lacking. This paper examines the role of social learning theory in our current understanding of…
Mathematics: PROJECT DESIGN. Educational Needs, Fresno, 1968, Number 12.
ERIC Educational Resources Information Center
Smart, James R.
This report examines and summarizes the needs in mathematics of the Fresno City school system. The study is one in a series of needs assessment reports for PROJECT DESIGN, an ESEA Title III project administered by the Fresno City Unified School District. Theoretical concepts, rather than computational drill, would be emphasized in the proposed…
Probabilistic delay differential equation modeling of event-related potentials.
Ostwald, Dirk; Starke, Ludger
2016-08-01
"Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach. Copyright © 2016 Elsevier Inc. All rights reserved.
Terrestrial carbon storage dynamics: Chasing a moving target
NASA Astrophysics Data System (ADS)
Luo, Y.; Shi, Z.; Jiang, L.; Xia, J.; Wang, Y.; Kc, M.; Liang, J.; Lu, X.; Niu, S.; Ahlström, A.; Hararuk, O.; Hastings, A.; Hoffman, F. M.; Medlyn, B. E.; Rasmussen, M.; Smith, M. J.; Todd-Brown, K. E.; Wang, Y.
2015-12-01
Terrestrial ecosystems have been estimated to absorb roughly 30% of anthropogenic CO2 emissions. Past studies have identified myriad drivers of terrestrial carbon storage changes, such as fire, climate change, and land use changes. Those drivers influence carbon storage change via diverse mechanisms, which have not been unified into a general theory that identifies what controls the direction and rate of terrestrial carbon storage dynamics. Here we propose a theoretical framework to quantitatively determine the response of terrestrial carbon storage to different exogenous drivers. With a combination of conceptual reasoning, mathematical analysis, and numerical experiments, we demonstrated that the maximal capacity of an ecosystem to store carbon is time-dependent and equals carbon input (i.e., net primary production, NPP) multiplied by residence time. The capacity is a moving target which carbon storage approaches (setting the direction of carbon storage change) but usually does not attain. The difference between the capacity and the carbon storage at a given time t is the unrealized carbon storage potential. The rate of the storage change is proportional to the magnitude of the unrealized potential. We also demonstrated that a parameter space of NPP, residence time, and carbon storage potential can well characterize carbon storage dynamics quantified at six sites ranging from tropical forests to tundra and simulated by two versions (carbon-only and coupled carbon-nitrogen) of the Australian Community Atmosphere-Biosphere Land Ecosystem (CABLE) Model under three climate change scenarios (CO2 rising only, climate warming only, and RCP8.5). Overall, this study reveals the unified mechanism underlying terrestrial carbon storage dynamics to guide transient traceability analysis of global land models and synthesis of empirical studies.
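As a rough illustration of the relations stated in this abstract, a one-pool version of the dynamics can be written as follows (schematic; the paper's multi-pool formulation generalizes it):

```latex
X_c(t) \;=\; \mathrm{NPP}(t)\,\tau_E(t), \qquad
X_p(t) \;=\; X_c(t) - X(t), \qquad
\frac{dX}{dt} \;=\; \frac{X_p(t)}{\tau_E(t)},
```

where X is carbon storage, X_c the moving capacity (carbon input times the ecosystem residence time τ_E), and X_p the unrealized potential; the rate of storage change is proportional to X_p, as stated above.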
NASA Astrophysics Data System (ADS)
Peng, Lanfang; Liu, Paiyu; Feng, Xionghan; Wang, Zimeng; Cheng, Tao; Liang, Yuzhen; Lin, Zhang; Shi, Zhenqing
2018-03-01
Predicting the kinetics of heavy metal adsorption and desorption in soil requires consideration of multiple heterogeneous soil binding sites and variations of reaction chemistry conditions. Although chemical speciation models have been developed for predicting the equilibrium of metal adsorption on soil organic matter (SOM) and important mineral phases (e.g. Fe and Al (hydr)oxides), there is still a lack of modeling tools for predicting the kinetics of metal adsorption and desorption reactions in soil. In this study, we developed a unified model for the kinetics of heavy metal adsorption and desorption in soil based on the equilibrium models WHAM 7 and CD-MUSIC, which specifically consider metal kinetic reactions with multiple binding sites of SOM and soil minerals simultaneously. For each specific binding site, metal adsorption and desorption rate coefficients were constrained by the local equilibrium partition coefficients predicted by WHAM 7 or CD-MUSIC, and, for each metal, the desorption rate coefficients of various binding sites were constrained by their metal binding constants with those sites. The model had only one fitting parameter for each soil binding phase, and all other parameters were derived from WHAM 7 and CD-MUSIC. A stirred-flow method was used to study the kinetics of Cd, Cu, Ni, Pb, and Zn adsorption and desorption in multiple soils under various pH and metal concentrations, and the model successfully reproduced most of the kinetic data. We quantitatively elucidated the significance of different soil components and important soil binding sites during the adsorption and desorption kinetic processes. Our model has provided a theoretical framework to predict metal adsorption and desorption kinetics, which can be further used to predict the dynamic behavior of heavy metals in soil under various natural conditions by coupling other important soil processes.
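A schematic single-site rate law consistent with the constraints described above (symbols illustrative; the full model sums over many SOM and mineral binding sites) would be

```latex
\frac{dQ_i}{dt} \;=\; k_{\mathrm{ads},i}\,[M]\,S_i \;-\; k_{\mathrm{des},i}\,Q_i,
\qquad
\frac{k_{\mathrm{ads},i}}{k_{\mathrm{des},i}} \;=\; K_i,
```

where Q_i is the metal bound to site i, [M] the free metal concentration, S_i the available site density, and K_i the local equilibrium partition coefficient supplied by WHAM 7 or CD-MUSIC.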
NASA Technical Reports Server (NTRS)
Li, Wei; Saleeb, Atef F.
1995-01-01
This two-part report is concerned with the development of a general framework for the implicit time-stepping integrators for the flow and evolution equations in generalized viscoplastic models. The primary goal is to present a complete theoretical formulation, and to address in detail the algorithmic and numerical analysis aspects involved in its finite element implementation, as well as to critically assess the numerical performance of the developed schemes in a comprehensive set of test cases. On the theoretical side, the general framework is developed on the basis of the unconditionally-stable, backward-Euler difference scheme as a starting point. Its mathematical structure is of sufficient generality to allow a unified treatment of different classes of viscoplastic models with internal variables. In particular, two specific models of this type, which are representative of the present state-of-the-art in metal viscoplasticity, are considered in the applications reported here; i.e., fully associative (GVIPS) and non-associative (NAV) models. The matrix forms developed for both these models are directly applicable to both initially isotropic and anisotropic materials, in general (three-dimensional) situations as well as subspace applications (i.e., plane stress/strain, axisymmetric, generalized plane stress in shells). On the computational side, issues related to efficiency and robustness are emphasized in developing the (local) iterative algorithm. In particular, closed-form expressions for residual vectors and (consistent) material tangent stiffness arrays are given explicitly for both GVIPS and NAV models, with their maximum sizes 'optimized' to depend only on the number of independent stress components (but independent of the number of viscoplastic internal state parameters). Significant robustness of the local iterative solution is provided by complementing the basic Newton-Raphson scheme with a line-search strategy for convergence. In the present second part of the report, we focus on the specific details of the numerical schemes, and associated computer algorithms, for the finite-element implementation of the GVIPS and NAV models.
Teacher Preparation for Vocational Education and Training in Germany: A Potential Model for Canada?
ERIC Educational Resources Information Center
Barabasch, Antje; Watt-Malcolm, Bonnie
2013-01-01
Germany's vocational education and training (VET) and corresponding teacher-education programmes are known worldwide for their integrated framework. Government legislation unifies companies, unions and vocational schools, and specifies the education and training required for students as well as vocational teachers. Changing from the Diplom…
The Unified Plant Growth Model (UPGM): software framework overview and model application
USDA-ARS?s Scientific Manuscript database
Since the Environmental Policy Integrated Climate (EPIC) model was developed in 1989, the EPIC plant growth component has been incorporated into other erosion and crop management models (e.g., WEPS, WEPP, SWAT, ALMANAC, and APEX) and modified to meet model developer research objectives. This has re...
The Importance of Culture for Developmental Science
ERIC Educational Resources Information Center
Keller, Heidi
2012-01-01
In this essay, it is argued that a general understanding of human development needs a unified framework based on evolutionary theorizing and cross-cultural and cultural anthropological approaches. An eco-social model of development has been proposed that defines cultural milieus as adaptations to specific socio-demographic contexts. Ontogenetic…
ERIC Educational Resources Information Center
Hwang, Heungsun; Montreal, Hec; Dillon, William R.; Takane, Yoshio
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
USDA-ARS?s Scientific Manuscript database
Biological diversity is a key concept in the life sciences and plays a fundamental role in many ecological and evolutionary processes. Although biodiversity is inherently a hierarchical concept covering different levels of organization (genes, population, species, ecological communities and ecosyst...
The Theory behind the Theory in DCT and SCDT: A Response to Rigazio-DiGilio.
ERIC Educational Resources Information Center
Terry, Linda L.
1994-01-01
Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Discusses hidden complexities in cognitive-developmental ecosystemic integration and…
Potential of DCT/SCDT in Addressing Two Elusive Themes of Mental Health Counseling.
ERIC Educational Resources Information Center
Borders, L. DiAnne
1994-01-01
Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Considers extent to which model breaks impasse in integrating development into counseling…
Converging Instructional Technology and Critical Intercultural Pedagogy in Teacher Education
ERIC Educational Resources Information Center
Pittman, Joyce
2007-01-01
Purpose: This paper aims to postulate an emerging unified cultural-convergence framework to converge the delivery of instructional technology and intercultural education (ICE) that extends beyond web-learning technologies to inculcate inclusive pedagogy in teacher education. Design/methodology/approach: The paper explores the literature and a…
Simultaneous Two-Way Clustering of Multiple Correspondence Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Dillon, William R.
2010-01-01
A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…
ERIC Educational Resources Information Center
Arnhart, Larry
2006-01-01
Be it metaphysics, theology, or some other unifying framework, humans have long sought to determine "first principles" underlying knowledge. Larry Arnhart continues in this vein, positing a Darwinian web of genetic, cultural, and cognitive evolution to explain our social behavior in terms of human nature as governed by biology. He leaves it to us…
Unified, Insular, Firmly Policed, or Fractured, Porous, Contested, Gifted Education?
ERIC Educational Resources Information Center
Ambrose, Don; VanTassel-Baska, Joyce; Coleman, Laurence J.; Cross, Tracy L.
2010-01-01
Much like medieval, feudal nations, professional fields such as gifted education can take shape as centralized kingdoms with strong armies controlling their compliant populations and protecting closed borders, or as loose collections of conflict-prone principalities with borders open to invaders. Using an investigative framework borrowed from an…
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
PERCH: A Unified Framework for Disease Gene Prioritization.
Feng, Bing-Jian
2017-03-01
To interpret genetic variants discovered from next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for a reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of uncertain significance (VUS) by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
Fatima, Iram; Fahim, Muhammad; Lee, Young-Koo; Lee, Sungyoung
2013-01-01
In recent years, activity recognition in smart homes has been an active research area due to its applicability in many applications, such as assistive living and healthcare. Besides activity recognition, the information collected from smart homes has great potential for other application domains like lifestyle analysis, security and surveillance, and interaction monitoring. Therefore, discovery of users' common behaviors and prediction of future actions from past behaviors become an important step towards allowing an environment to provide personalized services. In this paper, we develop a unified framework for activity recognition-based behavior analysis and action prediction. For this purpose, we first propose a kernel fusion method for accurate activity recognition and then identify the significant sequential behaviors of inhabitants from the recognized activities of their daily routines. These behavior patterns are further utilized to predict future actions from past activities. To evaluate the proposed framework, we performed experiments on two real datasets. The results show a notable average improvement of 13.82% in the accuracy of recognized activities, along with the extraction of significant behavioral patterns and precise activity predictions with a 6.76% increase in F-measure. Collectively, this helps in understanding users' actions to gain knowledge about their habits and preferences. PMID:23435057
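The abstract does not specify the fusion rule; a minimal sketch of one standard choice, a convex combination of base kernels fed to a precomputed-kernel SVM, is shown below (weights, data, and kernel choices are all illustrative, not the authors' configuration):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

def fused_kernel(X_a, X_b, weights=(0.5, 0.5)):
    # Convex combination of base kernels; in practice the weights would be
    # tuned on validation data rather than fixed like this.
    return weights[0] * rbf_kernel(X_a, X_b) + weights[1] * linear_kernel(X_a, X_b)

# Toy data standing in for per-activity sensor features.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(40, 8)), rng.integers(0, 2, 40)
X_test = rng.normal(size=(10, 8))

clf = SVC(kernel="precomputed").fit(fused_kernel(X_train, X_train), y_train)
pred = clf.predict(fused_kernel(X_test, X_train))  # test rows vs. training columns
```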
Ghanbari, Yasser; Smith, Alex R.; Schultz, Robert T.; Verma, Ragini
2014-01-01
Diffusion tensor imaging (DTI) offers rich insights into the physical characteristics of white matter (WM) fiber tracts and their development in the brain, facilitating a network representation of the brain's traffic pathways. Such a network representation of brain connectivity has provided a novel means of investigating brain changes arising from pathology, development or aging. The high dimensionality of these connectivity networks necessitates the development of methods that identify the connectivity building blocks or sub-network components that characterize the underlying variation in the population. In addition, the projection of the subject networks onto the basis set provides a low-dimensional representation that teases apart different sources of variation in the sample, facilitating variation-specific statistical analysis. We propose a unified framework of non-negative matrix factorization and graph embedding for learning sub-network patterns of connectivity by their projective non-negative decomposition into a reconstructive basis set, as well as additional basis sets representing variational sources in the population like age and pathology. The proposed framework is applied to a study of diffusion-based connectivity in subjects with autism and shows localized sparse sub-networks which mostly capture the changes related to pathology and developmental variations. PMID:25037933
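As a rough sketch of the decomposition idea (plain non-negative matrix factorization only; the paper's projective, graph-embedded variant and its variational basis sets are not implemented here):

```python
import numpy as np
from sklearn.decomposition import NMF

# Toy stand-in: 30 subjects, each with a vectorized 20x20 connectivity matrix.
rng = np.random.default_rng(1)
connectivity = rng.random((30, 400))

# Plain NMF: connectivity ~= coeffs @ components, all factors non-negative.
# The paper's method adds graph-embedding and projective constraints that
# plain sklearn NMF does not provide.
model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
coeffs = model.fit_transform(connectivity)   # per-subject loadings (30 x 5)
components = model.components_               # sub-network patterns (5 x 400)
```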
ERIC Educational Resources Information Center
Styres, Sandra D.; Zinga, Dawn M.
2013-01-01
This article introduces an emergent research theoretical framework, the community-first Land-centred research framework. Carefully examining the literature within Indigenous educational research, we noted the limited approaches for engaging in culturally aligned and relevant research within Indigenous communities. The community-first Land-centred…
An e-Learning Theoretical Framework
ERIC Educational Resources Information Center
Aparicio, Manuela; Bacao, Fernando; Oliveira, Tiago
2016-01-01
E-learning systems have witnessed a usage and research increase in the past decade. This article presents the e-learning concepts ecosystem. It summarizes the various scopes of e-learning studies. Here we propose an e-learning theoretical framework. This theoretical framework is based upon three principal dimensions: users, technology, and services…
Threshold Capabilities: Threshold Concepts and Knowledge Capability Linked through Variation Theory
ERIC Educational Resources Information Center
Baillie, Caroline; Bowden, John A.; Meyer, Jan H. F.
2013-01-01
The Threshold Capability Integrated Theoretical Framework (TCITF) is presented as a framework for the design of university curricula, aimed at developing graduates' capability to deal with previously unseen situations in their professional, social, and personal lives. The TCITF is a new theoretical framework derived from, and heavily dependent…
Reiter-Theil, Stella; Mertz, Marcel; Schürmann, Jan; Stingelin Giles, Nicola; Meyer-Zehnder, Barbara
2011-09-01
In this paper we assume that 'theory' is important for Clinical Ethics Support Services (CESS). We will argue that the underlying implicit theory should be reflected. Moreover, we suggest that the theoretical components on which any clinical ethics support (CES) relies should be explicitly articulated in order to enhance the quality of CES. A theoretical framework appropriate for CES will be necessarily complex and should include ethical (both descriptive and normative), metaethical and organizational components. The various forms of CES that exist in North-America and in Europe show their underlying theory more or less explicitly, with most of them referring to some kind of theoretical components including 'how-to' questions (methodology), organizational issues (implementation), problem analysis (phenomenology or typology of problems), and related ethical issues such as end-of-life decisions (major ethical topics). In order to illustrate and explain the theoretical framework that we are suggesting for our own CES project METAP, we will outline this project which has been established in a multi-centre context in several healthcare institutions. We conceptualize three 'pillars' as the major components of our theoretical framework: (1) evidence, (2) competence, and (3) discourse. As a whole, the framework is aimed at developing a foundation of our CES project METAP. We conclude that this specific integration of theoretical components is a promising model for the fruitful further development of CES. © 2011 Blackwell Publishing Ltd.
Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S
2016-12-01
We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly, taking into consideration advances in multi-resolution analysis and model-based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A unifying kinetic framework for modeling oxidoreductase-catalyzed reactions.
Chang, Ivan; Baldi, Pierre
2013-05-15
Oxidoreductases are a fundamental class of enzymes responsible for the catalysis of oxidation-reduction reactions, crucial in most bioenergetic metabolic pathways. From their common root in the ancient prebiotic environment, oxidoreductases have evolved into diverse and elaborate protein structures with specific kinetic properties and mechanisms adapted to their individual functional roles and environmental conditions. While accurate kinetic modeling of oxidoreductases is thus important, current models suffer from limitations to the steady-state domain, lack empirical validation or are too specialized to a single system or set of conditions. To address these limitations, we introduce a novel unifying modeling framework for kinetic descriptions of oxidoreductases. The framework is based on a set of seven elementary reactions that (i) form the basis for 69 pairs of enzyme state transitions for encoding various specific microscopic intra-enzyme reaction networks (micro-models), and (ii) lead to various specific macroscopic steady-state kinetic equations (macro-models) via thermodynamic assumptions. Thus, a synergistic bridge between the micro and macro kinetics can be achieved, enabling us to extract unitary rate constants, simulate reaction variance and validate the micro-models using steady-state empirical data. To help facilitate the application of this framework, we make available RedoxMech: a Mathematica™ software package that automates the generation and customization of micro-models. The Mathematica™ source code for RedoxMech, the documentation and the experimental datasets are all available from: http://www.igb.uci.edu/tools/sb/metabolic-modeling. pfbaldi@ics.uci.edu Supplementary data are available at Bioinformatics online.
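The micro-to-macro bridge described in this abstract can be illustrated by the simplest textbook case, far smaller than the framework's 69 transition pairs: a two-step mechanism whose steady-state reduction yields Michaelis-Menten kinetics,

```latex
E + S \;\underset{k_{-1}}{\overset{k_{1}}{\rightleftharpoons}}\; ES \;\xrightarrow{\;k_{2}\;}\; E + P,
\qquad
v \;=\; \frac{k_{2}\,E_T\,[S]}{K_M + [S]}, \quad K_M \;=\; \frac{k_{-1}+k_{2}}{k_{1}},
```

linking unitary (micro) rate constants to steady-state (macro) parameters — the kind of micro/macro correspondence the framework formalizes for richer oxidoreductase reaction networks.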
War-gaming application for future space systems acquisition
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.
2016-05-01
Recently, the U.S. Department of Defense (DOD) released the Defense Innovation Initiative (DII) [1] to focus DOD on five key aspects: Aspect #1: recruit talented and innovative people; Aspect #2: reinvigorate war-gaming; Aspect #3: initiate long-range research and development programs; Aspect #4: make DOD practices more innovative; and Aspect #5: advance technology and new operational concepts. In keeping with the DII, this paper concentrates on Aspects #2 and #4 by reinvigorating the war-gaming effort with a focus on an innovative approach for developing optimum Program and Technical Baselines (PTBs) and their corresponding optimum acquisition strategies for acquiring future space systems. The paper describes a unified approach for applying the war-gaming concept to future DOD acquisition of space systems. The proposed approach includes a Unified Game-based Acquisition Framework (UGAF) and an Advanced Game-Based Mathematical Framework (AGMF) using Bayesian war-gaming engines to optimize PTB solutions and select the corresponding optimum acquisition strategies for acquiring a space system. The framework defines the action space for all players with a complete description of the elements associated with the games, including the Department of Defense Acquisition Authority (DAA), stakeholders, warfighters, and potential contractors; War-Gaming Engines (WGEs) played by the DAA; WGEs played by the Contractor (KTR); and the players' Payoff and Cost Functions (PCFs). The AGMF presented here addresses both complete and incomplete information cases. The proposed framework provides a recipe for the DAA and the USAF Space and Missile Systems Center (SMC) to acquire future space systems optimally.
Photonics and spectroscopy in nanojunctions: a theoretical insight
Galperin, Michael
2017-04-11
The progress of experimental techniques at the nanoscale in the last decade has made optical measurements in current-carrying nanojunctions a reality, thus marking the emergence of a new field of research coined optoelectronics. Optical spectroscopy of open nonequilibrium systems is a natural meeting point for (at least) two research areas: nonlinear optical spectroscopy and quantum transport, each with its own theoretical toolbox. We review recent progress in the field, comparing theoretical treatments of optical response in nanojunctions as accepted in the nonlinear spectroscopy and quantum transport communities. A unified theoretical description of spectroscopy in nanojunctions is presented. Here, we argue that theoretical approaches of the quantum transport community (and in particular, Green function based considerations) yield a convenient tool for optoelectronics when the radiation field is treated classically, and that differences between the toolboxes may become critical when studying the quantum radiation field in junctions.
Marsh, Herbert W; Pekrun, Reinhard; Murayama, Kou; Arens, A Katrin; Parker, Philip D; Guo, Jiesi; Dicke, Theresa
2018-02-01
Our newly proposed integrated academic self-concept model integrates 3 major theories of academic self-concept formation and developmental perspectives into a unified conceptual and methodological framework. Relations among math self-concept (MSC), school grades, test scores, and school-level contextual effects over 6 years, from the end of primary school through the first 5 years of secondary school (a representative sample of 3,370 German students, 42 secondary schools, 50% male, M age at grade 5 = 11.75) support the (1) internal/external frame of reference model: Math school grades had positive effects on MSC, but the effects of German grades were negative; (2) reciprocal effects (longitudinal panel) model: MSC was predictive of and predicted by math test scores and school grades; (3) big-fish-little-pond effect: The effects on MSC were negative for school-average achievement based on 4 indicators (primary school grades in math and German, school-track prior to the start of secondary school, math test scores in the first year of secondary school). Results for all 3 theoretical models were consistent across the 5 secondary school years: This supports the prediction of developmental equilibrium. This integration highlights the robustness of support over the potentially volatile early to middle adolescent period; the interconnectedness and complementarity of 3 ASC models; their counterbalancing strengths and weaknesses; and new theoretical, developmental, and substantive implications at their intersections. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Understanding the mind from an evolutionary perspective: an overview of evolutionary psychology.
Shackelford, Todd K; Liddle, James R
2014-05-01
The theory of evolution by natural selection provides the only scientific explanation for the existence of complex adaptations. The design features of the brain, like any organ, are the result of selection pressures operating over deep time. Evolutionary psychology posits that the human brain comprises a multitude of evolved psychological mechanisms, adaptations to specific and recurrent problems of survival and reproduction faced over human evolutionary history. Although some mistakenly view evolutionary psychology as promoting genetic determinism, evolutionary psychologists appreciate and emphasize the interactions between genes and environments. This approach to psychology has led to a richer understanding of a variety of psychological phenomena, and has provided a powerful foundation for generating novel hypotheses. Critics argue that evolutionary psychologists resort to storytelling, but as with any branch of science, empirical testing is a vital component of the field, with hypotheses standing or falling with the weight of the evidence. Evolutionary psychology is uniquely suited to provide a unifying theoretical framework for the disparate subdisciplines of psychology. An evolutionary perspective has provided insights into several subdisciplines of psychology, while simultaneously demonstrating the arbitrary nature of dividing psychological science into such subdisciplines. Evolutionary psychologists have amassed a substantial empirical and theoretical literature, but as a relatively new approach to psychology, many questions remain, with several promising directions for future research. For further resources related to this article, please visit the WIREs website. The authors have declared no conflicts of interest for this article. © 2014 John Wiley & Sons, Ltd.
The neural optimal control hierarchy for motor control
NASA Astrophysics Data System (ADS)
DeWolf, T.; Eliasmith, C.
2011-10-01
Our empirical, neuroscientific understanding of biological motor systems has been rapidly growing in recent years. However, this understanding has not been systematically mapped to a quantitative characterization of motor control based in control theory. Here, we attempt to bridge this gap by describing the neural optimal control hierarchy (NOCH), which can serve as a foundation for biologically plausible models of neural motor control. The NOCH has been constructed by taking recent control theoretic models of motor control, analyzing the required processes, generating neurally plausible equivalent calculations and mapping them on to the neural structures that have been empirically identified to form the anatomical basis of motor control. We demonstrate the utility of the NOCH by constructing a simple model based on the identified principles and testing it in two ways. First, we perturb specific anatomical elements of the model and compare the resulting motor behavior with clinical data in which the corresponding area of the brain has been damaged. We show that damaging the assigned functions of the basal ganglia and cerebellum can cause the movement deficiencies seen in patients with Huntington's disease and cerebellar lesions. Second, we demonstrate that single spiking neuron data from our model's motor cortical areas explain major features of single-cell responses recorded from the same primate areas. We suggest that together these results show how NOCH-based models can be used to unify a broad range of data relevant to biological motor control in a quantitative, control theoretic framework.
Scaling Laws of Discrete-Fracture-Network Models
NASA Astrophysics Data System (ADS)
Philippe, D.; Olivier, B.; Caroline, D.; Jean-Raynald, D.
2006-12-01
The statistical description of fracture networks across scales remains a concern for geologists, considering the complexity of fracture networks. A challenging task of the last 20 years of study has been to find a solid, verifiable rationale for the trivial observation that fractures exist everywhere and at all sizes. The emergence of fractal models and power-law distributions quantifies this fact, and postulates in some ways that small-scale fractures are genetically linked to their larger-scale relatives. But the validation of these scaling concepts remains an issue, considering the amount of information that would be required relative to the complexity of natural fracture networks. Beyond the theoretical interest, a scaling law is a basic and necessary ingredient of Discrete-Fracture-Network (DFN) models that are used for many environmental and industrial applications (groundwater resources, the mining industry, assessment of the safety of deep waste disposal sites, etc.). Indeed, such a function is necessary to assemble scattered data, taken at different scales, into a unified scaling model, and to interpolate fracture densities between observations. In this study, we discuss several important issues related to the scaling laws of DFN models: we first describe a complete theoretical and mathematical framework that accounts for both the fracture-size distribution and fracture clustering through scales (fractal dimension); we then review the scaling laws that have been obtained and discuss the ability of fracture datasets to really constrain the parameters of the DFN model; and finally we discuss the limits of scaling models.
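A minimal sketch of how the truncated power-law size distribution, the basic DFN ingredient discussed above, can be sampled by inverse-transform sampling (the exponent and cutoffs are illustrative, not calibrated to any field dataset):

```python
import numpy as np

def sample_fracture_lengths(n, a=2.5, l_min=1.0, l_max=1000.0, seed=0):
    """Draw n fracture lengths from a truncated power law p(l) ~ l**(-a).

    Inverse-transform sampling for a != 1; a, l_min, l_max are illustrative.
    """
    u = np.random.default_rng(seed).uniform(size=n)
    lo, hi = l_min ** (1.0 - a), l_max ** (1.0 - a)
    # Interpolate in the transformed coordinate, then invert the CDF.
    return (lo + u * (hi - lo)) ** (1.0 / (1.0 - a))

lengths = sample_fracture_lengths(10_000)  # lengths fall in [l_min, l_max]
```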
Early stress and human behavioral development: emerging evolutionary perspectives.
Del Giudice, M
2014-08-01
Stress experienced early in life exerts a powerful, lasting influence on development. Converging empirical findings show that stressful experiences become deeply embedded in the child's neurobiology, with an astonishing range of long-term effects on cognition, emotion, and behavior. In contrast with the prevailing view that such effects are the maladaptive outcomes of 'toxic' stress, adaptive models regard them as manifestations of evolved developmental plasticity. In this paper, I offer a brief introduction to adaptive models of early stress and human behavioral development, with emphasis on recent theoretical contributions and emerging concepts in the field. I begin by contrasting dysregulation models of early stress with their adaptive counterparts; I then introduce life history theory as a unifying framework, and review recent work on predictive adaptive responses (PARs) in human life history development. In particular, I discuss the distinction between forecasting the future state of the environment (external prediction) and forecasting the future state of the organism (internal prediction). Next, I present the adaptive calibration model, an integrative model of individual differences in stress responsivity based on life history concepts. I conclude by examining how maternal-fetal conflict may shape the physiology of prenatal stress and its adaptive and maladaptive effects on postnatal development. In total, I aim to show how theoretical work from evolutionary biology is reshaping the way we think about the role of stress in human development, and provide researchers with an up-to-date conceptual map of this fascinating and rapidly evolving field.
Calibration of the clock-phase biases of GNSS networks: the closure-ambiguity approach
NASA Astrophysics Data System (ADS)
Lannes, A.; Prieur, J.-L.
2013-08-01
In global navigation satellite systems (GNSS), the problem of retrieving clock-phase biases from network data has a basic rank defect. We analyse the different ways of removing this rank defect, and define a particular strategy for obtaining these phase biases in a standard form. The minimum-constrained problem to be solved in the least-squares (LS) sense depends on some integer vector which can be fixed in an arbitrary manner. We propose to solve the problem via an undifferenced approach based on the notion of closure ambiguity. We present a theoretical justification of this closure-ambiguity approach (CAA), and the main elements for a practical implementation. The links with other methods are also established. We analyse all those methods in a unified interpretative framework, and derive functional relations between the corresponding solutions and our CAA solution. This could be interesting for many GNSS applications like real-time kinematic PPP for instance. To compare the methods providing LS estimates of clock-phase biases, we define a particular solution playing the role of reference solution. For this solution, when a phase bias is estimated for the first time, its fractional part is confined to the one-cycle width interval centred on zero; the integer-ambiguity set is modified accordingly. Our theoretical study is illustrated with some simple and generic examples; it could have applications in data processing of most GNSS networks, and particularly global networks using GPS, Glonass, Galileo, or BeiDou/Compass satellites.
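The rank defect at issue can be seen from a schematic undifferenced carrier-phase observation equation (symbols illustrative):

```latex
\phi_r^{\,s} \;=\; \rho_r^{\,s} \;+\; c\left(\delta t_r - \delta t^{\,s}\right) \;+\; \lambda N_r^{\,s} \;+\; \varepsilon .
```

Shifting every receiver and satellite clock term by a common offset (δt_r → δt_r + τ, δt^s → δt^s + τ) leaves all observations unchanged, so the least-squares normal equations are singular until minimum constraints — here organized through the closure-ambiguity convention — fix the datum.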
Stochastic simulation of spatially correlated geo-processes
Christakos, G.
1987-01-01
In this study, developments in the theory of stochastic simulation are discussed. The unifying element is the notion of Radon projection in Euclidean spaces. This notion provides a natural way of reconstructing the real process from a corresponding process observable on a reduced-dimensionality space, where analysis is theoretically easier and computationally tractable. Within this framework, the concept of space transformation is defined and several of its properties, which are of significant importance within the context of spatially correlated processes, are explored. The turning bands operator is shown to follow from this. This strengthens considerably the theoretical background of the geostatistical method of simulation, and some new results are obtained in both the space and frequency domains. The inverse problem is solved generally and the applicability of the method is extended to anisotropic as well as integrated processes. Some ill-posed problems of the inverse operator are discussed. Effects of the measurement error and impulses at origin are examined. Important features of the simulated process as described by geomechanical laws, the morphology of the deposit, etc., may be incorporated in the analysis. The simulation may become a model-dependent procedure and this, in turn, may provide numerical solutions to spatial-temporal geologic models. Because the spatial simulation may be technically reduced to unidimensional simulations, various techniques of generating one-dimensional realizations are reviewed. To link theory and practice, an example is computed in detail. © 1987 International Association for Mathematical Geology.
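A minimal sketch of the turning-bands idea discussed above, in a spectral form: 1-D random cosines along random directions are summed to approximate a 2-D stationary Gaussian field (the covariance model and all parameters are illustrative, not those of the paper's worked example):

```python
import numpy as np

def turning_bands_2d(coords, n_lines=200, corr_len=10.0, seed=0):
    """Approximate a 2-D stationary Gaussian random field by summing 1-D
    random cosines along random directions (a spectral turning-bands sketch).

    corr_len and the frequency law used here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, np.pi, n_lines)               # line directions
    dirs = np.stack([np.cos(theta), np.sin(theta)], 1)   # (n_lines, 2)
    freqs = rng.normal(scale=1.0 / corr_len, size=n_lines)
    phases = rng.uniform(0, 2 * np.pi, n_lines)
    proj = coords @ dirs.T                               # project points onto lines
    return np.sqrt(2.0 / n_lines) * np.cos(freqs * proj + phases).sum(axis=1)

xy = np.stack(np.meshgrid(np.arange(64), np.arange(64)), -1).reshape(-1, 2)
field = turning_bands_2d(xy.astype(float))  # unit-variance field on a 64x64 grid
```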
Unifying Gate Synthesis and Magic State Distillation.
Campbell, Earl T; Howard, Mark
2017-02-10
The leading paradigm for performing a computation on quantum memories can be encapsulated as distill-then-synthesize. Initially, one performs several rounds of distillation to create high-fidelity magic states that each provide one good T gate, an essential quantum logic gate. Subsequently, gate synthesis intersperses many T gates with Clifford gates to realize a desired circuit. We introduce a unified framework that implements one round of distillation and multiqubit gate synthesis in a single step. Typically, our method uses the same number of T gates as conventional synthesis but with the added benefit of quadratic error suppression. Because of this, one less round of magic state distillation needs to be performed, leading to significant resource savings.
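A back-of-the-envelope reading of the quadratic suppression claim (schematic scaling assumed here, not the paper's exact resource accounting): if each of N T gates inherits error ε from its magic state, conventional synthesis accumulates an error budget of roughly Nε, whereas the unified step accumulates O(Nε²),

```latex
N\,\epsilon \;\longrightarrow\; O\!\left(N\,\epsilon^{2}\right):
\qquad N = 100,\; \epsilon = 10^{-3} \;\Rightarrow\; \sim 10^{-1} \;\rightarrow\; \sim 10^{-4},
```

which is why one fewer round of distillation can meet the same overall error target.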