Phase noise suppression for coherent optical block transmission systems: a unified framework.
Yang, Chuanchuan; Yang, Feng; Wang, Ziyu
2011-08-29
A unified framework for phase noise suppression is proposed in this paper; it can be applied to any coherent optical block transmission system, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM), coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE), etc. Based on adaptive modeling of phase noise, unified observation equations for different coherent optical block transmission systems are constructed, which lead to unified phase noise estimation and suppression. Numerical results demonstrate that the proposed method effectively mitigates laser phase noise.
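As a minimal illustration of block-wise laser-phase-noise suppression of the kind this abstract describes (the pilot layout, block size, and noise level below are our own assumptions for a toy example, not the authors' algorithm), a common-phase-error estimate for one coherent OFDM block can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# One OFDM block: 64 QPSK subcarriers, 8 of them known pilots (assumed layout).
n_sc = 64
pilot_idx = np.arange(0, n_sc, 8)
bits = rng.integers(0, 4, n_sc)
tx = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))   # QPSK constellation points

# Within one block, laser phase noise is often dominated by a common
# phase error (CPE) shared by all subcarriers of the block.
cpe = 0.3                                           # radians, unknown to receiver
noise = 0.01 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))
rx = tx * np.exp(1j * cpe) + noise

# Estimate the CPE from the pilots and de-rotate the whole block.
cpe_hat = np.angle(np.sum(rx[pilot_idx] * np.conj(tx[pilot_idx])))
rx_corrected = rx * np.exp(-1j * cpe_hat)

print(abs(cpe_hat - cpe) < 0.05)   # pilot estimate tracks the true rotation
```

Real CO-OFDM/CO-SCFDE receivers track a time-varying phase process across blocks; this sketch only shows the single-block pilot-aided correction step.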
An Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.
Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C
2016-01-01
Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model and also produces a multiscale representation which allows all skeleton types to be progressively simplified, or regularized. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, next to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks; Sensorpedia (an ad hoc Internet-scale sensor network); our framework for integrating robots with Sensorpedia; two applications which illustrate our framework's ability to support general web-based robotic control; and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Yinan; Shi Handuo; Xiong Zhaoxi
We present a unified universal quantum cloning machine, which combines several different existing universal cloning machines together, including the asymmetric case. In this unified framework, the identical pure states are projected equally into each copy initially constituted by input and one half of the maximally entangled states. We show explicitly that the output states of those universal cloning machines are the same. One important feature of this unified cloning machine is that the cloning process is always a symmetric projection, which dramatically reduces the difficulty of implementation. Also, it is found that this unified cloning machine can be directly modified to the general asymmetric case. Besides the global fidelity and the single-copy fidelity, we also present all possible arbitrary-copy fidelities.
Theory and applications of structured light single pixel imaging
NASA Astrophysics Data System (ADS)
Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.
2018-02-01
Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and also provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness, decreased acquisition time, and the ability to take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
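To make the frame-theoretic picture concrete, here is a toy single-pixel acquisition in which the structured-light patterns are the rows of a Hadamard matrix; an orthogonal basis is the simplest (tight) frame, so reconstruction is just a rescaled adjoint. The 4x4 scene and pattern choice are our own illustration, not the paper's experiments:

```python
import numpy as np

def sylvester_hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# A tiny 4x4 "scene", flattened to a vector of length 16.
scene = np.arange(16, dtype=float).reshape(4, 4)
x = scene.ravel()

# Structured-light patterns = rows of a Hadamard matrix; they form a
# tight frame for R^16 with frame bound A = 16 (since H H^T = 16 I).
H = sylvester_hadamard(16)

# Each single-pixel measurement is one inner product <pattern, scene>.
y = H @ x

# Tight-frame reconstruction: x = (1/A) * sum_k y_k * pattern_k.
x_hat = (H.T @ y) / 16.0

print(np.allclose(x_hat, x))   # exact recovery from 16 measurements
```

The value of the frame formulation is that the same reconstruction formula (with a frame operator inverse replacing the factor 1/A) covers redundant or non-orthogonal pattern sets, which is where the noise-robustness gains mentioned above arise.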
Liu, Dan; Liu, Xuejun; Wu, Yiguang
2018-04-24
This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and relative depth trends of local regions inside the image are integrated into the framework. A deep CNN network is firstly used to automatically learn a hierarchical feature representation of the image. To get more local details in the image, the relative depth trends of local regions are incorporated into the network. Combined with semantic information of the image, a continuous pairwise CRF is then established and is used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and that the approach obtains satisfactory results.
Some characteristics of supernetworks based on unified hybrid network theory framework
NASA Astrophysics Data System (ADS)
Liu, Qiang; Fang, Jin-Qing; Li, Yong
Compared with single complex networks, supernetworks are closer to the real world in some ways and have recently become a research hot spot in network science. Some progress has been made in supernetwork research, but the theoretical methods and complex-network characteristics of supernetwork models still need further exploration. In this paper, we propose three kinds of three-layer supernetwork models based on the unified hybrid network theory framework (UHNTF), introducing preferential and random linking, respectively, between the upper and lower layers. We then compare the topological characteristics of single networks with those of the supernetwork models. To analyze the influence of interlayer edges on network characteristics, the cross-degree is defined as a new important parameter. Some interesting new phenomena are found, and the results imply that this supernetwork model has reference value and application potential.
A Unified Framework for Association Analysis with Multiple Related Phenotypes
Stephens, Matthew
2013-01-01
We consider the problem of assessing associations between multiple related outcome variables and a single explanatory variable of interest. This problem arises in many settings, including genetic association studies, where the explanatory variable is genotype at a genetic variant. We outline a framework for conducting this type of analysis, based on Bayesian model comparison and model averaging for multivariate regressions. This framework unifies several common approaches to this problem, and includes both standard univariate and standard multivariate association tests as special cases. The framework also unifies the problems of testing for associations and explaining associations – that is, identifying which outcome variables are associated with genotype. This provides an alternative to the usual, but conceptually unsatisfying, approach of resorting to univariate tests when explaining and interpreting significant multivariate findings. The method is computationally tractable genome-wide for modest numbers of phenotypes (e.g. 5–10), and can be applied to summary data, without access to raw genotype and phenotype data. We illustrate the methods on simulated examples and on a genome-wide association study of blood lipid traits, where we identify 18 potential novel genetic associations that were not identified by univariate analyses of the same data. PMID:23861737
A quasi-likelihood approach to non-negative matrix factorization
Devarajan, Karthik; Cheung, Vincent C.K.
2017-01-01
A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
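The quasi-likelihood family described above contains the classical Poisson/KL case, for which the well-known Lee-Seung multiplicative updates apply. A minimal sketch of that special case (random data and rank chosen arbitrarily; this is not the paper's full signal-dependent-noise algorithm family):

```python
import numpy as np

rng = np.random.default_rng(1)

# Strictly positive data matrix (e.g. a spectrogram or EMG envelope matrix).
V = rng.random((20, 30)) + 0.1
r = 4                                   # factorization rank, V ~ W @ H

W = rng.random((20, r)) + 0.1
H = rng.random((r, 30)) + 0.1

def kl_div(V, WH):
    """Generalized Kullback-Leibler divergence: the Poisson quasi-likelihood."""
    return np.sum(V * np.log(V / WH) - V + WH)

losses = []
for _ in range(200):
    # Lee-Seung multiplicative updates; each update is monotone for KL.
    H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
    W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)[None, :]
    losses.append(kl_div(V, W @ H))

# Nonnegativity is preserved automatically by the multiplicative form.
print(losses[-1] < losses[0])
```

Other members of the quasi-likelihood family (e.g. Gaussian or gamma noise) change the divergence and the update ratios but keep this same multiplicative structure.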
Stam, Henderikus J.
2015-01-01
The search for a so-called unified or integrated theory has long served as a goal for some psychologists, even if the search is often implicit. But if the established sciences do not have an explicitly unified set of theories, then why should psychology? After examining this question again I argue that psychology is in fact reasonably unified around its methods and its commitment to functional explanations, an indeterminate functionalism. The question of the place of the neurosciences in this framework is complex. On the one hand, the neuroscientific project will not likely renew and synthesize the disparate arms of psychology. On the other hand, their reformulation of what it means to be human will exert an influence in multiple ways. One way to capture that influence is to conceptualize the brain in terms of a technology that we interact with in a manner that we do not yet fully understand. In this way we maintain both a distance from neuro-reductionism and refrain from committing to an unfettered subjectivity. PMID:26500571
A general modeling framework for describing spatially structured population dynamics
Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan
2017-01-01
Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework flexible enough to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles.
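A toy instance of such a network-based model, with nodes as habitat patches and directed edges carrying per-step movement fractions (all patch counts, growth rates, and movement fractions below are invented for illustration; the paper's framework is far more general, with time-varying node and edge attributes):

```python
import numpy as np

# Three habitat patches (nodes); node attribute: local per-step growth rate.
growth = np.array([1.10, 0.95, 1.05])

# Edge attributes: M[i, j] = fraction of node i's population moving to
# node j each step; rows sum to 1 (the diagonal is the fraction staying).
M = np.array([
    [0.8, 0.2, 0.0],
    [0.1, 0.8, 0.1],
    [0.0, 0.3, 0.7],
])

def step(pop):
    """One discrete time step: local growth, then movement along edges."""
    return (growth * pop) @ M

pop = np.array([100.0, 50.0, 25.0])
for _ in range(50):
    pop = step(pop)

print(pop.sum() > 0)   # the population persists under this parameterization
```

Metapopulations, seasonal migration, and nomadism differ, in this formulation, only in how the edge weights and node attributes are structured and how they change through time.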
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eccleston, C.H.
1997-09-05
The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.
Food-web based unified model of macro- and microevolution.
Chowdhury, Debashish; Stauffer, Dietrich
2003-10-01
We incorporate the generic hierarchical architecture of food webs into a "unified" model that describes both micro- and macroevolution within a single theoretical framework. This model describes the microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinctions. The distribution of lifetimes of species follows an approximate power law only over a limited regime.
A unifying retinex model based on non-local differential operators
NASA Astrophysics Data System (ADS)
Zosso, Dominique; Tran, Giang; Osher, Stanley
2013-02-01
In this paper, we present a unifying framework for retinex that is able to reproduce many of the existing retinex implementations within a single model. The fundamental assumption, shared with many retinex models, is that the observed image is a multiplication between the illumination and the true underlying reflectance of the object. Starting from Morel's 2010 PDE model for retinex, where illumination is supposed to vary smoothly and the reflectance is thus recovered from a hard-thresholded Laplacian of the observed image in a Poisson equation, we define our retinex model in two similar but more general steps. First, we look for a filtered gradient that is the solution of an optimization problem consisting of two terms: the first term is a sparsity prior of the reflectance, such as the TV or H1 norm, while the second term is a quadratic fidelity prior of the reflectance gradient with respect to the observed image gradients. In a second step, since this filtered gradient almost certainly is not a consistent image gradient, we look for a reflectance whose actual gradient comes close. Beyond unifying existing models, we are able to derive entirely novel retinex formulations by using more interesting non-local versions of the sparsity and fidelity priors. Hence we define within a single framework new retinex instances particularly suited for texture-preserving shadow removal, cartoon-texture decomposition, and color and hyperspectral image enhancement.
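A one-dimensional sketch of the two-step scheme this abstract describes, in the hard-threshold (Morel-style) special case: filter the observed gradient, then reintegrate to get a reflectance whose gradient comes close. The toy signal and threshold are our own choices; the actual models operate on 2-D images with TV/H1 priors and non-local operators:

```python
import numpy as np

# 1-D "observed image" in the log domain: log(image) = log(L) + log(R),
# a smooth illumination ramp plus a piecewise-constant reflectance edge.
n = 100
illum = np.linspace(0.0, 1.0, n)                    # slowly varying L
refl = np.where(np.arange(n) < n // 2, 0.0, 2.0)    # one sharp edge in R
obs = illum + refl

# Step 1: filter the observed gradient -- hard-threshold small gradients,
# which are attributed to the smooth illumination.
g = np.diff(obs)
t = 0.5
g_filtered = np.where(np.abs(g) > t, g, 0.0)

# Step 2: the filtered gradient need not be a consistent image gradient;
# recover a reflectance whose gradient comes close. In 1-D the
# least-squares reintegration (up to a constant) is a cumulative sum.
refl_hat = np.concatenate([[0.0], np.cumsum(g_filtered)])

print(np.max(np.abs(refl_hat - refl)) < 0.05)   # illumination ramp removed
```

Replacing the hard threshold in step 1 with a TV- or H1-regularized optimization, and the cumulative sum in step 2 with a Poisson solve, yields the more general instances the paper unifies.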
ERIC Educational Resources Information Center
Ning, Hoi Kwan; Downing, Kevin
2010-01-01
While previous studies have examined the single directional effects of motivation constructs in influencing students' use of self-regulatory strategies, few attempts have been made to unravel their interrelationship in a unified framework. In this study we adopt the social cognitive perspective and examine the reciprocal interplay between…
A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research
ERIC Educational Resources Information Center
Rohlfing, Ingo; Schneider, Carsten Q.
2018-01-01
The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elective affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…
ViSA: A Neurodynamic Model for Visuo-Spatial Working Memory, Attentional Blink, and Conscious Access
ERIC Educational Resources Information Center
Simione, Luca; Raffone, Antonino; Wolters, Gezinus; Salmas, Paola; Nakatani, Chie; Belardinelli, Marta Olivetti; van Leeuwen, Cees
2012-01-01
Two separate lines of study have clarified the role of selectivity in conscious access to visual information. Both involve presenting multiple targets and distracters: one "simultaneously" in a spatially distributed fashion, the other "sequentially" at a single location. To understand their findings in a unified framework, we propose a…
A Unified Framework for Periodic, On-Demand, and User-Specified Software Information
NASA Technical Reports Server (NTRS)
Kolano, Paul Z.
2004-01-01
Although grid computing can increase the number of resources available to a user, not all resources on the grid may have a software environment suitable for running a given application. To provide users with the necessary assistance for selecting resources with compatible software environments and/or for automatically establishing such environments, it is necessary to have an accurate source of information about the software installed across the grid. This paper presents a new OGSI-compliant software information service that has been implemented as part of NASA's Information Power Grid project. This service is built on top of a general framework for reconciling information from periodic, on-demand, and user-specified sources. Information is retrieved using standard XPath queries over a single unified namespace independent of the information's source. Two consumers of the provided software information, the IPG Resource Broker and the IPG Neutralization Service, are briefly described.
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Tomalia, Donald A; Khanna, Shiv N
2016-02-24
Development of a central paradigm is undoubtedly the single most influential force responsible for advancing Dalton's 19th century atomic/molecular chemistry concepts to the current maturity enjoyed by traditional chemistry. A similar central dogma for guiding and unifying nanoscience has been missing. This review traces the origins, evolution, and current status of such a critical nanoperiodic concept/framework for defining and unifying nanoscience. Based on parallel efforts and a mutual consensus now shared by both chemists and physicists, a nanoperiodic/systematic framework concept has emerged. This concept is based on the well-documented existence of discrete, nanoscale collections of traditional inorganic/organic atoms referred to as hard and soft superatoms (i.e., nanoelement categories). These nanometric entities are widely recognized to exhibit nanoscale atom mimicry features reminiscent of traditional picoscale atoms. All unique superatom/nanoelement physicochemical features are derived from quantized structural control defined by six critical nanoscale design parameters (CNDPs), namely, size, shape, surface chemistry, flexibility/rigidity, architecture, and elemental composition. These CNDPs determine all intrinsic superatom properties, their combining behavior to form stoichiometric nanocompounds/assemblies as well as to exhibit nanoperiodic properties leading to new nanoperiodic rules and predictive Mendeleev-like nanoperiodic tables, and they portend possible extension of these principles to larger quantized building blocks including meta-atoms.
A unifying framework for quantifying the nature of animal interactions.
Potts, Jonathan R; Mokross, Karl; Lewis, Mark A
2014-07-06
Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest.
Parametric models to relate spike train and LFP dynamics with neural information processing.
Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan
2012-01-01
Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. 
We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
Quantification of causal couplings via dynamical effects: A unifying perspective
NASA Astrophysics Data System (ADS)
Smirnov, Dmitry A.
2014-12-01
Quantitative characterization of causal couplings from time series is crucial in studies of complex systems of different origin. Various statistical tools for that exist and new ones are still being developed with a tendency to creating a single, universal, model-free quantifier of coupling strength. However, a clear and generally applicable way of interpreting such universal characteristics is lacking. This work suggests a general conceptual framework for causal coupling quantification, which is based on state space models and extends the concepts of virtual interventions and dynamical causal effects. Namely, two basic kinds of interventions (state space and parametric) and effects (orbital or transient and stationary or limit) are introduced, giving four families of coupling characteristics. The framework provides a unifying view of apparently different well-established measures and allows us to introduce new characteristics, always with a definite "intervention-effect" interpretation. It is shown that diverse characteristics cannot be reduced to any single coupling strength quantifier and their interpretation is inevitably model based. The proposed set of dynamical causal effect measures quantifies different aspects of "how the coupling manifests itself in the dynamics," reformulating the very question about the "causal coupling strength."
NASA Astrophysics Data System (ADS)
Codello, Alessandro; Jain, Rajeev Kumar
2018-05-01
We present a unified evolution of the universe from very early times until the present epoch by including both the leading local correction R^2 and the leading non-local term R(1/□^2)R in the classical gravitational action. We find that the inflationary phase driven by the R^2 term gracefully exits in a transitory regime characterized by coherent oscillations of the Hubble parameter. The universe then naturally enters into a radiation dominated epoch followed by a matter dominated era. At sufficiently late times after radiation-matter equality, the non-local term starts to dominate, inducing an accelerated expansion of the universe at the present epoch. We further exhibit the fact that both the leading local and non-local terms can be obtained within the covariant effective field theory of gravity. This scenario thus provides a unified picture of inflation and dark energy in a single framework by means of a purely gravitational action without the usual need of a scalar field.
Single atom emission in an optical resonator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childs, J.J.; An, K.; Dasari, R.R.
A single atom coupled to a single mode of a radiation field is a fundamental system for studying the interaction of radiation with matter. The study of such systems has come to be called cavity quantum electrodynamics (QED). Atoms coupled to a single mode of a resonator have been studied experimentally and theoretically in several interesting regimes since this basic system was first considered theoretically by Jaynes and Cummings. The objective of the present chapter is to provide a theoretical framework and present a unifying picture of the various phenomena which can occur in such a system. 35 refs., 11 figs.
Unifying Gate Synthesis and Magic State Distillation.
Campbell, Earl T; Howard, Mark
2017-02-10
The leading paradigm for performing a computation on quantum memories can be encapsulated as distill-then-synthesize. Initially, one performs several rounds of distillation to create high-fidelity magic states that provide one good T gate, an essential quantum logic gate. Subsequently, gate synthesis intersperses many T gates with Clifford gates to realize a desired circuit. We introduce a unified framework that implements one round of distillation and multiqubit gate synthesis in a single step. Typically, our method uses the same number of T gates as conventional synthesis but with the added benefit of quadratic error suppression. Because of this, one less round of magic state distillation needs to be performed, leading to significant resource savings.
A Unified Approach to Intra-Domain Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shue, Craig A; Kalafut, Andrew J.; Gupta, Prof. Minaxi
2009-01-01
While a variety of mechanisms have been developed for securing individual intra-domain protocols, none address the issue in a holistic manner. We develop a unified framework to secure prominent networking protocols within a single domain. We begin with a secure version of the DHCP protocol, which has the additional feature of providing each host with a certificate. We then leverage these certificates to secure ARP, prevent spoofing within the domain, and secure SSH and VPN connections between the domain and hosts which have previously interacted with it locally. In doing so, we also develop an incrementally deployable public key infrastructure which can later be leveraged to support inter-domain authentication.
A Unified Framework for Analyzing and Designing for Stationary Arterial Networks
DOT National Transportation Integrated Search
2017-05-17
This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...
Montijn, Jorrit Steven; Klink, P Christaan; van Wezel, Richard J A
2012-01-01
Divisive normalization models of covert attention commonly use spike rate modulations as indicators of the effect of top-down attention. In addition, an increasing number of studies have shown that top-down attention increases the synchronization of neuronal oscillations as well, particularly in gamma-band frequencies (25-100 Hz). Although modulations of spike rate and synchronous oscillations are not mutually exclusive as mechanisms of attention, there has thus far been little effort to integrate these concepts into a single framework of attention. Here, we aim to provide such a unified framework by expanding the normalization model of attention with a multi-level hierarchical structure and a time dimension; allowing the simulation of a recently reported backward progression of attentional effects along the visual cortical hierarchy. A simple cascade of normalization models simulating different cortical areas is shown to cause signal degradation and a loss of stimulus discriminability over time. To negate this degradation and ensure stable neuronal stimulus representations, we incorporate a kind of oscillatory phase entrainment into our model that has previously been proposed as the "communication-through-coherence" (CTC) hypothesis. Our analysis shows that divisive normalization and oscillation models can complement each other in a unified account of the neural mechanisms of selective visual attention. The resulting hierarchical normalization and oscillation (HNO) model reproduces several additional spatial and temporal aspects of attentional modulation and predicts a latency effect on neuronal responses as a result of cued attention.
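The divisive-normalization backbone of such models can be sketched in a few lines. This is a Reynolds-Heeger-style toy, not the paper's hierarchical HNO model; the stimulus drives, gains, and the global normalization pool are all illustrative assumptions:

```python
import numpy as np

def normalization_model(stim_drive, attn_gain, sigma=1.0):
    """Divisive normalization with an attentional gain field.

    stim_drive : excitatory drive per neuron/channel
    attn_gain  : multiplicative attention field (same shape)
    sigma      : semi-saturation constant
    """
    excitatory = attn_gain * stim_drive   # attention scales the input drive
    suppressive = excitatory.sum()        # normalization pool (here: global sum)
    return excitatory / (sigma + suppressive)

drive = np.array([10.0, 10.0])                          # two equal-contrast stimuli
attended = normalization_model(drive, np.array([2.0, 1.0]))  # attend channel 0
neutral = normalization_model(drive, np.array([1.0, 1.0]))   # no attentional bias
```

With equal gains the two channels respond identically; boosting one channel's gain increases its normalized response at the expense of the other, the signature effect of attention in these models.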
Konik, Robert M.; Sfeir, Matthew Y.; Misewich, James A.
2015-02-17
We demonstrate that a non-perturbative framework for the treatment of the excitations of single-walled carbon nanotubes based upon a field theoretic reduction is able to accurately describe experimental observations of the absolute values of excitonic energies. This theoretical framework yields a simple scaling function from which the excitonic energies can be read off. This scaling function is primarily determined by a single parameter, the charge Luttinger parameter of the tube, which is in turn a function of the tube chirality, dielectric environment, and the tube's dimensions, thus expressing disparate influences on the excitonic energies in a unified fashion. As a result, we test this theory explicitly on the data reported in [Nano Letters 5, 2314 (2005)] and [Phys. Rev. B 82, 195424 (2010)] and so demonstrate the method works over a wide range of reported excitonic spectra.
Palacios-Flores, Kim; García-Sotelo, Jair; Castillo, Alejandra; Uribe, Carina; Aguilar, Luis; Morales, Lucía; Gómez-Romero, Laura; Reyes, José; Garciarubio, Alejandro; Boege, Margareta; Dávila, Guillermo
2018-01-01
We present a conceptually simple, sensitive, precise, and essentially nonstatistical solution for the analysis of genome variation in haploid organisms. The generation of a Perfect Match Genomic Landscape (PMGL), which computes intergenome identity with single nucleotide resolution, reveals signatures of variation wherever a query genome differs from a reference genome. Such signatures encode the precise location of different types of variants, including single nucleotide variants, deletions, insertions, and amplifications, effectively introducing the concept of a general signature of variation. The precise nature of variants is then resolved through the generation of targeted alignments between specific sets of sequence reads and known regions of the reference genome. Thus, the perfect match logic decouples the identification of the location of variants from the characterization of their nature, providing a unified framework for the detection of genome variation. We assessed the performance of the PMGL strategy via simulation experiments. We determined the variation profiles of natural genomes and of a synthetic chromosome, both in the context of haploid yeast strains. Our approach uncovered variants that have previously escaped detection. Moreover, our strategy is ideally suited for further refining high-quality reference genomes. The source codes for the automated PMGL pipeline have been deposited in a public repository. PMID:29367403
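A toy sketch of the perfect-match logic (not the actual PMGL pipeline, which handles read depth, orientation, and normalization): score each reference position by how many reads contain the reference k-mer starting there, so variants show up as dips to zero. All sequences below are made up:

```python
def perfect_match_landscape(reference, reads, k=5):
    """Toy Perfect Match Genomic Landscape: for each position i in the
    reference, count the reads containing the reference k-mer starting
    at i as an exact substring. Zeros flag signatures of variation."""
    landscape = []
    for i in range(len(reference) - k + 1):
        kmer = reference[i:i + k]
        landscape.append(sum(kmer in read for read in reads))
    return landscape

ref = "ACGTACGTTTACGGA"
reads = ["ACGTACGTT", "CGTTTACGG", "TTTACGGA"]   # reads matching the reference
variant_reads = ["ACGTACCTT", "CGTCTACGG"]       # reads carrying substitutions
```

On the matching read set every position scores above zero; the variant reads leave uncovered positions, localizing the difference without yet characterizing it, which mirrors the decoupling the abstract describes.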
Unifying Temporal and Structural Credit Assignment Problems
NASA Technical Reports Server (NTRS)
Agogino, Adrian K.; Tumer, Kagan
2004-01-01
Single-agent reinforcement learners in time-extended domains and multi-agent systems share a common dilemma known as the credit assignment problem. Multi-agent systems have the structural credit assignment problem of determining the contribution of a particular agent to a common task. In contrast, time-extended single-agent systems have the temporal credit assignment problem of determining the contribution of a particular action to the quality of the full sequence of actions. Traditionally these two problems are considered different and are handled in separate ways. In this article we show how these two forms of the credit assignment problem are equivalent. In this unified framework, a single-agent Markov decision process can be broken down into a single-time-step multi-agent process. Furthermore we show that Monte-Carlo estimation or Q-learning (depending on whether the values of resulting actions in the episode are known at the time of learning) are equivalent to different agent utility functions in a multi-agent system. This equivalence shows how an often neglected issue in multi-agent systems is equivalent to a well-known deficiency in multi-time-step learning and lays the basis for solving time-extended multi-agent problems, where both credit assignment problems are present.
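The equivalence can be illustrated with a toy sketch (not the paper's formal construction): treat each time step of an episode as an "agent" in a one-shot team game whose shared objective is the episode return. Monte-Carlo credit hands every step the full return, while a difference-style utility isolates each step's own contribution; both utility choices are illustrative assumptions here:

```python
def episode_return(rewards):
    """Undiscounted return of an episode: the shared 'team' objective."""
    return sum(rewards)

def monte_carlo_credit(rewards):
    """Monte-Carlo credit: every time step is charged with the full
    return, like every agent receiving the raw global utility."""
    G = episode_return(rewards)
    return [G for _ in rewards]

def difference_credit(rewards, baseline=0.0):
    """Difference-style credit: the return minus a counterfactual return
    with step t's reward replaced by a baseline, isolating step t's
    contribution (a multi-agent 'difference utility' viewpoint)."""
    G = episode_return(rewards)
    return [G - (G - r + baseline) for r in rewards]

rewards = [1.0, 0.0, 3.0]
```

Both schemes rank a whole episode identically, but the difference utility is far less noisy per step, mirroring why utility choice matters in the multi-agent view.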
Control of Distributed Parameter Systems
1990-08-01
...variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a... A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of...
A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.
Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf
2018-01-01
Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.
Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.
Bricq, S; Collet, Ch; Armspach, J P
2008-12-01
In the frame of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images including partial volume effect, bias field correction, and information given by a probabilistic atlas. The proposed method takes into account neighborhood information using a Hidden Markov Chain (HMC) model. Due to the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is included to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors are incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of different tissue classes. This atlas is considered as a complementary sensor, and the proposed method is extended to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images, for which the ground truth is available. Comparison with other often-used techniques demonstrates the accuracy and the robustness of this new Markovian segmentation scheme.
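A minimal sketch of the chain-based smoothing idea: a 1D forward-backward pass computing posterior tissue-class probabilities along a voxel chain. The bias field, partial volume mixtures, and atlas priors of the actual method are omitted, and all numbers are illustrative:

```python
import numpy as np

def forward_backward(obs_lik, trans, prior):
    """Posterior class probabilities along a 1D hidden Markov chain.

    obs_lik : (T, K) likelihood of each voxel's intensity under each class
    trans   : (K, K) class transition matrix (spatial smoothness prior)
    prior   : (K,)  initial class distribution
    """
    T, K = obs_lik.shape
    alpha = np.zeros((T, K)); beta = np.ones((T, K))
    alpha[0] = prior * obs_lik[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, T):                       # scaled forward pass
        alpha[t] = (alpha[t - 1] @ trans) * obs_lik[t]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):              # scaled backward pass
        beta[t] = trans @ (beta[t + 1] * obs_lik[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Three voxels favor class 0; the third voxel's intensity mildly favors class 1.
obs_lik = np.array([[0.9, 0.1], [0.9, 0.1], [0.4, 0.6], [0.9, 0.1]])
trans = np.array([[0.95, 0.05], [0.05, 0.95]])  # sticky transitions = smoothness
post = forward_backward(obs_lik, trans, np.array([0.5, 0.5]))
```

The sticky transition prior pulls the ambiguous voxel toward its neighbors' class, the essence of the neighborhood information the HMC model contributes.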
Kulak, Alex N; Iddon, Peter; Li, Yuting; Armes, Steven P; Cölfen, Helmut; Paris, Oskar; Wilson, Rory M; Meldrum, Fiona C
2007-03-28
Two double-hydrophilic block copolymers, each comprising a nonionic block and an anionic block bearing pendent aromatic sulfonate groups, were used as additives to modify the crystallization of CaCO3. Marked morphological changes in the CaCO3 particles were observed depending on the reaction conditions used. A poly(ethylene oxide)-b-poly(sodium 4-styrenesulfonate) diblock copolymer was particularly versatile in effecting a morphological change in calcite particles, and a continuous structural transition in the product particles from polycrystalline to mesocrystal to single crystal was observed with variation in the calcium concentration. The existence of this structural sequence provides unique insight into the mechanism of polymer-mediated crystallization. We propose that it reflects continuity in the crystallization mechanism itself, spanning the limits from nonoriented aggregation of nanoparticles to classical ion-by-ion growth. The various pathways to polycrystalline, mesocrystal, and single-crystal particles, which had previously been considered to be distinct, therefore all form part of a unifying crystallization framework based on the aggregation of precursor subunits.
From 16-bit to high-accuracy IDCT approximation: fruits of single architecture affiliation
NASA Astrophysics Data System (ADS)
Liu, Lijie; Tran, Trac D.; Topiwala, Pankaj
2007-09-01
In this paper, we demonstrate an effective unified framework for high-accuracy approximation of the irrational-coefficient floating-point IDCT by a single integer-coefficient fixed-point architecture. Our framework is based on a modified version of Loeffler's sparse DCT factorization, and the IDCT architecture is constructed via a cascade of dyadic lifting steps and butterflies. We illustrate that simply varying the accuracy of the approximating parameters yields a large family of standard-compliant IDCTs, from rare 16-bit approximations catering to portable computing to ultra-high-accuracy 32-bit versions that virtually eliminate any drifting effect when paired with the 64-bit floating-point IDCT at the encoder. Drifting performances of the proposed IDCTs along with existing popular IDCT algorithms in H.263+, MPEG-2 and MPEG-4 are also demonstrated.
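A hedged sketch of the building block the abstract refers to: a plane rotation factored into three integer shear (lifting) steps whose coefficients are rounded to dyadic rationals k/2^b, so each step is a multiply and a shift. This is the generic three-lifting identity, not the paper's exact Loeffler-based architecture, and the bit-width is illustrative:

```python
import math

def dyadic(c, bits):
    """Round a real coefficient to a dyadic rational k / 2**bits."""
    return round(c * (1 << bits)), bits

def lift(x, y, coeff):
    """One integer lifting step: x += (k * y) >> b. Exactly invertible
    even after rounding, which is the point of lifting structures."""
    k, b = coeff
    return x + ((k * y) >> b), y

def rotate_fixed(x, y, theta, bits=8):
    """Approximate rotation by theta via three dyadic lifting steps,
    using the identity R(theta) = S(p) * S'(u) * S(p) with
    p = (cos(theta) - 1) / sin(theta) and u = sin(theta)."""
    p = dyadic((math.cos(theta) - 1) / math.sin(theta), bits)
    u = dyadic(math.sin(theta), bits)
    x, y = lift(x, y, p)      # shear 1: update x
    y, x = lift(y, x, u)      # shear 2: update y
    x, y = lift(x, y, p)      # shear 3: update x
    return x, y
```

Raising `bits` tightens the approximation toward the floating-point rotation, which is exactly the accuracy dial the abstract describes for moving from 16-bit to 32-bit IDCTs.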
Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum
ERIC Educational Resources Information Center
Rubenstein, Lisa DaVia; Ridgley, Lisa M.
2017-01-01
A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…
A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.
Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao
2017-06-16
This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability in addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are those classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model to various applications including citation analysis, online social networks, and vehicular networks design, are also discussed in the paper.
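The limiting degree regimes can be illustrated with a toy growth simulation. This is an illustration of the regimes a unified model interpolates between, not the paper's Markov chain model; all parameters are made up:

```python
import random

def grow_network(n, pref=True, seed=1):
    """Toy growth process: each new node attaches to one existing node,
    either uniformly at random (yielding light-tailed, roughly
    exponential degrees) or preferentially by degree (yielding
    heavy-tailed, power-law-like degrees)."""
    rng = random.Random(seed)
    degrees = [1, 1]                      # two connected seed nodes
    for _ in range(n - 2):
        if pref:                          # preferential attachment
            total = sum(degrees)
            r = rng.uniform(0, total)
            acc, target = 0.0, 0
            for i, d in enumerate(degrees):
                acc += d
                if r <= acc:
                    target = i
                    break
        else:                             # uniform attachment
            target = rng.randrange(len(degrees))
        degrees[target] += 1              # one endpoint on an old node,
        degrees.append(1)                 # one on the new node
    return degrees
```

Each step adds one edge, so the degrees always sum to 2(n-1); only the attachment rule, i.e. the transition kernel of the degree process, changes the shape of the distribution.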
Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita
2013-01-01
Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.
Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model
ERIC Educational Resources Information Center
Helie, Sebastien; Sun, Ron
2010-01-01
This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…
CWDPRNP: A tool for cervid prion sequence analysis in program R
Miller, William L.; Walter, W. David
2017-01-01
Chronic wasting disease is a fatal, neurological disease caused by an infectious prion protein, which affects economically and ecologically important members of the family Cervidae. Single nucleotide polymorphisms within the prion protein gene have been linked to differential susceptibility to the disease in many species. Wildlife managers are seeking to determine the frequencies of disease-associated alleles and genotypes and delineate spatial genetic patterns. The CWDPRNP package, implemented in program R, provides a unified framework for analyzing prion protein gene variability and spatial structure.
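CWDPRNP itself is an R package; the kind of summary it computes can be sketched in a few lines of Python. The codon-96 G/S coding below is an illustrative example of a PRNP polymorphism, and the sample genotypes are made up:

```python
from collections import Counter

def allele_frequencies(genotypes):
    """Allele and genotype frequencies at a single PRNP codon.

    genotypes: list of 2-letter strings, one per sampled animal,
    e.g. 'GS' for a heterozygote at codon 96.
    """
    geno_counts = Counter("".join(sorted(g)) for g in genotypes)
    allele_counts = Counter(a for g in genotypes for a in g)
    n_alleles = sum(allele_counts.values())
    return ({g: c / len(genotypes) for g, c in geno_counts.items()},
            {a: c / n_alleles for a, c in allele_counts.items()})

sample = ["GG", "GG", "GS", "GS", "SS", "GG"]   # hypothetical herd sample
geno_f, allele_f = allele_frequencies(sample)
```

Computing these frequencies per sampling unit is the input to the spatial-structure analyses the abstract mentions.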
Constraints on single entity driven inflationary and radiation eras
NASA Astrophysics Data System (ADS)
Bouhmadi-López, Mariam; Chen, Pisin; Liu, Yen-Wei
2012-07-01
We present a model that attempts to fuse the inflationary era and the subsequent radiation dominated era under a unified framework so as to provide a smooth transition between the two. The model is based on a modification of the generalized Chaplygin gas. We constrain the model observationally by mapping the primordial power spectrum of the scalar perturbations to the latest data of WMAP7. We compute as well the spectrum of the primordial gravitational waves as would be measured today.
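Schematically, a radiation-corrected Chaplygin-type equation of state of the kind used in such models solves the continuity equation in closed form; the constants A and B and the sign conventions below are illustrative, not the paper's exact parametrization:

```latex
p \;=\; \frac{\rho}{3} \;-\; \frac{A}{\rho^{\alpha}}
\qquad\Longrightarrow\qquad
\rho(a) \;=\; \Big[\, \tfrac{3A}{4} \;+\; \frac{B}{a^{\,4(1+\alpha)}} \,\Big]^{\frac{1}{1+\alpha}},
\qquad 1+\alpha < 0 .
```

With 1 + α < 0 the second term vanishes as a → 0, so the density is nearly constant at early times (inflation), while at large a it dilutes as a^{-4}, recovering the radiation era and giving the smooth transition the abstract describes.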
A unifying kinetic framework for modeling oxidoreductase-catalyzed reactions.
Chang, Ivan; Baldi, Pierre
2013-05-15
Oxidoreductases are a fundamental class of enzymes responsible for the catalysis of oxidation-reduction reactions, crucial in most bioenergetic metabolic pathways. From their common root in the ancient prebiotic environment, oxidoreductases have evolved into diverse and elaborate protein structures with specific kinetic properties and mechanisms adapted to their individual functional roles and environmental conditions. While accurate kinetic modeling of oxidoreductases is thus important, current models suffer from limitations to the steady-state domain, lack empirical validation or are too specialized to a single system or set of conditions. To address these limitations, we introduce a novel unifying modeling framework for kinetic descriptions of oxidoreductases. The framework is based on a set of seven elementary reactions that (i) form the basis for 69 pairs of enzyme state transitions for encoding various specific microscopic intra-enzyme reaction networks (micro-models), and (ii) lead to various specific macroscopic steady-state kinetic equations (macro-models) via thermodynamic assumptions. Thus, a synergistic bridge between the micro and macro kinetics can be achieved, enabling us to extract unitary rate constants, simulate reaction variance and validate the micro-models using steady-state empirical data. To help facilitate the application of this framework, we make available RedoxMech: a Mathematica™ software package that automates the generation and customization of micro-models. The Mathematica™ source code for RedoxMech, the documentation and the experimental datasets are all available from: http://www.igb.uci.edu/tools/sb/metabolic-modeling. Contact: pfbaldi@ics.uci.edu. Supplementary data are available at Bioinformatics online.
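One classic macro-model of the kind such frameworks derive is the steady-state ping-pong bi-bi rate law, common for two-substrate oxidoreductases in which the first substrate modifies the enzyme before the second reacts. This is a generic textbook form, not a specific output of RedoxMech, and the parameter values below are illustrative:

```python
def ping_pong_rate(a, b, vmax, Ka, Kb):
    """Steady-state ping-pong bi-bi rate law:

        v = Vmax * [A][B] / (Ka*[B] + Kb*[A] + [A][B])

    a, b    : substrate concentrations [A], [B]
    vmax    : maximal velocity
    Ka, Kb  : Michaelis constants for A and B
    """
    return vmax * a * b / (Ka * b + Kb * a + a * b)
```

At saturating [B] the expression collapses to simple Michaelis-Menten kinetics in [A], illustrating how a macro-model's form encodes thermodynamic and steady-state assumptions about the underlying micro-model.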
Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A
2015-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European Primary Care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified and, thus, we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.
Parrott, Dominic J.
2008-01-01
Theory and research on antigay aggression has identified different motives that facilitate aggression based on sexual orientation. However, the individual and situational determinants of antigay aggression associated with these motivations have yet to be organized within a single theoretical framework. This limits researchers’ ability to organize existing knowledge, link that knowledge with related aggression theory, and guide the application of new findings. To address these limitations, this article argues for the use of an existing conceptual framework to guide thinking and generate new research in this area of study. Contemporary theories of antigay aggression, and empirical support for these theories, are reviewed and interpreted within the unifying framework of the general aggression model [Anderson, C.A. & Bushman, B.J. (2002). Human aggression. Annual Review of Psychology, 53, 27–51.]. It is concluded that this conceptual framework will facilitate investigation of individual and situational risk factors that may contribute to antigay aggression and guide development of individual-level intervention. PMID:18355952
ERIC Educational Resources Information Center
Center for Mental Health in Schools at UCLA, 2005
2005-01-01
This report was developed to highlight the current state of affairs and illustrate the value of a unifying framework and integrated infrastructure for the many initiatives, projects, programs, and services schools pursue in addressing barriers to learning and promoting healthy development. Specifically, it highlights how initiatives can be…
Toward a unifying framework for evolutionary processes.
Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora
2015-10-21
Population genetics theory and evolutionary computation have been developing separately for nearly 30 years. Many results have been independently obtained in both fields and many others are unique to their respective fields. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
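The decomposition described in this abstract can be illustrated with a minimal sketch: a single evolutionary loop in which a particular algorithm or model is specified only by the selection and variation components plugged into it. The operator choices below (binary tournament selection, per-bit mutation, a OneMax fitness) are illustrative examples, not operators taken from the paper.

```python
import random

def evolve(pop, fitness, select, vary, generations, rng):
    """Unified evolutionary loop: different EAs and population-genetics
    models differ only in the plugged-in `select` and `vary` components."""
    for _ in range(generations):
        parents = select(pop, fitness, rng)
        pop = [vary(ind, rng) for ind in parents]
    return pop

def tournament(pop, fitness, rng, k=2):
    # Selection component: binary tournaments, population size preserved.
    return [max(rng.sample(pop, k), key=fitness) for _ in pop]

def bitflip(ind, rng, rate=0.05):
    # Variation component: independent per-bit mutation.
    return [(1 - b) if rng.random() < rate else b for b in ind]

# A simple genetic algorithm on OneMax (fitness = number of 1-bits).
rng = random.Random(0)
pop0 = [[0] * 20 for _ in range(30)]
final = evolve(pop0, sum, tournament, bitflip, 50, rng)
```

Swapping `tournament` for fitness-proportional (Wright-Fisher) sampling, with `vary` as a mutation kernel, recasts a population-genetics model in the same loop, which is the point of the decomposition.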
Probabilistic arithmetic automata and their applications.
Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven
2012-01-01
We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
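The first application above (pattern matching statistics on random texts) can be sketched concretely: a deterministic automaton counts pattern occurrences, and chaining it with an i.i.d. text model yields the exact distribution of the occurrence count by dynamic programming over (state, count) pairs. This is a minimal illustration of the PAA idea, not the MoSDi implementation; the brute-force KMP transition below is chosen for brevity.

```python
from collections import defaultdict

def step(pattern, state, ch):
    # Brute-force KMP transition: the new state is the length of the longest
    # suffix of (matched prefix + ch) that is also a prefix of the pattern.
    s = pattern[:state] + ch
    while s and not pattern.startswith(s):
        s = s[1:]
    emit = 1 if s == pattern else 0
    if emit:                      # restart from the longest proper border
        s = s[1:]
        while s and not pattern.startswith(s):
            s = s[1:]
    return len(s), emit

def occurrence_count_distribution(pattern, alphabet, probs, n):
    """Exact distribution of the number of (possibly overlapping) occurrences
    of `pattern` in an i.i.d. random text of length n."""
    dist = defaultdict(float)
    dist[(0, 0)] = 1.0            # (automaton state, occurrences so far)
    for _ in range(n):
        nxt = defaultdict(float)
        for (state, count), p in dist.items():
            for ch, pc in zip(alphabet, probs):
                s2, emit = step(pattern, state, ch)
                nxt[(s2, count + emit)] += p * pc
        dist = nxt
    counts = defaultdict(float)
    for (_, count), p in dist.items():
        counts[count] += p
    return dict(counts)

d = occurrence_count_distribution("ab", "ab", [0.5, 0.5], 2)
# → {0: 0.75, 1: 0.25}: only the text "ab" contains the pattern
```

Replacing the i.i.d. text model with a finite-memory (Markov or HMM) model only enlarges the state space; the same dynamic programming applies.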
NASA Astrophysics Data System (ADS)
McClelland, Jamie R.; Modat, Marc; Arridge, Simon; Grimes, Helen; D'Souza, Derek; Thomas, David; O' Connell, Dylan; Low, Daniel A.; Kaza, Evangelia; Collins, David J.; Leach, Martin O.; Hawkes, David J.
2017-06-01
Surrogate-driven respiratory motion models relate the motion of the internal anatomy to easily acquired respiratory surrogate signals, such as the motion of the skin surface. They are usually built by first using image registration to determine the motion from a number of dynamic images, and then fitting a correspondence model relating the motion to the surrogate signals. In this paper we present a generalized framework that unifies the image registration and correspondence model fitting into a single optimization. This allows the use of ‘partial’ imaging data, such as individual slices, projections, or k-space data, where it would not be possible to determine the motion from an individual frame of data. Motion compensated image reconstruction can also be incorporated using an iterative approach, so that both the motion and a motion-free image can be estimated from the partial image data. The framework has been applied to real 4DCT, Cine CT, multi-slice CT, and multi-slice MR data, as well as simulated datasets from a computer phantom. This includes the use of a super-resolution reconstruction method for the multi-slice MR data. Good results were obtained for all datasets, including quantitative results for the 4DCT and phantom datasets where the ground truth motion was known or could be estimated.
Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram
2015-08-01
In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
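The cumulant features at the heart of this pipeline follow from standard central-moment-to-cumulant identities (κ2 = μ2, κ3 = μ3, κ4 = μ4 − 3μ2², κ5 = μ5 − 10μ3μ2). The sketch below computes them for a signal window; the Letter's exact estimator, windowing, and SVM training are not specified here, so treat this as a generic feature extractor rather than the authors' implementation.

```python
def cumulants_2_to_5(x):
    """Sample cumulants kappa_2..kappa_5 of a 1-D signal window, computed
    from central moments via the standard identities."""
    n = len(x)
    mean = sum(x) / n
    mu = {k: sum((v - mean) ** k for v in x) / n for k in range(2, 6)}
    return {2: mu[2],
            3: mu[3],
            4: mu[4] - 3 * mu[2] ** 2,
            5: mu[5] - 10 * mu[3] * mu[2]}

k = cumulants_2_to_5([1.0, 2.0, 3.0])
# kappa_2 = 2/3, kappa_3 = 0, kappa_4 = -2/3, kappa_5 = 0
```

A feature vector of such cumulants per accelerometer axis would then be fed to an SVM (e.g. scikit-learn's `SVC`) at each level of the decision tree.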
A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine
Ramakrishnan, Sridhar; Wesensten, Nancy J.; Kamimori, Gary H.; Moon, James E.; Balkin, Thomas J.; Reifman, Jaques
2016-01-01
Study Objectives: Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. Methods: We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine) to the performance estimated in the absence of caffeine. We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). Results: The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (6% < error < 27%), yielding greater accuracy for mild and moderate sleep loss conditions than for more severe cases. Overall, accounting for the effects of caffeine resulted in improved predictions (after caffeine consumption) by up to 70%. Conclusions: The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. Citation: Ramakrishnan S, Wesensten NJ, Kamimori GH, Moon JE, Balkin TJ, Reifman J. A unified model of performance for predicting the effects of sleep and caffeine. SLEEP 2016;39(10):1827–1841. PMID:27397562
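The multiplicative structure described in the Methods can be sketched as follows: a one-compartment pharmacokinetic model gives the caffeine concentration over time, and a dose-dependent factor in (0, 1] scales the caffeine-free impairment estimate. The rate constants and slope below are hypothetical placeholders, not the paper's fitted parameters.

```python
import math

def caffeine_concentration(dose_mg, t_h, ka=2.0, ke=0.14):
    """One-compartment PK with first-order absorption (hypothetical
    rate constants in 1/h; requires ka != ke)."""
    return dose_mg * ka / (ka - ke) * (math.exp(-ke * t_h) - math.exp(-ka * t_h))

def impairment_with_caffeine(baseline_impairment, dose_mg, t_h, slope=0.003):
    """Multiplicative caffeine factor g(t) applied to the caffeine-free
    impairment predicted by the UMP (illustrative sketch)."""
    g = 1.0 / (1.0 + slope * caffeine_concentration(dose_mg, t_h))
    return baseline_impairment * g
```

With dose 0 the factor is exactly 1 and the baseline prediction is unchanged, which is the property that lets the caffeine model bolt onto the existing UMP without altering it.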
A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China
NASA Astrophysics Data System (ADS)
Xie, Lei; Zhang, Hong; Li, Hongzhong; Wang, Chao
2015-04-01
The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP) and compact polarimetric (CP) modes. The TerraSAR-X data acquired over the Leizhou Peninsula, South China are used in our experiments. The study site involves four main crops (rice, banana, sugarcane, and eucalyptus). Through exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for data in the FP and coherent HH/VV modes, and about 91% in the CP mode, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart Maximum Likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.
The Unified Behavior Framework for the Simulation of Autonomous Agents
2015-03-01
Since the 1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. A recently developed … The development of autonomy has … room for research by utilizing methods like simulation and modeling that consume less time and fewer monetary resources. A recently developed reactive …
A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects
Slob, Wout
2015-01-01
Background: When chemical health hazards have been identified, probabilistic dose–response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives: We developed a unified framework for probabilistic dose–response assessment. Methods: We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results: Probabilistically derived exposure limits are based on estimating a "target human dose" (HD_MI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HD_MI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions: Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
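The probabilistic derivation of a target human dose can be sketched with Monte Carlo: divide a point of departure by a product of lognormally distributed adjustment factors and read off a confidence interval from the sampled distribution. The geometric means and geometric SDs below are hypothetical, not the distributions fitted in the paper.

```python
import math
import random

def target_dose_interval(pod, factors, n=5000, seed=7):
    """90% Monte Carlo interval for a target human dose: the point of
    departure `pod` divided by a product of lognormal adjustment factors,
    each given as (geometric mean, geometric SD)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        adjustment = 1.0
        for gm, gsd in factors:
            adjustment *= gm * math.exp(rng.gauss(0.0, 1.0) * math.log(gsd))
        samples.append(pod / adjustment)
    samples.sort()
    return samples[int(0.05 * n)], samples[int(0.95 * n)]

# Hypothetical interspecies and intraspecies factors with uncertainty.
lo, hi = target_dose_interval(30.0, [(10.0, 2.0), (3.0, 1.5)])
```

When every geometric SD is 1.0 the factors are deterministic and the interval collapses to the familiar nonprobabilistic reference dose (pod divided by the product of factor values), which is the sense in which the probabilistic approach generalizes the traditional one.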
How economic development and family planning programs combined to reduce Indonesian fertility.
Gertler, P J; Molyneaux, J W
1994-02-01
This paper examines the contributions of family planning programs, economic development, and women's status to Indonesian fertility decline from 1982 to 1987. Methodologically, we unify seemingly conflicting demographic and economic frameworks into a single "structural" proximate-cause model, while controlling statistically for the targeted (nonrandom) placement of family planning program inputs. The results are consistent with both frameworks: 75% of the fertility decline resulted from increased contraceptive use, but was induced primarily through economic development and improved education and economic opportunities for females. Even so, the dramatic impact of the changes in demand-side factors (education and economic development) on contraceptive use was possible only because there already existed a highly responsive contraceptive supply delivery system.
Kim, Sung-Cheol; Wunsch, Benjamin H; Hu, Huan; Smith, Joshua T; Austin, Robert H; Stolovitzky, Gustavo
2017-06-27
Deterministic lateral displacement (DLD) is a technique for size fractionation of particles in continuous flow that has shown great potential for biological applications. Several theoretical models have been proposed, but experimental evidence has demonstrated that a rich class of intermediate migration behavior exists, which is not predicted. We present a unified theoretical framework to infer the path of particles in the whole array on the basis of trajectories in a unit cell. This framework explains many of the unexpected particle trajectories reported and can be used to design arrays for even nanoscale particle fractionation. We performed experiments that verify these predictions and used our model to develop a condenser array that achieves full particle separation with a single fluidic input.
IDEA: Planning at the Core of Autonomous Reactive Agents
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Clancy, Daniel (Technical Monitor)
2002-01-01
Several successful autonomous systems are separated into technologically diverse functional layers operating at different levels of abstraction. This diversity makes them difficult to implement and validate. In this paper, we present IDEA (Intelligent Distributed Execution Architecture), a unified planning and execution framework. In IDEA a layered system can be implemented as separate agents, one per layer, each representing its interactions with the world in a model. At all levels, the model representation primitives and their semantics are the same. Moreover, each agent relies on a single model, plan database, plan runner and on a variety of planners, both reactive and deliberative. The framework allows the specification of agents that operate within a guaranteed reaction time, and supports flexible specification of reactive vs. deliberative agent behavior. Within the IDEA framework we are working to fully duplicate the functionalities of the DS1 Remote Agent and extend it to domains of higher complexity than autonomous spacecraft control.
NASA Astrophysics Data System (ADS)
Huber, Ludwig
2014-09-01
This comment addresses the first component of Fitch's framework: the computational power of single neurons [3]. Although I agree that traditional models of neural computation have vastly underestimated the computational power of single neurons, I am hesitant to follow him completely. The exclusive focus on neurons is likely to underestimate the importance of other cells in the brain. In the last years, two such cell types have received appropriate attention by neuroscientists: interneurons and glia. Interneurons are small, tightly packed cells involved in the control of information processing in learning and memory. Rather than transmitting externally (like motor or sensory neurons), these neurons process information within internal circuits of the brain (therefore also called 'relay neurons'). Some specialized interneuron subtypes temporally regulate the flow of information in a given cortical circuit during relevant behavioral events [4]. In the human brain approx. 100 billion interneurons control information processing and are implicated in disorders such as epilepsy and Parkinson's.
Physics of superheavy dark matter in supergravity
NASA Astrophysics Data System (ADS)
Addazi, Andrea; Marciano, Antonino; Ketov, Sergei V.; Khlopov, Maxim Yu.
New trends in inflationary model building and dark matter production in supergravity are considered. Starobinsky inflation is embedded into 𝒩 = 1 supergravity, avoiding instability problems, when the inflaton belongs to a vector superfield associated with a U(1) gauge symmetry, instead of a chiral superfield. This gauge symmetry can be spontaneously broken by the super-Higgs mechanism resulting in a massive vector supermultiplet including the (real scalar) inflaton field. Both supersymmetry (SUSY) and the R-symmetry can also be spontaneously broken by the Polonyi mechanism at high scales close to the inflationary scale. In this case, Polonyi particles and gravitinos become superheavy, and can be copiously produced during inflation by the Schwinger mechanism sourced by the universe expansion. The Polonyi mass slightly exceeds twice the gravitino mass, so that Polonyi particles are unstable and decay into gravitinos. Considering the mechanisms of superheavy gravitino production, we find that the right amount of cold dark matter composed of gravitinos can be achieved. In our scenario, the parameter space of the inflaton potential is directly related to the dark matter one, providing a new unifying framework of inflation and dark matter genesis. A multi-superfield extension of the supergravity framework with a single (inflaton) superfield can result in a formation of primordial nonlinear structures like mini- and stellar-mass black holes, primordial nongaussianity, and the running spectral index of density fluctuations. This framework can be embedded into the SUSY GUTs inspired by heterotic string compactifications on Calabi-Yau three-folds, thus unifying particle physics with quantum gravity.
A framework for streamlining research workflow in neuroscience and psychology
Kubilius, Jonas
2014-01-01
Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration among researchers. PMID:24478691
A unifying model of the role of the infralimbic cortex in extinction and habits
Taylor, Jane R.; Chandler, L. Judson
2014-01-01
The infralimbic prefrontal cortex (IL) has been shown to be critical for the regulation of flexible behavior, but its precise function remains unclear. This region has been shown to be critical for the acquisition, consolidation, and expression of extinction learning, leading many to hypothesize that IL suppresses behavior as part of a “stop” network. However, this framework is at odds with IL function in habitual behavior in which the IL has been shown to be required for the expression and acquisition of ongoing habitual behavior. Here, we will review the current state of knowledge of IL anatomy and function in behavioral flexibility and provide a testable framework for a single IL mechanism underlying its function in both extinction and habit learning. PMID:25128534
Lappi, Otto; Mole, Callum
2018-06-11
The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks. It is based on recent empirical research, in particular, on driver eye movements, that poses challenges for existing accounts of how we visually steer a course. They first analyze how the ideas of feedback and feedforward processes and internal models are treated in control theoretical steering models within vision science and engineering, which share an underlying architecture but have historically developed in very separate ways. The authors then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience, to better understand the skill and gaze strategies involved. They then propose a conceptual model that (a) gives a unified account to the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of these (a-c) has been considered before (also in the context of driving), integrating them into a single framework and the authors' multiple waypoint identification hypothesis within that framework are novel. The proposed hypothesis is relevant to all forms of visually guided locomotion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
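A minimal stand-in for the steering side of this account is classic pure pursuit: commit to a look-ahead waypoint and command the curvature of the circular arc passing through it. This sketch is only the simplest single-waypoint feedback controller; the authors' model adds gaze coordination, path planning, and paired forward/inverse models on top of something like it.

```python
import math

def pure_pursuit_curvature(x, y, heading, wx, wy):
    """Curvature of the arc from the vehicle pose (x, y, heading) to a
    look-ahead waypoint (wx, wy); assumes the waypoint is not at the
    vehicle's own position."""
    dx, dy = wx - x, wy - y
    # Rotate the waypoint offset into the vehicle frame.
    forward = math.cos(heading) * dx + math.sin(heading) * dy
    lateral = -math.sin(heading) * dx + math.cos(heading) * dy
    # Pure pursuit: curvature = 2 * lateral offset / squared distance.
    return 2.0 * lateral / (forward * forward + lateral * lateral)
```

A waypoint dead ahead yields zero curvature (drive straight); the farther the waypoint sits to one side, the sharper the commanded turn, which is why where gaze selects the waypoint directly shapes the steering output.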
Evolution of spatially structured host-parasite interactions.
Lion, S; Gandon, S
2015-01-01
Spatial structure has dramatic effects on the demography and the evolution of species. A large variety of theoretical models have attempted to understand how local dispersal may shape the coevolution of interacting species such as host-parasite interactions. The lack of a unifying framework is a serious impediment for anyone willing to understand current theory. Here, we review previous theoretical studies in the light of a single epidemiological model that allows us to explore the effects of both host and parasite migration rates on the evolution and coevolution of various life-history traits. We discuss the impact of local dispersal on parasite virulence, various host defence strategies and local adaptation. Our analysis shows that evolutionary and coevolutionary outcomes crucially depend on the details of the host-parasite life cycle and on which life-history trait is involved in the interaction. We also discuss experimental studies that support the effects of spatial structure on the evolution of host-parasite interactions. This review highlights major similarities between some theoretical results, but it also reveals an important gap between evolutionary and coevolutionary models. We discuss possible ways to bridge this gap within a more unified framework that would reconcile spatial epidemiology, evolution and coevolution. © 2014 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2014 European Society For Evolutionary Biology.
Chao, Anne; Chiu, Chun-Huo; Colwell, Robert K; Magnago, Luiz Fernando S; Chazdon, Robin L; Gotelli, Nicholas J
2017-11-01
Estimating the species, phylogenetic, and functional diversity of a community is challenging because rare species are often undetected, even with intensive sampling. The Good-Turing frequency formula, originally developed for cryptography, estimates in an ecological context the true frequencies of rare species in a single assemblage based on an incomplete sample of individuals. Until now, this formula has never been used to estimate undetected species, phylogenetic, and functional diversity. Here, we first generalize the Good-Turing formula to incomplete sampling of two assemblages. The original formula and its two-assemblage generalization provide a novel and unified approach to notation, terminology, and estimation of undetected biological diversity. For species richness, the Good-Turing framework offers an intuitive way to derive the non-parametric estimators of the undetected species richness in a single assemblage, and of the undetected species shared between two assemblages. For phylogenetic diversity, the unified approach leads to an estimator of the undetected Faith's phylogenetic diversity (PD, the total length of undetected branches of a phylogenetic tree connecting all species), as well as a new estimator of undetected PD shared between two phylogenetic trees. For functional diversity based on species traits, the unified approach yields a new estimator of undetected Walker et al.'s functional attribute diversity (FAD, the total species-pairwise functional distance) in a single assemblage, as well as a new estimator of undetected FAD shared between two assemblages. Although some of the resulting estimators have been previously published (but derived with traditional mathematical inequalities), all taxonomic, phylogenetic, and functional diversity estimators are now derived under the same framework. 
All the derived estimators are theoretically lower bounds of the corresponding undetected diversities; our approach reveals the sufficient conditions under which the estimators are nearly unbiased, thus offering new insights. Simulation results are reported to numerically verify the performance of the derived estimators. We illustrate all estimators and assess their sampling uncertainty with an empirical dataset for Brazilian rain forest trees. These estimators should be widely applicable to many current problems in ecology, such as the effects of climate change on spatial and temporal beta diversity and the contribution of trait diversity to ecosystem multi-functionality. © 2017 by the Ecological Society of America.
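For species richness, the Good-Turing line of reasoning leads to the well-known Chao1 lower bound on undetected species in a single assemblage, driven by the counts of singletons (f1) and doubletons (f2). The sketch below shows only this simplest estimator; the phylogenetic and functional analogues in the paper follow the same framework but are not reproduced here.

```python
from collections import Counter

def chao1_undetected(abundances):
    """Chao1 lower bound on the number of undetected species, from the
    observed per-species abundances of a single assemblage."""
    freq = Counter(abundances)       # freq[k] = number of species seen k times
    f1, f2 = freq[1], freq[2]
    if f2 > 0:
        return f1 * f1 / (2 * f2)
    return f1 * (f1 - 1) / 2         # bias-corrected form when f2 == 0

sample = [5, 3, 1, 1, 2, 1, 8, 2]    # 8 observed species
# f1 = 3 singletons, f2 = 2 doubletons → 3^2 / (2*2) = 2.25 undetected species
```

The estimate is a lower bound, and as the abstract notes it is nearly unbiased only under certain sampling conditions; with no doubletons the bias-corrected branch avoids division by zero.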
Buckingham, C D; Adams, A
2000-10-01
This is the second of two linked papers exploring decision making in nursing. The first paper, 'Classifying clinical decision making: a unifying approach', investigated difficulties with applying a range of decision-making theories to nursing practice. This is due to the diversity of terminology and theoretical concepts used, which militate against nurses being able to compare the outcomes of decisions analysed within different frameworks. It is therefore problematic for nurses to assess how good their decisions are, and where improvements can be made. However, despite the range of nomenclature, it was argued that there are underlying similarities between all theories of decision processes and that these should be exposed through integration within a single explanatory framework. A proposed solution was to use a general model of psychological classification to clarify and compare terms, concepts and processes identified across the different theories. The unifying framework of classification was described and this paper operationalizes it to demonstrate how different approaches to clinical decision making can be re-interpreted as classification behaviour. Particular attention is focused on classification in nursing, and on re-evaluating heuristic reasoning, which has been particularly prone to theoretical and terminological confusion. Demonstrating similarities in how different disciplines make decisions should promote improved multidisciplinary collaboration and a weakening of clinical elitism, thereby enhancing organizational effectiveness in health care and nurses' professional status. This is particularly important as nurses' roles continue to expand to embrace elements of managerial, medical and therapeutic work. Analysing nurses' decisions as classification behaviour will also enhance clinical effectiveness, and assist in making nurses' expertise more visible. In addition, the classification framework explodes the myth that intuition, traditionally associated with nurses' decision making, is less rational and scientific than other approaches.
Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.
Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank
2017-10-01
Introduction: The frequency of disasters is increasing around the world with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem: While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting with a view to better understand the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping literature review (peer-reviewed and "grey literature") was undertaken to identify current evaluation frameworks and typologies used in the disaster setting; and (2) a structure was developed that included the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected. The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.
NASA Astrophysics Data System (ADS)
Tomar, Ruchi; Wadehra, Neha; Budhiraja, Vaishali; Prakash, Bhanu; Chakraverty, S.
2018-01-01
To characterize the physical properties of thin films without ambiguity and to design interfaces with new functionalities, it is essential to have detailed knowledge of the physical properties and an appropriate estimate of the band profile of perovskite oxide substrates. We have developed and demonstrated a chemical-free unified framework to realize single-terminated surfaces of KTaO3, (LaAlO3)0.3(Sr2AlTaO6)0.7 and SrTiO3 (001)-oriented single crystals. The electronic band line-up of these single-crystal substrates, using a combination of optical spectroscopy and Kelvin Probe Force Microscopy, has been constructed. A polar-polar interface of KTaO3 and LaBO3 (B: transition-metal ion) before and after the possible surface/electronic reconstruction has also been schematically presented.
Multivariate Lipschitz optimization: Survey and computational comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, P.; Gourdin, E.; Jaumard, B.
1994-12-31
Many methods have been proposed to minimize a multivariate Lipschitz function on a box. They pertain to three approaches: (i) reduction to the univariate case by projection (Pijavskii) or by using a space-filling curve (Strongin); (ii) construction and refinement of a single upper bounding function (Pijavskii, Mladineo, Mayne and Polak, Jaumard, Hermann and Ribault, Wood...); (iii) branch and bound with local upper bounding functions (Galperin, Pintér, Meewella and Mayne, the present authors). A survey is made, stressing similarities of algorithms, expressed when possible within a unified framework. Moreover, an extensive computational comparison is reported.
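The bounding-function idea behind approaches (i) and (ii) can be illustrated with a minimal univariate Piyavskii-style minimizer, which refines a saw-tooth envelope built from the Lipschitz constant (for minimization the envelope bounds from below). This is a sketch only; the test function, interval, constant, and tolerances below are illustrative, not taken from the survey:

```python
def piyavskii_minimize(f, a, b, lipschitz, n_iter=200, tol=1e-6):
    """Minimize a Lipschitz function f on [a, b] by repeatedly evaluating f
    at the lowest vertex of the saw-tooth lower bounding envelope."""
    pts = [(a, f(a)), (b, f(b))]          # sorted (x, f(x)) samples
    for _ in range(n_iter):
        best = None
        # envelope vertex between consecutive samples: intersection of cones
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            x = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * lipschitz)
            lb = 0.5 * (f1 + f2) - 0.5 * lipschitz * (x2 - x1)
            if best is None or lb < best[1]:
                best = (x, lb)
        x_new, lower = best
        pts.append((x_new, f(x_new)))
        pts.sort(key=lambda p: p[0])
        # stop when incumbent value meets the global lower bound
        if min(fx for _, fx in pts) - lower < tol:
            break
    return min(pts, key=lambda p: p[1])   # (x_best, f_best)
```

Branch-and-bound variants (approach iii) differ mainly in keeping such bounds local to subboxes rather than maintaining one global envelope.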
Theories and treatment of drug dependency: a neurochemical perspective.
Pakdaman, Sheila; Wilcox, Richard E; Miller, Joseph D
2014-01-01
Treatment of chemical dependence ("addiction") requires an understanding of its effects on the brain. To guide research in the area of chemical dependence, several foundational theories have been developed. These include the incentive salience, receptor down-regulation, opponent process, and psychomotor stimulant theories. These have been important both in summarizing and in guiding investigations. However, the extant theories do not provide a single unified framework, nor have they yielded all of the guidance necessary for effective chemical dependence treatment. The present paper summarizes and then integrates these theories, and suggests some implications of this integration for treatment.
The Scaling Group of the 1-D Inviscid Euler Equations
NASA Astrophysics Data System (ADS)
Schmidt, Emma; Ramsey, Scott; Boyd, Zachary; Baty, Roy
2017-11-01
The one-dimensional (1-D) compressible Euler equations in non-ideal media support scale-invariant solutions under a variety of initial conditions. Famous scale-invariant solutions include the Noh, Sedov, Guderley, and collapsing cavity hydrodynamic test problems. We unify many classical scale-invariant solutions under a single scaling group analysis. The scaling symmetry group generator provides a framework for determining all scale-invariant solutions admitted by the 1-D Euler equations for arbitrary geometry, initial conditions, and equation of state. We approach the Euler equations from a geometric standpoint, and conduct scaling analyses for a broad class of materials.
A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine.
Ramakrishnan, Sridhar; Wesensten, Nancy J; Kamimori, Gary H; Moon, James E; Balkin, Thomas J; Reifman, Jaques
2016-10-01
Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine) to the performance estimated in the absence of caffeine. We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (6% < error < 27%), yielding greater accuracy for mild and moderate sleep loss conditions than for more severe cases. Overall, accounting for the effects of caffeine resulted in improved predictions (after caffeine consumption) by up to 70%. The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. © 2016 Associated Professional Sleep Societies, LLC.
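The multiplicative structure described above (a dose-dependent caffeine factor scaling the caffeine-free performance estimate) can be sketched as follows. The pharmacokinetic form, rate constants, and the shape of the factor are assumptions chosen for illustration, not the published UMP parameterization:

```python
import math

def caffeine_concentration(dose_mg, t_h, k_a=2.0, k_e=math.log(2) / 5.0):
    """One-compartment PK with first-order absorption (hypothetical
    constants: absorption rate k_a per hour, ~5 h elimination half-life)."""
    return dose_mg * (math.exp(-k_e * t_h) - math.exp(-k_a * t_h))

def caffeine_factor(dose_mg, t_h, slope=0.002):
    """Dose-dependent factor in (0, 1]; smaller means less impairment."""
    return 1.0 / (1.0 + slope * caffeine_concentration(dose_mg, t_h))

def predicted_impairment(baseline_impairment, dose_mg, t_h):
    """Multiplicative correction: the caffeine factor scales the
    impairment predicted in the absence of caffeine."""
    return baseline_impairment * caffeine_factor(dose_mg, t_h)
```

The appeal of the multiplicative form is that a zero dose reduces the factor to exactly 1, recovering the caffeine-free model unchanged.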
Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin
2016-01-01
Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework titled the Health Services Workplace Environmental Resilience Model (HSWERM) is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously-published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse, and then impacting on personal resilience and workplace outcomes, and its use has the potential to increase staff retention and quality of patient care.
Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J
2017-08-04
There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into the unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.
Chiu, Weihsueh A; Slob, Wout
2015-12-01
When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.
Non-equilibrium reactive flux: A unified framework for slow and fast reaction kinetics.
Bose, Amartya; Makri, Nancy
2017-10-21
The flux formulation of reaction rate theory is recast in terms of the expectation value of the reactive flux with an initial condition that corresponds to a non-equilibrium, factorized reactant density. In the common case of slow reactive processes, the non-equilibrium expression reaches the plateau regime only slightly slower than the equilibrium flux form. When the reactants are described by a single quantum state, as in the case of electron transfer reactions, the factorized reactant density describes the true initial condition of the reactive process. In such cases, the time integral of the non-equilibrium flux expression yields the reactant population as a function of time, allowing characterization of the dynamics in cases where there is no clear separation of time scales and thus a plateau regime cannot be identified. The non-equilibrium flux offers a unified approach to the kinetics of slow and fast chemical reactions and is ideally suited to mixed quantum-classical methods.
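The statement that the time integral of the non-equilibrium flux yields the reactant population can be checked numerically in a toy setting. The exponential flux below is an assumed synthetic form standing in for a computed reactive flux, not a quantum calculation:

```python
import numpy as np

# Synthetic flux F(t) = k * exp(-k t), for which the running time integral
# recovers the reactant population N(t) = exp(-k t) via N(t) = 1 - ∫₀ᵗ F dt'.
k = 0.5                                   # hypothetical rate constant
t = np.linspace(0.0, 10.0, 2001)
flux = k * np.exp(-k * t)                 # sampled reactive flux

# cumulative trapezoidal integral of the flux
integral = np.concatenate(
    ([0.0], np.cumsum(0.5 * (flux[1:] + flux[:-1]) * np.diff(t)))
)
population = 1.0 - integral               # reactant population vs. time
```

No plateau identification is needed here: the population curve itself characterizes the kinetics, which is the practical advantage claimed for the non-equilibrium formulation when time scales are not well separated.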
A unified framework for approximation in inverse problems for distributed parameter systems
NASA Technical Reports Server (NTRS)
Banks, H. T.; Ito, K.
1988-01-01
A theoretical framework is presented that can be used to treat approximation techniques for very general classes of parameter estimation problems involving distributed systems that are either first or second order in time. Using the approach developed, one can obtain both convergence and stability (continuous dependence of parameter estimates with respect to the observations) under very weak regularity and compactness assumptions on the set of admissible parameters. This unified theory can be used for many problems found in the recent literature and in many cases offers significant improvements to existing results.
Identification and validation of loss of function variants in clinical contexts.
Lescai, Francesco; Marasco, Elena; Bacchelli, Chiara; Stanier, Philip; Mantovani, Vilma; Beales, Philip
2014-01-01
The choice of an appropriate variant calling pipeline for exome sequencing data is becoming increasingly more important in translational medicine projects and clinical contexts. Within GOSgene, which facilitates genetic analysis as part of a joint effort of the University College London and the Great Ormond Street Hospital, we aimed to optimize a variant calling pipeline suitable for our clinical context. We implemented the GATK/Queue framework and evaluated the performance of its two callers: the classical UnifiedGenotyper and the new variant discovery tool HaplotypeCaller. We performed an experimental validation of the loss-of-function (LoF) variants called by the two methods using Sequenom technology. UnifiedGenotyper showed a total validation rate of 97.6% for LoF single-nucleotide polymorphisms (SNPs) and 92.0% for insertions or deletions (INDELs), whereas HaplotypeCaller was 91.7% for SNPs and 55.9% for INDELs. We confirm that GATK/Queue is a reliable pipeline in translational medicine and clinical context. We conclude that in our working environment, UnifiedGenotyper is the caller of choice, being an accurate method, with a high validation rate of error-prone calls like LoF variants. We finally highlight the importance of experimental validation, especially for INDELs, as part of a standard pipeline in clinical environments.
In Search of a Unified Model of Language Contact
ERIC Educational Resources Information Center
Winford, Donald
2013-01-01
Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…
Geometric rectification of camera-captured document images.
Liang, Jian; DeMenthon, Daniel; Doermann, David
2008-04-01
Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.
Image-guided regularization level set evolution for MR image segmentation and bias field correction.
Wang, Lingfeng; Pan, Chunhong
2014-01-01
Magnetic resonance (MR) image segmentation is a crucial step in surgical and treatment planning. In this paper, we propose a level-set-based segmentation method for MR images with the intensity inhomogeneity problem. To tackle the initialization sensitivity problem, we propose a new image-guided regularization to restrict the level set function. Maximum a posteriori inference is adopted to unify segmentation and bias field correction within a single framework. Under this framework, both the contour prior and the bias field prior are fully used. As a result, image intensity inhomogeneity can be handled well. Extensive experiments are provided to evaluate the proposed method, showing significant improvements in both segmentation and bias field correction accuracies as compared with other state-of-the-art approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
Group sparse multiview patch alignment framework with view consistency for image classification.
Gui, Jie; Tao, Dacheng; Sun, Zhenan; Luo, Yong; You, Xinge; Tang, Yuan Yan
2014-07-01
No single feature can satisfactorily characterize the semantic concepts of an image. Multiview learning aims to unify different kinds of features to produce a consensual and efficient representation. This paper redefines part optimization in the patch alignment framework (PAF) and develops a group sparse multiview patch alignment framework (GSM-PAF). The new part optimization considers not only the complementary properties of different views, but also view consistency. In particular, view consistency models the correlations between all possible combinations of any two kinds of views. In contrast to conventional dimensionality reduction algorithms that perform feature extraction and feature selection independently, GSM-PAF enjoys joint feature extraction and feature selection by exploiting the l(2,1)-norm on the projection matrix to achieve row sparsity, which leads to the simultaneous selection of relevant features and learning of the transformation, and thus makes the algorithm more discriminative. Experiments on two real-world image data sets demonstrate the effectiveness of GSM-PAF for image classification.
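The l(2,1)-norm mentioned above sums the Euclidean norms of the rows of the projection matrix; penalizing it drives entire rows to zero, which deselects the corresponding features. A minimal sketch of the norm and the induced feature selection (names and the threshold are illustrative):

```python
import numpy as np

def l21_norm(W):
    """l(2,1)-norm: sum over rows of the l2 norm of each row of W."""
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def selected_features(W, tol=1e-8):
    """Indices of features whose projection row survives the sparsity
    threshold; zeroed rows correspond to discarded features."""
    row_norms = np.sqrt((W ** 2).sum(axis=1))
    return np.flatnonzero(row_norms > tol)
```

Because the penalty couples all entries of a row, minimizing it zeroes rows jointly rather than individual entries, which is what makes feature selection and transformation learning simultaneous.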
Connecting single cell to collective cell behavior in a unified theoretical framework
NASA Astrophysics Data System (ADS)
George, Mishel; Bullo, Francesco; Campàs, Otger
Collective cell behavior is an essential part of tissue and organ morphogenesis during embryonic development, as well as of various disease processes, such as cancer. In contrast to many in vitro studies of collective cell migration, most cases of in vivo collective cell migration involve rather small groups of cells, with large sheets of migrating cells being less common. The vast majority of theoretical descriptions of collective cell behavior focus on large numbers of cells, but fail to accurately capture the dynamics of small groups of cells. Here we introduce a low-dimensional theoretical description that successfully captures single cell migration, cell collisions, collective dynamics in small groups of cells, and force propagation during sheet expansion, all within a common theoretical framework. Our description is derived from first principles and also includes key phenomenological aspects of cell migration that control the dynamics of traction forces. Among other results, we explain the counter-intuitive observations that pairs of cells repel each other upon collision while they behave in a coordinated manner within larger clusters.
Characterizing the size and shape of sea ice floes
Gherardi, Marco; Lagomarsino, Marco Cosentino
2015-01-01
Monitoring drift ice in the Arctic and Antarctic regions directly and by remote sensing is important for the study of climate, but a unified modeling framework is lacking. Hence, interpretation of the data, as well as the decision of what to measure, represent a challenge for different fields of science. To address this point, we analyzed, using statistical physics tools, satellite images of sea ice from four different locations in both the northern and southern hemispheres, and measured the size and the elongation of ice floes (floating pieces of ice). We find that (i) floe size follows a distribution that can be characterized with good approximation by a single length scale , which we discuss in the framework of stochastic fragmentation models, and (ii) the deviation of their shape from circularity is reproduced with remarkable precision by a geometric model of coalescence by freezing, based on random Voronoi tessellations, with a single free parameter expressing the shape disorder. Although the physical interpretations remain open, this advocates the parameters and as two independent indicators of the environment in the polar regions, which are easily accessible by remote sensing. PMID:26014797
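If floe sizes are governed by a single characteristic length scale, that scale can be estimated directly from measured sizes. The sketch below assumes an exponential size distribution purely for illustration (one simple outcome of stochastic fragmentation models; the paper's actual distributional form and symbols are not reproduced here):

```python
import numpy as np

# Synthetic "measured" floe sizes drawn from an assumed exponential
# distribution with one characteristic scale (hypothetical value, in m).
rng = np.random.default_rng(0)
true_scale = 120.0
floe_sizes = rng.exponential(true_scale, size=20_000)

# For an exponential law, the maximum-likelihood estimate of the single
# length scale is just the sample mean of the observed sizes.
estimated_scale = floe_sizes.mean()
```

In practice the sizes would come from segmented satellite imagery rather than a random-number generator, but the estimation step is the same.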
Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole
2018-03-01
The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.
NASA Astrophysics Data System (ADS)
Abdi, Daniel S.; Giraldo, Francis X.
2016-09-01
A unified approach for the numerical solution of the 3D hyperbolic Euler equations using high-order methods, namely continuous Galerkin (CG) and discontinuous Galerkin (DG) methods, is presented. First, we examine how classical CG that uses a global storage scheme can be constructed within the DG framework using constraint imposition techniques commonly used in the finite element literature. Then, we implement and test a simplified version in the Non-hydrostatic Unified Model of the Atmosphere (NUMA) for the case of explicit time integration and a diagonal mass matrix. Constructing CG within the DG framework allows CG to benefit from the desirable properties of DG, such as easier hp-refinement and better stability. Moreover, this representation allows for regional mixing of CG and DG depending on the flow regime in an area. The different flavors of CG and DG in the unified implementation are then tested for accuracy and performance using a suite of benchmark problems representative of cloud-resolving scale, meso-scale and global-scale atmospheric dynamics. The value of our unified approach is that we are able to show how to carry both CG and DG methods within the same code and also offer a simple recipe for modifying an existing CG code to DG and vice versa.
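One simple way to impose CG continuity on DG storage is a gather-scatter ("direct stiffness summation"): each element keeps its own copy of interface nodes, and duplicated values are combined so the field becomes single-valued. The 1D layout and averaging rule below are an illustrative sketch, not NUMA's actual implementation:

```python
import numpy as np

n_elem, p = 4, 2                       # elements and polynomial order
nodes_per_elem = p + 1
# global CG index of each local DG node; nodes shared by neighboring
# elements map to the same global index
gidx = np.array([e * p + i for e in range(n_elem)
                 for i in range(nodes_per_elem)])

def dss_average(dg_field):
    """Average duplicated interface values, then scatter the resulting
    continuous (CG) field back into DG element-wise storage."""
    n_cg = gidx.max() + 1
    summed = np.zeros(n_cg)
    counts = np.zeros(n_cg)
    np.add.at(summed, gidx, dg_field)  # gather: accumulate shared nodes
    np.add.at(counts, gidx, 1.0)
    return (summed / counts)[gidx]     # scatter back to DG layout
```

Keeping both methods in one code then amounts to choosing whether to apply this gather-scatter (CG) or skip it and evaluate numerical fluxes at the duplicated interface nodes (DG).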
NASA Astrophysics Data System (ADS)
Papalexiou, Simon Michael
2018-05-01
Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
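The parent-Gaussian strategy can be sketched end to end: simulate a correlated Gaussian process, map each value through the standard normal CDF, then through the inverse CDF of the target marginal. The AR(1) parent, the exponential target, and all parameter values below are illustrative assumptions, not the paper's parameterization:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)
n, rho = 100_000, 0.7                  # series length, lag-1 parent correlation

# parent Gaussian AR(1) process with unit stationary variance
z = np.empty(n)
z[0] = rng.standard_normal()
eps = rng.standard_normal(n) * sqrt(1.0 - rho ** 2)
for i in range(1, n):
    z[i] = rho * z[i - 1] + eps[i]

# marginal back-transformation: u = Phi(z), then the target inverse CDF
u = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
u = np.clip(u, 1e-15, 1.0 - 1e-15)     # guard the logarithm
x = -2.0 * np.log(1.0 - u)             # inverse CDF of Exp(scale = 2)
```

The transformed series x has the exponential marginal exactly while inheriting a (somewhat attenuated) version of the parent's correlation structure; the correlation transformation functions described above are what let one pick the parent correlation so the target correlation comes out right.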
U.S. History Framework for the 2010 National Assessment of Educational Progress
ERIC Educational Resources Information Center
National Assessment Governing Board, 2009
2009-01-01
This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…
Applying Laban's Movement Framework in Elementary Physical Education
ERIC Educational Resources Information Center
Langton, Terence W.
2007-01-01
This article recommends raising the bar in elementary physical education by using Laban's movement framework to develop curriculum content in the areas of games, gymnastics, and dance (with physical fitness concepts blended in) in order to help students achieve the NASPE content standards. The movement framework can permeate and unify an…
Covariant effective action for a Galilean invariant quantum Hall system
NASA Astrophysics Data System (ADS)
Geracie, Michael; Prabhu, Kartik; Roberts, Matthew M.
2016-09-01
We construct effective field theories for gapped quantum Hall systems coupled to background geometries with local Galilean invariance, i.e. Bargmann spacetimes. Along with an electromagnetic field, these backgrounds include the effects of curved Galilean spacetimes, including torsion and a gravitational field, allowing us to study charge, energy, stress and mass currents within a unified framework. A shift symmetry specific to single-constituent theories constrains the effective action to couple to an effective background gauge field and spin connection that is solved for by a self-consistent equation, providing a manifestly covariant extension of Hoyos and Son's improvement terms to arbitrary order in m.
A Practitioner's Perspective on Taxonomy, Ontology and Findability
NASA Technical Reports Server (NTRS)
Berndt, Sarah
2011-01-01
This slide presentation reviews the presenter's perspective on developing a taxonomy for JSC to capitalize on the accomplishments of yesterday, while maintaining the flexibility needed for the evolving information of today. A clear vision and scope for the semantic system is integral to its success. The vision for the JSC Taxonomy is to connect information stovepipes to present a unified view for information and knowledge across the Center, across organizations, and across decades. Semantic search at JSC means seamless integration of disparate information sets into a single interface. Ever increasing use, interest, and organizational participation mark successful integration and provide the framework for future application.
Taxonomy, Ontology and Semantics at Johnson Space Center
NASA Technical Reports Server (NTRS)
Berndt, Sarah Ann
2011-01-01
At NASA Johnson Space Center (JSC), the Chief Knowledge Officer has been developing the JSC Taxonomy to capitalize on the accomplishments of yesterday while maintaining the flexibility needed for the evolving information environment of today. A clear vision and scope for the semantic system is integral to its success. The vision for the JSC Taxonomy is to connect information stovepipes to present a unified view for information and knowledge across the Center, across organizations, and across decades. Semantic search at JSC means seamless integration of disparate information sets into a single interface. Ever increasing use, interest, and organizational participation mark successful integration and provide the framework for future application.
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
An Optimal Order Nonnested Mixed Multigrid Method for Generalized Stokes Problems
NASA Technical Reports Server (NTRS)
Deng, Qingping
1996-01-01
A multigrid algorithm is developed and analyzed for generalized Stokes problems discretized by various nonnested mixed finite elements within a unified framework. It is abstractly proved by an element-independent analysis that the multigrid algorithm converges with an optimal order if there exists a 'good' prolongation operator. A technique to construct a 'good' prolongation operator for nonnested multilevel finite element spaces is proposed. Its basic idea is to introduce a sequence of auxiliary nested multilevel finite element spaces and define a prolongation operator as a composite operator of two single grid level operators. This makes not only the construction of a prolongation operator much easier (the final explicit forms of such prolongation operators are fairly simple), but the verification of the approximate properties for prolongation operators is also simplified. Finally, as an application, the framework and technique is applied to seven typical nonnested mixed finite elements.
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
Sheldon Glashow, the Electroweak Theory, and the Grand Unified Theory
Glashow shared the 1979 Nobel Prize for physics with Steven Weinberg and Abdus Salam for unifying two of the fundamental forces of particle physics. The electroweak theory "provides a framework for understanding how the early universe evolved and how our universe came into being," says Lawrence R. Sulak, chairman of the Boston University physics department.
"UNICERT," or: Towards the Development of a Unified Language Certificate for German Universities.
ERIC Educational Resources Information Center
Voss, Bernd
The standardization of second language proficiency levels for university students in Germany is discussed. Problems with the current system, in which each university has developed its own program of study and proficiency certification, are examined and a framework for development of a unified language certificate for all universities is outlined.…
Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making
Williams, B.K.; Nichols, J.D.; Conroy, M.J.
2002-01-01
This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. KEY FEATURES * Integrates population modeling, parameter estimation and * decision-theoretic approaches to management in a single, cohesive framework * Provides authoritative, state-of-the-art descriptions of quantitative * approaches to modeling, estimation and decision-making * Emphasizes the role of mathematical modeling in the conduct of science * and management * Utilizes a unifying biological context, consistent mathematical notation, * and numerous biological examples
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
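The unification claimed above can be made concrete in a few lines. The following is a minimal sketch (our own parameterization and limit handling, not code from the paper) of the two-parameter Sharma-Mittal entropy, which recovers Rényi entropy as r → 1, Tsallis as r → q, Quadratic at q = r = 2, and Shannon as q, r → 1:

```python
import math

def sharma_mittal(p, q, r, eps=1e-9):
    """Two-parameter Sharma-Mittal entropy of a distribution p (natural log)."""
    s = sum(pi ** q for pi in p if pi > 0)             # sum of q-th powers
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)  # Shannon entropy, for limits
    if abs(q - 1) < eps and abs(r - 1) < eps:
        return h                                        # Shannon limit (q, r -> 1)
    if abs(r - 1) < eps:
        return math.log(s) / (1 - q)                    # Rényi entropy of order q
    if abs(q - 1) < eps:
        return (math.exp((1 - r) * h) - 1) / (1 - r)    # q -> 1 ("Gaussian") limit
    return (s ** ((1 - r) / (1 - q)) - 1) / (1 - r)     # general Sharma-Mittal case

p = [0.25] * 4  # uniform distribution over four states
print(sharma_mittal(p, 2, 2))  # 0.75 (= 1 - sum of squared probabilities, Quadratic)
```

Setting r = q gives Tsallis entropy, so a single function covers all the families named in the abstract; for the uniform distribution, the Shannon and Rényi cases both evaluate to ln 4.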
29 CFR 779.218 - Methods to accomplish “unified operation.”
Code of Federal Regulations, 2010 CFR
2010-07-01
..., join together to perform some or all of their activities as a unified business or business system. They may accomplish such unification through agreements, franchises, grants, leases, or other arrangements... others so that they constitute a single business or unified business system. Whether in any particular...
A unified framework for heat and mass transport at the atomic scale
NASA Astrophysics Data System (ADS)
Ponga, Mauricio; Sun, Dingyi
2018-04-01
We present a unified framework to simulate heat and mass transport in systems of particles. The proposed framework is based on kinematic mean field theory and uses a phenomenological master equation to compute effective transport rates between particles without the need to evaluate operators. We exploit this advantage and apply the model to simulate transport phenomena at the nanoscale. We demonstrate that, when calibrated to experimentally-measured transport coefficients, the model can accurately predict transient and steady state temperature and concentration profiles even in scenarios where the length of the device is comparable to the mean free path of the carriers. Through several example applications, we demonstrate the validity of our model for all classes of materials, including ones that, until now, would have been outside the domain of computational feasibility.
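The flavor of such a rate-based transport model can be illustrated with a toy one-dimensional chain in which temperatures relax through pairwise exchange rates, with no operators evaluated. This is only a schematic finite-difference analogue under assumed units and nearest-neighbour rates, not the authors' kinematic mean-field formulation:

```python
def relax(n=10, k=0.5, dt=0.01, steps=20000, t_hot=400.0, t_cold=300.0):
    """Explicit Euler relaxation of dT_i/dt = sum_j k_ij (T_j - T_i) on a chain
    with uniform nearest-neighbour rates k and fixed-temperature end particles."""
    temps = [t_hot] + [t_cold] * (n - 1)  # hot reservoir at one end
    for _ in range(steps):
        nxt = temps[:]
        for i in range(1, n - 1):  # interior particles exchange with neighbours
            nxt[i] = temps[i] + dt * k * ((temps[i - 1] - temps[i])
                                          + (temps[i + 1] - temps[i]))
        temps = nxt
    return temps

profile = relax()
```

At steady state the profile is linear between the fixed end temperatures, the discrete analogue of Fourier conduction; concentration transport follows the same master-equation template with a different rate constant.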
A unified theoretical framework for mapping models for the multi-state Hamiltonian.
Liu, Jian
2016-11-28
We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
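For reference, the classical Meyer-Miller mapping of an F-state Hamiltonian with matrix elements H_nm, which frameworks of this kind recover as a special case, reads (with zero-point-energy parameter γ, conventionally γ = 1/2):

```latex
H_{\mathrm{MM}}(\mathbf{x},\mathbf{p})
  \;=\; \sum_{n,m=1}^{F} \tfrac{1}{2}\left( x_n x_m + p_n p_m
        - 2\gamma\,\delta_{nm} \right) H_{nm}
```

Each electronic state n is represented by a classical harmonic oscillator with coordinates (x_n, p_n), turning the discrete multi-state problem into continuous Cartesian phase-space dynamics.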
ERIC Educational Resources Information Center
Partnership for 21st Century Skills, 2009
2009-01-01
To help practitioners integrate skills into the teaching of core academic subjects, the Partnership for 21st Century Skills has developed a unified, collective vision for learning known as the Framework for 21st Century Learning. This Framework describes the skills, knowledge and expertise students must master to succeed in work and life; it is a…
Toward a Unified Validation Framework in Mixed Methods Research
ERIC Educational Resources Information Center
Dellinger, Amy B.; Leech, Nancy L.
2007-01-01
The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…
[Arabian food pyramid: unified framework for nutritional health messages].
Shokr, Adel M
2008-01-01
There are several ways to present nutritional health messages, particularly pyramid-based indices, but they have many deficiencies, such as lack of agreement on a unified or clear methodology for food grouping and disregard of nutritional group inter-relation and integration. This causes confusion for health educators and target individuals. This paper presents an Arabian food pyramid that aims to unify the bases of nutritional health messages, bringing together the function, contents, source and nutritional group servings and indicating the inter-relation and integration of nutritional groups. This provides comprehensive, integrated, simple and flexible health messages.
A unified framework for the evaluation of surrogate endpoints in mental-health clinical trials.
Molenberghs, Geert; Burzykowski, Tomasz; Alonso, Ariel; Assam, Pryseley; Tilahun, Abel; Buyse, Marc
2010-06-01
For a number of reasons, surrogate endpoints are considered instead of the so-called true endpoint in clinical studies, especially when such endpoints can be measured earlier and/or with less burden for patient and experimenter. Surrogate endpoints may occur more frequently than their standard counterparts. For these reasons, it is not surprising that the use of surrogate endpoints in clinical practice is increasing. Building on the seminal work of Prentice (1) and Freedman et al. (2), Buyse et al. (3) framed the evaluation exercise within a meta-analytic setting, in an effort to overcome difficulties that necessarily surround evaluation efforts based on a single trial. In this article, we review the meta-analytic approach for continuous outcomes, discuss extensions to non-normal and longitudinal settings, as well as proposals to unify the somewhat disparate collection of validation measures currently on the market. Implications for design and for predicting the effect of treatment in a new trial, based on the surrogate, are discussed. A case study in schizophrenia is analysed.
A unified genetic association test robust to latent population structure for a count phenotype.
Song, Minsun
2018-06-04
Confounding caused by latent population structure in genome-wide association studies has been a big concern despite the success of genome-wide association studies at identifying genetic variants associated with complex diseases. In particular, because of the growing interest in association mapping using count phenotype data, it would be interesting to develop a testing framework for genetic associations that is immune to population structure when phenotype data consist of count measurements. Here, I propose a solution for testing associations between single nucleotide polymorphisms and a count phenotype in the presence of an arbitrary population structure. I consider a classical range of models for count phenotype data. Under these models, a unified test for genetic associations that protects against confounding was derived. An algorithm was developed to efficiently estimate the parameters that are required to fit the proposed model. I illustrate the proposed approach using simulation studies and an empirical study. Both simulated and real-data examples suggest that the proposed method successfully corrects population structure. Copyright © 2018 John Wiley & Sons, Ltd.
Collusion-resistant multimedia fingerprinting: a unified framework
NASA Astrophysics Data System (ADS)
Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray
2004-06-01
Digital fingerprints are unique labels inserted in different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint from each colluder, thus collusion poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
A development framework for semantically interoperable health information systems.
Lopez, Diego M; Blobel, Bernd G M E
2009-02-01
Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.
Rosenfeld, Daniel L; Burrow, Anthony L
2017-05-01
By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.
Franz, A; Triesch, J
2010-12-01
The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research a unified model of the development of these abilities is still missing. Here we make an attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops, the role of the width of the occluders, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories. Copyright © 2010 Elsevier Inc. All rights reserved.
A Unified Theoretical Framework for Cognitive Sequencing.
Savalia, Tejas; Shukla, Anuj; Bapi, Raju S
2016-01-01
The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops (basal ganglia-frontal cortex and hippocampus-frontal cortex) mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks.
A Unified Theoretical Framework for Cognitive Sequencing
Savalia, Tejas; Shukla, Anuj; Bapi, Raju S.
2016-01-01
The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops (basal ganglia-frontal cortex and hippocampus-frontal cortex) mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks. PMID:27917146
The Pursuit of a "Better" Explanation as an Organizing Framework for Science Teaching and Learning
ERIC Educational Resources Information Center
Papadouris, Nicos; Vokos, Stamatis; Constantinou, Constantinos P.
2018-01-01
This article seeks to make the case for the pursuit of a "better" explanation being a productive organizing framework for science teaching and learning. Underlying this position is the idea that this framework allows promoting, in a unified manner, facility with the scientific practice of constructing explanations, appreciation of its…
A Unified Fisher's Ratio Learning Method for Spatial Filter Optimization.
Li, Xinyang; Guan, Cuntai; Zhang, Haihong; Ang, Kai Keng
To detect the mental task of interest, spatial filtering has been widely used to enhance the spatial resolution of electroencephalography (EEG). However, the effectiveness of spatial filtering is undermined due to the significant nonstationarity of EEG. Based on regularization, most of the conventional stationary spatial filter design methods address the nonstationarity at the cost of the interclass discrimination. Moreover, spatial filter optimization is inconsistent with feature extraction when EEG covariance matrices could not be jointly diagonalized due to the regularization. In this paper, we propose a novel framework for a spatial filter design. With Fisher's ratio in feature space directly used as the objective function, the spatial filter optimization is unified with feature extraction. Given its ratio form, the selection of the regularization parameter could be avoided. We evaluate the proposed method on a binary motor imagery data set of 16 subjects, who performed the calibration and test sessions on different days. The experimental results show that the proposed method yields improvement in classification performance for both single broadband and filter bank settings compared with conventional nonunified methods. We also provide a systematic attempt to compare different objective functions in modeling data nonstationarity with simulation studies.
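To make the objective concrete, here is a toy sketch (our own construction, not the authors' algorithm): the Fisher ratio of a log-variance feature is evaluated directly for candidate spatial filters on synthetic two-channel, two-class trials, and the filter maximizing it is picked by grid search over the filter angle.

```python
import math
import random

def log_var_feature(w, trial):
    """Log-variance of the spatially filtered signal, a standard band-power feature."""
    proj = [w[0] * a + w[1] * b for a, b in trial]
    m = sum(proj) / len(proj)
    return math.log(sum((p - m) ** 2 for p in proj) / len(proj))

def fisher_ratio(w, class_a, class_b):
    """Between-class separation over within-class scatter of the feature."""
    fa = [log_var_feature(w, t) for t in class_a]
    fb = [log_var_feature(w, t) for t in class_b]
    ma, mb = sum(fa) / len(fa), sum(fb) / len(fb)
    va = sum((f - ma) ** 2 for f in fa) / len(fa)
    vb = sum((f - mb) ** 2 for f in fb) / len(fb)
    return (ma - mb) ** 2 / (va + vb)

rng = random.Random(0)
def make_trials(sx, sy, n_trials=30, n_samples=100):
    """Two-channel trials with class-dependent variance along each channel."""
    return [[(rng.gauss(0, sx), rng.gauss(0, sy)) for _ in range(n_samples)]
            for _ in range(n_trials)]

class_a = make_trials(3.0, 1.0)
class_b = make_trials(1.0, 3.0)

angles = [k * math.pi / 180.0 for k in range(180)]
best_w = max(((math.cos(a), math.sin(a)) for a in angles),
             key=lambda w: fisher_ratio(w, class_a, class_b))
```

Because the objective is a ratio of feature statistics, no regularization parameter enters; in the paper this same idea is optimized over high-dimensional EEG filters rather than by a one-dimensional grid search.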
Kondo blockade due to quantum interference in single-molecule junctions
Mitchell, Andrew K.; Pedersen, Kim G. L.; Hedegård, Per; Paaske, Jens
2017-01-01
Molecular electronics offers unique scientific and technological possibilities, resulting from both the nanometre scale of the devices and their reproducible chemical complexity. Two fundamental yet different effects, with no classical analogue, have been demonstrated experimentally in single-molecule junctions: quantum interference due to competing electron transport pathways, and the Kondo effect due to entanglement from strong electronic interactions. Here we unify these phenomena, showing that transport through a spin-degenerate molecule can be either enhanced or blocked by Kondo correlations, depending on molecular structure, contacting geometry and applied gate voltages. An exact framework is developed, in terms of which the quantum interference properties of interacting molecular junctions can be systematically studied and understood. We prove that an exact Kondo-mediated conductance node results from destructive interference in exchange-cotunneling. Nonstandard temperature dependences and gate-tunable conductance peaks/nodes are demonstrated for prototypical molecular junctions, illustrating the intricate interplay of quantum effects beyond the single-orbital paradigm. PMID:28492236
Space-Time Processing for Tactical Mobile Ad Hoc Networks
2008-08-01
vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet receptions (MPRs) are exploited. We have introduced the first unified modelling framework for the computation of fundamental capacity-delay tradeoff limits of these modalities in wireless networks, extending the multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT.
Unified formalism for higher order non-autonomous dynamical systems
NASA Astrophysics Data System (ADS)
Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso
2012-03-01
This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.
Unification of small and large time scales for biological evolution: deviations from power law.
Chowdhury, Debashish; Stauffer, Dietrich; Kunwar, Ambarish
2003-02-14
We develop a unified model that describes both "micro" and "macro" evolutions within a single theoretical framework. The ecosystem is described as a dynamic network; the population dynamics at each node of this network describes the "microevolution" over ecological time scales (i.e., birth, ageing, and natural death of individual organisms), while the appearance of new nodes, the slow changes of the links, and the disappearance of existing nodes accounts for the "macroevolution" over geological time scales (i.e., the origination, evolution, and extinction of species). In contrast to several earlier claims in the literature, we observe strong deviations from power law in the regime of long lifetimes.
Algorithm Building and Learning Programming Languages Using a New Educational Paradigm
NASA Astrophysics Data System (ADS)
Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel
2011-08-01
This research paper presents a new concept of using a single tool to associate the syntax of various programming languages, algorithms, and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms and implement them in various programming languages. The tool provides an innovative, unified graphical user interface for the development of multimedia objects, educational games, and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-and-drop methods for image objects.
Covariant effective action for a Galilean invariant quantum Hall system
Geracie, Michael; Prabhu, Kartik; Roberts, Matthew M.
2016-09-16
Here, we construct effective field theories for gapped quantum Hall systems coupled to background geometries with local Galilean invariance, i.e. Bargmann spacetimes. Along with an electromagnetic field, these backgrounds include the effects of curved Galilean spacetimes, including torsion and a gravitational field, allowing us to study charge, energy, stress and mass currents within a unified framework. A shift symmetry specific to single constituent theories constrains the effective action to couple to an effective background gauge field and spin connection that is solved for by a self-consistent equation, providing a manifestly covariant extension of Hoyos and Son’s improvement terms to arbitrary order in m.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lue Xing; Sun Kun; Wang Pan
In the framework of Bell-polynomial manipulations, three single-field bilinearizable equations are investigated: the (1+1)-dimensional shallow water wave model, the Boiti-Leon-Manna-Pempinelli model, and the (2+1)-dimensional Sawada-Kotera model. Based on the concept of scale invariance, a direct and unifying Bell-polynomial scheme is employed to obtain the Baecklund transformations and Lax pairs associated with these three soliton equations. Note that the Bell-polynomial expressions and Bell-polynomial-typed Baecklund transformations for these three soliton equations can be, respectively, cast into bilinear equations and bilinear Baecklund transformations with symbolic computation. Consequently, it is also shown that the Bell-polynomial-typed Baecklund transformations can be linearized into the corresponding Lax pairs.
Sparse Bayesian Learning for Nonstationary Data Sources
NASA Astrophysics Data System (ADS)
Fujimaki, Ryohei; Yairi, Takehisa; Machida, Kazuo
This paper proposes an online Sparse Bayesian Learning (SBL) algorithm for modeling nonstationary data sources. Although most learning algorithms implicitly assume that a data source does not change over time (stationary), real-world sources usually do, owing to such factors as dynamically changing environments, device degradation, and sudden failures (nonstationary). The proposed algorithm reduces to stationary online SBL when its time decay parameters are set to zero, and can thus be interpreted as a single unified framework for online SBL with both stationary and nonstationary data sources. Tests on four types of benchmark problems and on actual stock price data have shown it to perform well.
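The time-decay idea can be illustrated with a minimal sketch, assuming a plain recursive Bayesian linear-regression update with a covariance-inflation forgetting step (names and parameter values are illustrative; the paper's actual SBL update additionally re-estimates hyperparameters and prunes basis functions):

```python
import numpy as np

def online_bayes_update(mu, P, x, y, noise_var=0.1, decay=0.05):
    """One recursive Bayesian linear-regression update with forgetting.

    decay > 0 inflates the prior covariance each step, so old data are
    gradually forgotten (nonstationary source); decay = 0 recovers the
    stationary online update, mirroring the paper's unified view.
    """
    P = P / (1.0 - decay)              # forgetting: inflate covariance
    Px = P @ x
    k = Px / (noise_var + x @ Px)      # gain vector
    mu = mu + k * (y - x @ mu)         # posterior mean
    P = P - np.outer(k, Px)            # posterior covariance
    return mu, P

rng = np.random.default_rng(0)
mu, P = np.zeros(2), np.eye(2)
true_w = np.array([1.0, -2.0])
for _ in range(500):
    x = rng.normal(size=2)
    y = x @ true_w + 0.1 * rng.normal()
    mu, P = online_bayes_update(mu, P, x, y)
print(np.round(mu, 1))   # close to the true weights [ 1. -2.]
```

With `decay=0.05` the estimator tracks a drifting `true_w` as well; with `decay=0` it behaves like a standard stationary recursive estimator.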
Collaborated Architecture Framework for Composition of UML 2.0 in the Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed that combines ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling is applied to the development of Enterprise Resource Planning (ERP) software. Because ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is used to transform the MDA into application-module components efficiently and accurately. Through the use of the CAF, the needs of all stakeholders involved in the overall Rational Unified Process (RUP) were met, and high satisfaction with the functionality of the ERP software at PT. Iglas (Persero) Gresik was achieved.
Family Systems Theory: A Unifying Framework for Codependence.
ERIC Educational Resources Information Center
Prest, Layne A.; Protinsky, Howard
1993-01-01
Considers addictions and construct of codependence. Offers critical review and synthesis of codependency literature, along with an intergenerational family systems framework for conceptualizing the relationship of the dysfunctional family to the construct of codependence. Presents theoretical basis for systemic clinical work and research in this…
[Research on tumor information grid framework].
Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing
2013-10-01
In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, so that heterogeneous resources become consistent and interoperable in both semantics and syntax. This article describes the tumor grid framework, in which each service is described in the Web Service Description Language (WSDL) and XML Schema Definition (XSD), and clients use serialized documents to operate on the distributed resources. Service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All grid resources are registered in the index and released as Web Services based on the Web Services Resource Framework (WSRF). Using this system, a multi-center, large-sample, networked tumor disease resource sharing framework can be built to improve the level of development of medical scientific research institutions and patients' quality of life.
Kwok, T; Smith, K A
2000-09-01
The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
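The decaying self-coupling that drives the chaotic-to-convergent transition in CSA can be sketched for a single neuron (parameter values are illustrative and not taken from the paper; the full models couple many such neurons through the problem's energy function):

```python
import numpy as np

# Transiently chaotic neuron in the spirit of Chen and Aihara's CSA:
# the self-coupling z(t) decays geometrically, so the dynamics pass
# from chaotic search to convergent, Hopfield-like behaviour.
def simulate(steps=300, k=0.9, alpha=0.015, I0=0.65,
             z0=0.08, beta=0.01, eps=1.0 / 250.0):
    y, z = 0.283, z0
    xs = []
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-y / eps))      # sigmoid output in (0, 1)
        y = k * y + alpha * (0.5 - x) - z * (x - I0)
        z = (1.0 - beta) * z                    # decaying self-coupling
        xs.append(x)
    return np.array(xs)

xs = simulate()
print(len(xs), xs[-1])  # trajectory length and final output
```

Plotting `xs` against the step index shows the characteristic bifurcation-like transition as `z` shrinks; the framework in the paper treats the different CNN models as special cases of this kind of update rule.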
Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar
2017-02-01
A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
Taguchi, Kaori; Fukusaki, Eiichiro; Bamba, Takeshi
2014-10-03
Chromatography techniques usually use a single state in the mobile phase, such as liquid, gas, or supercritical fluid. Chromatographers manage one of these techniques for their purpose but are sometimes required to use multiple methods, or even worse, multiple techniques when the target compounds have a wide range of chemical properties. To overcome this challenge, we developed a single method covering a diverse compound range by means of a "unified" chromatography which completely bridges supercritical fluid chromatography and liquid chromatography. In our method, the phase state was continuously changed in the following order: supercritical, subcritical, and liquid. Moreover, the mobile-phase gradient started at almost 100% CO2 and was completely replaced by 100% methanol at the end. As a result, this approach further extended the polarity range of the mobile phase in a single run, and successfully enabled the simultaneous analysis of fat- and water-soluble vitamins with a wide logP range of -2.11 to 10.12. Furthermore, the 17 vitamins were separated in only 4 min. Our results indicated that the use of dense CO2 and its replacement by methanol are practical approaches in unified chromatography covering diverse compounds. Additionally, this is the first report to apply the novel approach to unified chromatography, and it opens another door for diverse compound analysis in a single chromatographic technique with a single injection, single column, and single system. Copyright © 2014. Published by Elsevier B.V.
Groundwater modelling in decision support: reflections on a unified conceptual framework
NASA Astrophysics Data System (ADS)
Doherty, John; Simmons, Craig T.
2013-11-01
Groundwater models are commonly used as basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support with a direct focus on matters regarding model simplicity and complexity.
Pattern-oriented modeling of agent-based complex systems: Lessons from ecology
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-01-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology
NASA Astrophysics Data System (ADS)
Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.
2005-11-01
Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.
A Unified Model of Geostrophic Adjustment and Frontogenesis
NASA Astrophysics Data System (ADS)
Taylor, John; Shakespeare, Callum
2013-11-01
Fronts, or regions with strong horizontal density gradients, are ubiquitous and dynamically important features of the ocean and atmosphere. In the ocean, fronts are associated with enhanced air-sea fluxes, turbulence, and biological productivity, while atmospheric fronts are associated with some of the most extreme weather events. Here, we describe a new mathematical framework for describing the formation of fronts, or frontogenesis. This framework unifies two classical problems in geophysical fluid dynamics, geostrophic adjustment and strain-driven frontogenesis, and provides a number of important extensions beyond previous efforts. The model solutions closely match numerical simulations during the early stages of frontogenesis, and provide a means to describe the development of turbulence at mature fronts.
Integrating diverse databases into an unified analysis framework: a Galaxy approach
Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton
2011-01-01
Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past a relatively small number of central repositories served genomic data, an increasing number of distinct specialized data repositories and resources have been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983
Putting the School Interoperability Framework to the Test
ERIC Educational Resources Information Center
Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans
2004-01-01
The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
Code of Federal Regulations, 2010 CFR
2010-07-01
... otherwise, they are so performed as to constitute a unified business system organized for a common business... their activities in such manner as to be for all intents and purposes a single business system except..., in part or in whole, by the individual companies comprising the unified business system. The various...
ERIC Educational Resources Information Center
O'Keeffe, Shawn Edward
2013-01-01
The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…
2017-05-25
Operations, and Unified Land Operations) and the US Army’s leader development model identifies how the education, training, and experience of field-grade... officers have failed in their incorporation of the framework because they lack the education, training, and experience for the use of the framework... education, training, and experience of field-grade officers at the division level have influenced their use of the operational framework. The cause for
A unifying framework for systems modeling, control systems design, and system operation
NASA Technical Reports Server (NTRS)
Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.
2005-01-01
Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.
Toward a unified approach to dose-response modeling in ecotoxicology.
Ritz, Christian
2010-01-01
This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
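The log-logistic model mentioned above has a standard four-parameter form; the sketch below uses the common `drc`-style parameterization (an assumption for illustration, not necessarily this study's exact notation):

```python
import numpy as np

def log_logistic_4p(x, b, c, d, e):
    """Four-parameter log-logistic dose-response curve:
    c = lower limit, d = upper limit, e = ED50, b = slope parameter.
    """
    return c + (d - c) / (1.0 + np.exp(b * (np.log(x) - np.log(e))))

# Decreasing response with dose (b > 0), ED50 at dose 1.0.
dose = np.array([0.1, 1.0, 10.0, 100.0])
resp = log_logistic_4p(dose, b=2.0, c=0.0, d=1.0, e=1.0)
print(np.round(resp, 3))  # halfway between c and d exactly at x = e
```

The Weibull and other suites mentioned in the study differ mainly in the link between log-dose and response; a unified framework expresses them all as variations of such a four-parameter family.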
Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization
NASA Technical Reports Server (NTRS)
Ocampo, Cesar; Senent, Juan S.; Williams, Jacob
2010-01-01
The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, are discussed.
A Unified Framework for Monetary Theory and Policy Analysis.
ERIC Educational Resources Information Center
Lagos, Ricardo; Wright, Randall
2005-01-01
Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…
Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research
ERIC Educational Resources Information Center
Fan, Xitao; Sun, Shaojing
2014-01-01
In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…
Shuttle unified navigation filter, revision 1
NASA Technical Reports Server (NTRS)
Muller, E. S., Jr.
1973-01-01
Equations designed to meet the navigation requirements of the separate shuttle mission phases are presented in a series of reports entitled Space Shuttle GN and C Equation Document. The development of these equations is based on performance studies carried out for each particular mission phase. Although navigation equations have been documented separately for each mission phase, a single unified navigation filter design is embodied in these separate designs. The purpose of this document is to present the shuttle navigation equations in the form in which they would most likely be coded: as the single unified navigation filter used in each mission phase. This document will then serve as a single general reference for the navigation equations, replacing each of the individual mission-phase navigation documents (which may still be used as a description of a particular navigation phase).
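A unified navigation filter of this kind is at heart a recursive measurement-incorporation loop; the sketch below is a generic textbook Kalman predict/update cycle, not the actual Shuttle equations, with a toy constant-velocity example:

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One generic Kalman predict/update cycle (textbook form)."""
    # Predict state and covariance forward one step
    x = F @ x
    P = F @ P @ F.T + Q
    # Incorporate the measurement z
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy example: 1-D position/velocity state, position-only measurements.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.25]])
x, P = np.zeros(2), np.eye(2)
for z in [1.0, 2.1, 2.9, 4.0]:           # roughly unit velocity
    x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
print(np.round(x, 1))  # position near 4, velocity near 1
```

The "unified" aspect in the document corresponds to keeping this one filter structure fixed while swapping in the measurement models of each mission phase.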
Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A
2018-01-01
The choice of reference for the electroencephalogram (EEG) is a long-standing unsolved issue, resulting in inconsistent usage and endless debate. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity and (b) determination of the reference as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes a prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated within this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data of 89 subjects who participated in the Cuban Human Brain Mapping Project. Artificial EEGs generated with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. The simulations also reveal that realistic volume conductor models improve the performance of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to those from individual lead fields. Finally, it is shown that selecting the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated on the real 89 resting-state EEGs, rREST consistently yields the lowest GCV.
This study provides a novel perspective to the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance.
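The simplest estimator placed inside this unified framework, the average reference, is a fixed linear operator on the channel-by-sample data matrix; a minimal sketch follows (REST and the regularized variants additionally require a head-model lead field and are not shown here):

```python
import numpy as np

def average_reference(V):
    """Re-reference an EEG data matrix (channels x samples) to the
    average reference: subtract the channel mean at every sample.
    Equivalent to left-multiplying by T = I - (1/n) * ones."""
    n = V.shape[0]
    T = np.eye(n) - np.ones((n, n)) / n   # AR re-referencing matrix
    return T @ V

# Tiny 3-channel, 2-sample example.
V = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
Var = average_reference(V)
print(Var.sum(axis=0))  # each column sums to 0 after re-referencing
```

In the paper's terms, this operator corresponds to the MAP solution under the biophysically non-informative prior; REST replaces `T` with an operator built from the EEG generative model.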
RT-18: Value of Flexibility. Phase 1
2010-09-25
an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory... framework that is mathematically consistent, domain independent and applicable under varying information levels. This report presents our advances in... During this period, we also explored the development of an analytical framework based on sound mathematical constructs.
Framework Design of Unified Cross-Authentication Based on the Fourth Platform Integrated Payment
NASA Astrophysics Data System (ADS)
Yong, Xu; Yujin, He
The essay proposes a unified authentication scheme based on the fourth-platform integrated payment. The research aims at improving the compatibility of authentication in electronic business and providing a reference for the establishment of a credit system, by seeking a way to carry out standard unified authentication on an integrated payment platform. The essay introduces the concept of the fourth integrated payment platform and finally puts forward its overall structure and components. The main focus of the essay is the design of the credit system of the fourth integrated payment platform and of the PKI/CA structure.
The Aeroacoustics of Turbulent Flows
NASA Technical Reports Server (NTRS)
Goldstein, M. E.
2008-01-01
Aerodynamic noise prediction has been an important and challenging research area since James Lighthill first introduced his Acoustic Analogy Approach over fifty years ago. This talk attempts to provide a unified framework for the subsequent theoretical developments in this field. It assumes that there is no single approach that is optimal in all situations and uses the framework as a basis for discussing the strengths and weaknesses of the various approaches to this topic. The emphasis here will be on the important problem of predicting the noise from high-speed air jets. Specific results will be presented for round jets in the 0.5 to 1.4 Mach number range and compared with experimental data taken on the Glenn SHAR rig. It is demonstrated that nonparallel mean flow effects play an important role in predicting the noise at the supersonic Mach numbers. The results explain the failure of previous attempts based on the parallel-flow Lilley model (which has served as the foundation for most jet noise analyses during the past two decades).
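Lighthill's acoustic analogy, the starting point for the developments surveyed here, can be written in its standard textbook form (notation generic, not specific to this talk): the exact equations of fluid motion are rearranged into an inhomogeneous wave equation for the density perturbation ρ′,

```latex
\frac{\partial^2 \rho'}{\partial t^2} - c_0^2 \nabla^2 \rho'
  = \frac{\partial^2 T_{ij}}{\partial x_i \partial x_j},
\qquad
T_{ij} = \rho u_i u_j + \bigl(p' - c_0^2 \rho'\bigr)\delta_{ij} - \tau_{ij}
```

where c₀ is the ambient sound speed and T_ij is the Lighthill stress tensor. The various approaches the talk compares differ chiefly in how the quadrupole source term on the right is modeled or rearranged.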
Multichannel blind iterative image restoration.
Sroubek, Filip; Flusser, Jan
2003-01-01
Blind image deconvolution is required in many applications of microscopy imaging, remote sensing, and astronomical imaging. Unfortunately in a single-channel framework, serious conceptual and numerical problems are often encountered. Very recently, an eigenvector-based method (EVAM) was proposed for a multichannel framework which determines perfectly convolution masks in a noise-free environment if channel disparity, called co-primeness, is satisfied. We propose a novel iterative algorithm based on recent anisotropic denoising techniques of total variation and a Mumford-Shah functional with the EVAM restoration condition included. A linearization scheme of half-quadratic regularization together with a cell-centered finite difference discretization scheme is used in the algorithm and provides a unified approach to the solution of total variation or Mumford-Shah. The algorithm performs well even on very noisy images and does not require an exact estimation of mask orders. We demonstrate capabilities of the algorithm on synthetic data. Finally, the algorithm is applied to defocused images taken with a digital camera and to data from astronomical ground-based observations of the Sun.
A unified framework for building high performance DVEs
NASA Astrophysics Data System (ADS)
Lei, Kaibin; Ma, Zhixia; Xiong, Hua
2011-10-01
A unified framework for integrating PC cluster based parallel rendering with distributed virtual environments (DVEs) is presented in this paper. While various scene graphs have been proposed in DVEs, it is difficult to enable collaboration of different scene graphs. This paper proposes a technique for non-distributed scene graphs with the capability of object and event distribution. With the increase of graphics data, DVEs require more powerful rendering ability. But general scene graphs are inefficient in parallel rendering. The paper also proposes a technique to connect a DVE and a PC cluster based parallel rendering environment. A distributed multi-player video game is developed to show the interaction of different scene graphs and the parallel rendering performance on a large tiled display wall.
NASA Astrophysics Data System (ADS)
Sewell, Stephen
This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphic Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphic processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphic processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
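The particle-sorting idea can be illustrated independently of CUDA; the sketch below reorders particles by grid-cell index with NumPy (a CPU stand-in for the GPU kernel, with illustrative names), so that particles touching the same cell become contiguous in memory before grid interpolation:

```python
import numpy as np

def sort_particles_by_cell(pos, cell_size, ncells):
    """Return particle positions reordered by their grid-cell index.

    On a GPU this reordering improves memory coalescing during the
    grid-interpolation phase of PIC; here it is shown with NumPy only
    to illustrate the data movement.
    """
    cell = np.minimum((pos / cell_size).astype(int), ncells - 1)
    order = np.argsort(cell, kind="stable")   # stable keeps in-cell order
    return pos[order], cell[order]

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, size=8)           # 8 particles on [0, 1)
sorted_pos, sorted_cell = sort_particles_by_cell(pos, cell_size=0.25, ncells=4)
print(sorted_cell)  # non-decreasing cell indices
```

After sorting, all particles contributing to one cell's charge deposition lie in one contiguous slice, which is the property the thesis exploits when mapping the interpolation phase onto CUDA thread blocks.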
Unified Behavior Framework for Discrete Event Simulation Systems
2015-03-26
[Thesis front matter: acknowledgments and an acronym list, including SPA (Sense-Plan-Act), SSL (System Service Layer), TCA (Task Control Architecture), TRP (Teleo-Reactive Program), UAV (Unmanned Aerial Vehicle), and UBF (Unified Behavior Framework).] ...a teleo-reactive architecture [11]. Teleo-Reactive Programs (TRPs) are composed of a list of rules, where each rule has a condition and an action. When the
Evolutionary game theory meets social science: is there a unifying rule for human cooperation?
Rosas, Alejandro
2010-05-21
Evolutionary game theory has shown that human cooperation thrives in different types of social interactions with a prisoner's dilemma (PD) structure. Models treat the cooperative strategies within the different frameworks as discrete entities, and sometimes even as contenders. Whereas strong reciprocity was acclaimed as superior to classic reciprocity for its ability to defeat defectors in public goods games, recent experiments and simulations show that costly punishment fails to promote cooperation in indirect reciprocity (IR) and direct reciprocity (DR) games, where classic reciprocity succeeds. My aim is to show that cooperative strategies across frameworks are capable of a unified treatment, for they are governed by a common underlying rule or norm. An analysis of the reputation and action rules that govern some representative cooperative strategies, both in models and in economic experiments, confirms that the different frameworks share a conditional action rule and several reputation rules. The common conditional rule contains an option between costly punishment and withholding benefits that provides alternative enforcement methods against defectors. Depending on the framework, individuals can switch to the appropriate strategy and method of enforcement. The stability of human cooperation looks more promising if one mechanism controls successful strategies across frameworks. Published by Elsevier Ltd.
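The conditional rule's two enforcement options can be made concrete with a toy repeated game in which a conditional cooperator meets an unconditional defector (all payoff values and the two-strategy setup are illustrative, not taken from the models the abstract surveys):

```python
def payoffs(enforcement, rounds=10, b=3, c=1, fine=3, cost=1):
    """Toy repeated game: a conditional cooperator meets a defector.

    The shared conditional rule reacts to defection either by 'withhold'
    (stop conferring the benefit b) or by 'punish' (keep cooperating at
    cost c but pay `cost` to impose `fine` on the defector).
    Returns (cooperator payoff, defector payoff).
    """
    coop, defe = 0, 0
    partner_defected = False
    for _ in range(rounds):
        if not partner_defected:        # first impression: cooperate
            coop -= c
            defe += b
        elif enforcement == "punish":
            coop -= c + cost            # cooperate and punish
            defe += b - fine
        # 'withhold': confer no benefit, incur no cost
        partner_defected = True         # the defector never cooperates
    return coop, defe

coop_w, defe_w = payoffs("withhold")
coop_p, defe_p = payoffs("punish")
```

Both options deny defectors the stream of benefits; they differ in what enforcement costs the cooperator, which is exactly the trade-off the common conditional rule leaves open to the framework at hand.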
Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics
NASA Astrophysics Data System (ADS)
Doronin, Alexander; Meglinski, Igor
2012-09-01
In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of Compute Unified Device Architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing simulated diffuse reflectance and fluence rate distributions for a semi-infinite scattering medium with known analytical results, with results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision floating-point arithmetic provides higher accuracy.
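A minimal sketch of the kind of photon-migration MC being accelerated here (single-threaded Python, isotropic scattering, semi-infinite medium; the parameters and simplifications are mine, not the paper's):

```python
import math
import random

def diffuse_reflectance(mu_s, mu_a, n_photons=4000, seed=1):
    """Weighted Monte Carlo for a semi-infinite scattering half-space z >= 0.

    Each photon takes exponentially distributed free paths, loses weight
    to absorption (albedo scaling), and scatters isotropically; weight
    crossing z < 0 is tallied as diffuse reflectance.
    """
    rng = random.Random(seed)
    mu_t = mu_s + mu_a
    albedo = mu_s / mu_t
    reflected = 0.0
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                       # launched into the medium
        while w > 1e-3:                                # drop tiny residual weights
            z += -math.log(1.0 - rng.random()) * uz / mu_t   # free path
            if z < 0.0:                                # escaped through the surface
                reflected += w
                break
            w *= albedo                                # partial absorption
            uz = 2.0 * rng.random() - 1.0              # isotropic scattering
    return reflected / n_photons

r_high = diffuse_reflectance(mu_s=90.0, mu_a=10.0)     # scattering-dominated
r_low = diffuse_reflectance(mu_s=50.0, mu_a=50.0)      # absorption-dominated
```

As expected physically, reflectance rises with the single-scattering albedo; GPU versions parallelize the outer photon loop, which is what makes the P2P pooling of CUDA devices worthwhile.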
Strategic forum. Number 70. Regional deterrence strategies for new proliferation threats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kahan, J.H.
1996-04-01
The deterrence of armed aggression against the United States, its vital national interests, or its allies has moved beyond the requirements of conventional force deterrence. The proliferation of nuclear, biological, and chemical (NBC) weapons requires a new strategy to ensure effective deterrence against their use by regional states that could not win in a conventional conflict with the United States. Because proliferation has expanded to a number of regional actors, a single strategy is unlikely to be sufficient in deterring states with varied motivations and social, economic, religious, cultural, and political backgrounds. The Unified Commands, principally the Pacific, Central, and European Commands, provide a ready-made framework in which general U.S. deterrence strategies can be tailored to each proliferant state. While the Unified Commands would shape the individual deterrence strategies, the national command authority (NCA) would retain control of key decisions. Guidelines for NBC regional deterrence should include developing credible counterproliferation postures, profiling potential adversaries, tailoring our military capabilities to specific threats, integrating NBC preparedness into exercises and warplans, and actively pursuing coalitions designed to deter regional proliferators from threatening to use or using NBC weapons.
NASA Technical Reports Server (NTRS)
Randall, David A.
1990-01-01
A bulk planetary boundary layer (PBL) model was developed with a simple internal vertical structure and a simple second-order closure, designed for use as a PBL parameterization in a large-scale model. The model allows the mean fields to vary with height within the PBL, and so must address the vertical profiles of the turbulent fluxes, going beyond the usual mixed-layer assumption that the fluxes of conservative variables are linear with height. This is accomplished using the same convective mass flux approach that has also been used in cumulus parameterizations. The purpose is to show that such a mass flux model can include, in a single framework, the compensating subsidence concept, downgradient mixing, and well-mixed layers.
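The convective mass-flux closure referred to above is conventionally written, for any conservative variable, as follows (a standard form given for orientation; the notation is generic and not quoted from the report):

```latex
\overline{w'\psi'} \;=\; \frac{M_c}{\rho}\left(\psi_u - \overline{\psi}\right)
```

where \(M_c\) is the convective mass flux, \(\psi_u\) the in-updraft value of the conserved variable \(\psi\), and \(\overline{\psi}\) its layer mean. The mixed-layer limit (fluxes linear in height), downgradient mixing, and compensating subsidence all follow from this single expression, which is the unification the abstract claims.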
General System Theory: Toward a Conceptual Framework for Science and Technology Education for All.
ERIC Educational Resources Information Center
Chen, David; Stroup, Walter
1993-01-01
Suggests using general system theory as a unifying theoretical framework for science and technology education for all. Five reasons are articulated: the multidisciplinary nature of systems theory, the ability to engage complexity, the capacity to describe system dynamics, the ability to represent the relationship between microlevel and…
Making Learning Personally Meaningful: A New Framework for Relevance Research
ERIC Educational Resources Information Center
Priniski, Stacy J.; Hecht, Cameron A.; Harackiewicz, Judith M.
2018-01-01
Personal relevance goes by many names in the motivation literature, stemming from a number of theoretical frameworks. Currently these lines of research are being conducted in parallel with little synthesis across them, perhaps because there is no unifying definition of the relevance construct within which this research can be situated. In this…
Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence
ERIC Educational Resources Information Center
Phan, Huy Phuong
2008-01-01
The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…
ERIC Educational Resources Information Center
MacLean, Justine; Mulholland, Rosemary; Gray, Shirley; Horrell, Andrew
2015-01-01
Background: Curriculum for Excellence, a new national policy initiative in Scottish Schools, provides a unified curricular framework for children aged 3-18. Within this framework, Physical Education (PE) now forms part of a collective alongside physical activity and sport, subsumed by the newly created curriculum area of "Health and…
Brainerd, C J; Reyna, V F; Howe, M L
2009-10-01
One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.
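The measurement idea above, latent memory processes as hidden states whose likelihood is computed over observed recall outcomes, can be sketched with a generic forward algorithm (the three state labels and all parameter values are invented for illustration; the paper's actual model and estimates differ):

```python
import numpy as np

def forward_likelihood(pi, A, B, obs):
    """Forward algorithm: P(observation sequence) under an HMM.

    pi: initial state probabilities; A: state transition matrix;
    B: emission matrix (state x observation symbol).
    """
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
    return float(alpha.sum())

# Illustrative states: recollection / reconstruction / familiarity-only.
pi = np.array([0.5, 0.3, 0.2])
A = np.array([[0.8, 0.1, 0.1],
              [0.2, 0.6, 0.2],
              [0.1, 0.2, 0.7]])
B = np.array([[0.9, 0.1],               # per state: P(correct recall), P(error)
              [0.6, 0.4],
              [0.3, 0.7]])
p = forward_likelihood(pi, A, B, obs=[0, 0, 1])   # correct, correct, error
```

Fitting such a model to recall trials is what lets all three processes be measured jointly from low-burden tasks.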
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
Kim, Jane J.; Schapira, Marilyn M.; Tosteson, Anna N. A.; Zauber, Ann G.; Geiger, Ann M.; Kamineni, Aruna; Weaver, Donald L.; Tiro, Jasmin A.
2015-01-01
General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (eg, primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights across each organ site similarities and differences in steps, interfaces, and transitions in the screening process and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute–funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites. PMID:25957378
A unifying model of concurrent spatial and temporal modularity in muscle activity.
Delis, Ioannis; Panzeri, Stefano; Pozzo, Thierry; Berret, Bastien
2014-02-01
Modularity in the central nervous system (CNS), i.e., the brain capability to generate a wide repertoire of movements by combining a small number of building blocks ("modules"), is thought to underlie the control of movement. Numerous studies reported evidence for such a modular organization by identifying invariant muscle activation patterns across various tasks. However, previous studies relied on decompositions differing in both the nature and dimensionality of the identified modules. Here, we derive a single framework that encompasses all influential models of muscle activation modularity. We introduce a new model (named space-by-time decomposition) that factorizes muscle activations into concurrent spatial and temporal modules. To infer these modules, we develop an algorithm, referred to as sample-based nonnegative matrix trifactorization (sNM3F). We test the space-by-time decomposition on a comprehensive electromyographic dataset recorded during execution of arm pointing movements and show that it provides a low-dimensional yet accurate, highly flexible and task-relevant representation of muscle patterns. The extracted modules have a well characterized functional meaning and implement an efficient trade-off between replication of the original muscle patterns and task discriminability. Furthermore, they are compatible with the modules extracted from existing models, such as synchronous synergies and temporal primitives, and generalize time-varying synergies. Our results indicate the effectiveness of a simultaneous but separate condensation of spatial and temporal dimensions of muscle patterns. The space-by-time decomposition accommodates a unified view of the hierarchical mapping from task parameters to coordinated muscle activations, which could be employed as a reference framework for studying compositional motor control.
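As a rough sketch of the idea behind the space-by-time decomposition, a generic nonnegative matrix tri-factorization with standard multiplicative updates separates temporal and spatial modules (this is not the authors' sNM3F algorithm; the dimensions, update rules, and data are illustrative):

```python
import numpy as np

def trifactorize(M, p, q, iters=200, seed=0, eps=1e-9):
    """Nonnegative tri-factorization M ~ W @ S @ H via multiplicative updates.

    W (time x p) holds temporal modules, H (q x muscles) spatial modules,
    and S (p x q) their pairwise activations. These are the standard
    heuristic multiplicative updates for the squared-error objective.
    """
    rng = np.random.default_rng(seed)
    t, m = M.shape
    W, S, H = rng.random((t, p)), rng.random((p, q)), rng.random((q, m))
    for _ in range(iters):
        W *= (M @ H.T @ S.T) / (W @ S @ H @ H.T @ S.T + eps)
        S *= (W.T @ M @ H.T) / (W.T @ W @ S @ H @ H.T + eps)
        H *= (S.T @ W.T @ M) / (S.T @ W.T @ W @ S @ H + eps)
    return W, S, H

rng = np.random.default_rng(1)
M = rng.random((30, 12))            # e.g. 30 time samples x 12 muscles
W, S, H = trifactorize(M, p=4, q=3)
err = np.linalg.norm(M - W @ S @ H)
```

The point of the space-by-time form is that the same small sets of temporal and spatial modules are reused across conditions, with only `S` varying by task.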
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be enough to confer intrinsic robustness in order to tolerate intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey on biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was discovered that the phenotype robustness criterion for biological networks at different levels relies upon intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this is true, the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through their corresponding phenotype robustness criterion from the systematic point of view. PMID:23515240
A new view of Baryon symmetric cosmology based on grand unified theories
NASA Technical Reports Server (NTRS)
Stecker, F. W.
1981-01-01
Within the framework of grand unified theories, it is shown how spontaneous CP violation leads to a domain structure in the universe with the domains evolving into separate regions of matter and antimatter excesses. Subsequent to exponential horizon growth, this can result in a universe of matter galaxies and antimatter galaxies. Various astrophysical data appear to favor this form of big bang cosmology. Future direct tests for cosmologically significant antimatter are discussed.
GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.
Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung
2015-07-02
A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset with a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach that can easily resolve the problems of unified data modeling and of overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset, and physical activity data collected using different sensors. To demonstrate the value of the unified dataset, we adopted a well-known rough set theory based rule creation process to create rules from it. The evaluation of the tool on six different sets of locally created diverse datasets shows that, on average, it reduces the time effort of the experts and knowledge engineer by 94.1% when creating unified datasets.
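The rough-set rule-creation step that the evaluation relies on can be sketched as follows (a textbook lower-approximation construction on invented records, not the GUDM implementation):

```python
from collections import defaultdict

def certain_rules(rows, cond_keys, dec_key):
    """Rough-set style rule extraction from a unified dataset.

    Group records into indiscernibility classes over the condition
    attributes; a class whose members all share one decision value lies
    in the lower approximation and yields a 'certain' rule.
    """
    classes = defaultdict(set)
    for row in rows:
        classes[tuple(row[k] for k in cond_keys)].add(row[dec_key])
    return {cond: decisions.pop() for cond, decisions in classes.items()
            if len(decisions) == 1}

# Illustrative records merged from clinical, social-media and sensor sources.
unified = [
    {"glucose": "high", "activity": "low",  "diabetic": "yes"},
    {"glucose": "high", "activity": "low",  "diabetic": "yes"},
    {"glucose": "high", "activity": "high", "diabetic": "yes"},
    {"glucose": "low",  "activity": "low",  "diabetic": "yes"},
    {"glucose": "low",  "activity": "low",  "diabetic": "no"},   # conflicting class
]
rules = certain_rules(unified, ("glucose", "activity"), "diabetic")
```

Conflicting classes (same conditions, different decisions) fall into the boundary region and produce no certain rule, which is why overlapping attributes must be resolved before rule creation.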
Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.
2016-01-01
Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947
Celedonio Aguirre-Bravo; Carlos Rodriguez Franco
1999-01-01
The general objective of this Symposium was to build on the best science and technology available to assure that the data and information produced in future inventory and monitoring programs are comparable, quality assured, available, and adequate for their intended purposes, thereby providing a reliable framework for characterization, assessment, and management of...
ERIC Educational Resources Information Center
Molina, Otilia Alejandro; Ratté, Sylvie
2017-01-01
This research introduces a method to construct a unified representation of teachers and students perspectives based on the actionable knowledge discovery (AKD) and delivery framework. The representation is constructed using two models: one obtained from student evaluations and the other obtained from teachers' reflections about their teaching…
Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert
2013-01-01
Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).
ERIC Educational Resources Information Center
National Center for Education Statistics, 2011
2011-01-01
Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…
Teaching Introductory Business Statistics Using the DCOVA Framework
ERIC Educational Resources Information Center
Levine, David M.; Stephan, David F.
2011-01-01
Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…
Evaluating Health Information Systems Using Ontologies
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. 
The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results: The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions: The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
Practical robustness measures in multivariable control system analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Lehtomaki, N. A.
1981-01-01
The robustness of the stability of multivariable linear time invariant feedback control systems with respect to model uncertainty is considered using frequency domain criteria. Available robustness tests are unified under a common framework based on the nature and structure of model errors. These results are derived using a multivariable version of Nyquist's stability theorem in which the minimum singular value of the return difference transfer matrix is shown to be the multivariable generalization of the distance to the critical point on a single input, single output Nyquist diagram. Using the return difference transfer matrix, a very general robustness theorem is presented from which all of the robustness tests dealing with specific model errors may be derived. The robustness tests that explicitly utilized model error structure are able to guarantee feedback system stability in the face of model errors of larger magnitude than those robustness tests that do not. The robustness of linear quadratic Gaussian control systems are analyzed.
Beyond Containment and Deterrence: A Security Framework for Europe in the 21st Century
1990-04-02
decades of the 21st Century in Europe, and examines... Poland, and parts of France and Russia, but it did not truly unify Germany. Bismarck unified only parts of Germany which he could constrain under... Europe, Central Europe, the Balkans, and the Soviet Union. Central Europe includes West Germany, East Germany, Austria, Czechoslovakia, Poland, and
Towards a Unified Description of the Electroweak Nuclear Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benhar, Omar; Lovato, Alessandro
2015-06-01
We briefly review the growing efforts to set up a unified framework for the description of neutrino interactions with atomic nuclei and nuclear matter, applicable in the broad kinematical region corresponding to neutrino energies ranging between a few MeV and a few GeV. The emerging picture suggests that the formalism of nuclear many-body theory (NMBT) can be exploited to obtain the neutrino-nucleus cross-sections needed for both the interpretation of oscillation signals and simulations of neutrino transport in compact stars.
A theoretical formulation of wave-vortex interactions
NASA Technical Reports Server (NTRS)
Wu, J. Z.; Wu, J. M.
1989-01-01
A unified theoretical formulation for wave-vortex interaction, designated the '(omega, Pi) framework,' is presented. Based on the orthogonal decomposition of fluid dynamic interactions, the formulation can be used to study a variety of problems, including the interaction of a longitudinal (acoustic) wave and/or a transverse (vortical) wave with a main vortex flow. Moreover, the formulation permits a unified treatment of wave-vortex interaction at various levels of approximation, where the normal 'piston' process and the tangential 'rubbing' process can be approximated differently.
42 CFR 411.352 - Group practice.
Code of Federal Regulations, 2010 CFR
2010-10-01
... considered to be a single legal entity notwithstanding that it is composed of multiple legal entities... ownership, governance, and operation; and (3) Organization of the group practice into multiple entities is...). (f) Unified business. (1) The group practice must be a unified business having at least the following...
Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A.
2018-01-01
The choice of reference for the electroencephalogram (EEG) is a long-standing unsolved issue that has resulted in inconsistent usage and endless debate. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes a prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Generated artificial EEGs, with a known ground truth, show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. The simulations also reveal that realistic volume conductor models improve the performance of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the real 89 resting state EEGs, rREST consistently yields the lowest GCV.
This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluations of performance. PMID:29780302
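The regularized-reference idea above lends itself to a small numerical sketch. The snippet below implements plain average referencing and an illustrative shrinkage variant; the exact rAR/rREST estimators in the paper are derived from the Bayesian inverse problem, so the n/(n + lam) shrinkage form used here is an assumption for illustration only, not the paper's formula.

```python
import numpy as np

def average_reference(v):
    """Plain average reference (AR): subtract the across-channel mean
    from every time sample, so the channel mean of the output is zero."""
    return v - v.mean(axis=0, keepdims=True)

def regularized_average_reference(v, lam):
    """Illustrative shrinkage variant: subtract only a fraction
    n/(n + lam) of the channel mean, where lam plays the role of a
    noise-to-signal variance ratio (lam = 0 recovers plain AR).
    This form is an assumption for illustration, not the paper's rAR."""
    n = v.shape[0]
    shrink = n / (n + lam)
    return v - shrink * v.mean(axis=0, keepdims=True)

rng = np.random.default_rng(0)
v = rng.standard_normal((8, 100))   # 8 channels x 100 time samples
ar = average_reference(v)
rar = regularized_average_reference(v, lam=2.0)
```

With lam = 0 the shrinkage factor is 1 and the regularized estimator reduces exactly to AR, mirroring the abstract's statement that AR is a particular case of the unified framework.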
In quest of a systematic framework for unifying and defining nanoscience
2009-01-01
This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a “central paradigm” (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a Nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles which are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core–shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) exhibited well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties). 
We propose this perspective as a modest first step toward more clearly defining synthetic nanochemistry as well as providing a systematic framework for unifying nanoscience. With further progress, one should anticipate the evolution of future nanoperiodic table(s) suitable for predicting important risk/benefit boundaries in the field of nanoscience. Electronic supplementary material The online version of this article (doi:10.1007/s11051-009-9632-z) contains supplementary material, which is available to authorized users. PMID:21170133
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. In fact, the present formulation also constitutes a generalization of the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric used to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field that generalizes Onsager's generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle.
Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
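As a schematic illustration of the SEA principle described above (generic notation, not taken from any single one of frameworks (A)-(E)): for a state x, metric tensor G, entropy functional S, conserved functionals C_i, and relaxation time tau, the dynamics can be written as

```latex
\frac{dx}{dt} = \frac{1}{\tau}\, G^{-1}\Big(\nabla S - \sum_i \lambda_i \nabla C_i\Big),
\qquad \lambda_i \ \text{chosen so that}\ \frac{dC_i}{dt} = \nabla C_i \cdot \frac{dx}{dt} = 0,
```

from which the entropy production follows as

```latex
\frac{dS}{dt} = \nabla S \cdot \frac{dx}{dt}
= \tau \left(\frac{dx}{dt}\right)^{\!T} G \,\frac{dx}{dt} \;\ge\; 0
\quad \text{whenever } G \text{ is positive definite,}
```

which is the readily proved non-negativity of the entropy production mentioned in the abstract: the constraint terms drop out because each dC_i/dt vanishes by construction.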
Development and application of unified algorithms for problems in computational science
NASA Technical Reports Server (NTRS)
Shankar, Vijaya; Chakravarthy, Sukumar
1987-01-01
A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.
The Little State That Couldn't Could? The Politics of "Single-Payer" Health Coverage in Vermont.
Fox, Ashley M; Blanchet, Nathan J
2015-06-01
In May 2011, a year after the passage of the Affordable Care Act (ACA), Vermont became the first state to lay the groundwork for a single-payer health care system, known as Green Mountain Care. What can other states learn from the Vermont experience? This article summarizes the findings from interviews with nearly 120 stakeholders as part of a study to inform the design of the health reform legislation. Comparing Vermont's failed effort to adopt single-payer legislation in 1994 to present efforts, we find that Vermont faced similar challenges but greater opportunities in 2010 that enabled reform. A closely contested gubernatorial election and a progressive social movement opened a window of opportunity to advance legislation to design three comprehensive health reform options for legislative consideration. With a unified Democratic government under the leadership of a single-payer proponent, a high-profile policy proposal, and relatively weak opposition, a framework for a single-payer system was adopted by the legislature, though with many details and political battles to be fought in the future. Other states looking to reform their health systems more comprehensively than national reform can learn from Vermont's design and political strategy. Copyright © 2015 by Duke University Press.
Integration of Multidisciplinary Sensory Data:
Miller, Perry L.; Nadkarni, Prakash; Singer, Michael; Marenco, Luis; Hines, Michael; Shepherd, Gordon
2001-01-01
The paper provides an overview of neuroinformatics research at Yale University being performed as part of the national Human Brain Project. This research is exploring the integration of multidisciplinary sensory data, using the olfactory system as a model domain. The neuroinformatics activities fall into three main areas: 1) building databases and related tools that support experimental olfactory research at Yale and can also serve as resources for the field as a whole, 2) using computer models (molecular models and neuronal models) to help understand data being collected experimentally and to help guide further laboratory experiments, 3) performing basic neuroinformatics research to develop new informatics technologies, including a flexible data model (EAV/CR, entity-attribute-value with classes and relationships) designed to facilitate the integration of diverse heterogeneous data within a single unifying framework. PMID:11141511
Heterarchies: Reconciling Networks and Hierarchies.
Cumming, Graeme S
2016-08-01
Social-ecological systems research suffers from a disconnect between hierarchical (top-down or bottom-up) and network (peer-to-peer) analyses. The concept of the heterarchy unifies these perspectives in a single framework. Here, I review the history and application of 'heterarchy' in neuroscience, ecology, archaeology, multiagent control systems, business and organisational studies, and politics. Recognising complex system architecture as a continuum along vertical and lateral axes ('flat versus hierarchical' and 'individual versus networked') suggests four basic types of heterarchy: reticulated, polycentric, pyramidal, and individualistic. Each has different implications for system functioning and resilience. Systems can also shift predictably and abruptly between architectures. Heterarchies suggest new ways of contextualising and generalising from case studies and new methods for analysing complex structure-function relations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Design of a multiple kernel learning algorithm for LS-SVM by convex programming.
Jian, Ling; Xia, Zhonghang; Liang, Xijun; Gao, Chuanhou
2011-06-01
As a kernel-based method, the performance of the least squares support vector machine (LS-SVM) depends on the selection of the kernel as well as the regularization parameter (Duan, Keerthi, & Poo, 2003). Cross-validation is efficient for selecting a single kernel and the regularization parameter; however, it suffers from heavy computational cost and is not flexible enough to deal with multiple kernels. In this paper, we address the issue of multiple kernel learning for LS-SVM by formulating it as semidefinite programming (SDP). Furthermore, we show that the regularization parameter can be optimized in a unified framework with the kernel, which leads to an automatic process for model selection. Extensive experimental validations are performed and analyzed. Copyright © 2011 Elsevier Ltd. All rights reserved.
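The LS-SVM machinery behind this abstract can be sketched in a few lines. The snippet below solves the standard LS-SVM dual linear system for a hand-picked convex combination of two kernels; the paper's contribution is to optimize those combination weights (and the regularization parameter) via SDP, which is not reproduced here, so the fixed weights mu, the kernel choices, and the toy data are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row-vector sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_fit(K, y, reg=10.0):
    """Solve the LS-SVM dual linear system
        [ 0   1^T        ] [b]   [0]
        [ 1   K + I/reg  ] [a] = [y]
    for the bias b and dual coefficients a."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / reg
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # b, alpha

# Toy data: two linearly separable clusters with labels -1 / +1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])

# Fixed convex combination of a linear and an RBF kernel
# (the paper optimizes these weights via SDP; here they are set by hand).
mu = (0.5, 0.5)
K = mu[0] * (X @ X.T) + mu[1] * rbf_kernel(X, X)

b, alpha = lssvm_fit(K, y)
pred = np.sign(K @ alpha + b)       # matches y on this separable toy set
```

The bordered system encodes the LS-SVM optimality conditions: the first row enforces the zero-sum constraint on the dual coefficients, and the remaining rows are the ridge-regularized kernel equations.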
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
Protein O-GlcNAcylation: emerging mechanisms and functions
Yang, Xiaoyong; Qian, Kevin
2017-01-01
O-GlcNAcylation—the attachment of O-linked N-acetylglucosamine (O-GlcNAc) moieties to cytoplasmic, nuclear and mitochondrial proteins—is a post-translational modification that regulates fundamental cellular processes in metazoans. A single pair of enzymes—O-GlcNAc transferase (OGT) and O-GlcNAcase (OGA)—controls the dynamic cycling of this post-translational modification in a nutrient- and stress-responsive manner. Recent years have seen remarkable advances in our understanding of O-GlcNAcylation at levels ranging from structural and molecular biology to cell signalling and gene regulation to physiology and disease. Emerging from these recent developments are new mechanisms and functions of O-GlcNAcylation that enable us to begin constructing a unified conceptual framework through which to understand the significance of this modification in cellular and organismal physiology. PMID:28488703
Superposition-model analysis of rare-earth doped BaY2F8
NASA Astrophysics Data System (ADS)
Magnani, N.; Amoretti, G.; Baraldi, A.; Capelletti, R.
The energy level schemes of four rare-earth dopants (Ce3+, Nd3+, Dy3+, and Er3+) in BaY2F8, as determined by optical absorption spectra, were fitted with a single-ion Hamiltonian and analysed within Newman's Superposition Model for the crystal field. A unified picture for the four dopants was obtained by assuming a distortion of the F- ligand cage around the RE site; within the framework of the Superposition Model, this distortion is found to have a markedly anisotropic behaviour for heavy rare earths, while it turns into an isotropic expansion of the nearest-neighbours polyhedron for light rare earths. It is also inferred that the substituting ion may occupy an off-centre position with respect to the original Y3+ site in the crystal.
ERIC Educational Resources Information Center
National Center for Education Statistics, 2011
2011-01-01
Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…
A unified framework for image retrieval using keyword and visual features.
Jing, Feng; Li, Mingling; Zhang, Hong-Jiang; Zhang, Bo
2005-07-01
In this paper, a unified image retrieval framework based on both keyword annotations and visual features is proposed. In this framework, a set of statistical models are built based on visual features of a small set of manually labeled images to represent semantic concepts and used to propagate keywords to other unlabeled images. These models are updated periodically when more images implicitly labeled by users become available through relevance feedback. In this sense, the keyword models serve the function of accumulation and memorization of knowledge learned from user-provided relevance feedback. Furthermore, two sets of effective and efficient similarity measures and relevance feedback schemes are proposed for the query-by-keyword scenario and the query-by-image-example scenario, respectively. Keyword models are combined with visual features in these schemes. In particular, a new, entropy-based active learning strategy is introduced to improve the efficiency of relevance feedback for query by keyword. In addition, a new algorithm is proposed to estimate the keyword features of the search concept for query by image example. It is shown to be more appropriate than two existing relevance feedback algorithms. Experimental results demonstrate the effectiveness of the proposed framework.
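The entropy-based active learning step can be illustrated with a minimal sketch: among unlabeled images, ask the user for relevance feedback on those whose predicted keyword-relevance probability is most uncertain, i.e., has the highest entropy. The model outputs and the selection size below are hypothetical, not taken from the paper.

```python
import numpy as np

def binary_entropy(p):
    """Entropy (in bits) of a Bernoulli(p) relevance prediction."""
    p = np.clip(p, 1e-12, 1 - 1e-12)   # guard against log(0)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def select_for_feedback(probs, k=2):
    """Return indices of the k unlabeled images whose predicted
    relevance probability is most uncertain (highest entropy)."""
    ent = binary_entropy(np.asarray(probs, dtype=float))
    return np.argsort(-ent)[:k]

# Hypothetical keyword-model outputs for five unlabeled images.
probs = [0.95, 0.50, 0.10, 0.55, 0.99]
picked = select_for_feedback(probs, k=2)   # the two probs nearest 0.5
```

Labeling the most uncertain images first is what makes the feedback loop efficient: confidently classified images (probabilities near 0 or 1) would add little information to the keyword models.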
40 CFR 300.105 - General organization concepts.
Code of Federal Regulations, 2010 CFR
2010-07-01
... capabilities. (b) Three fundamental kinds of activities are performed pursuant to the NCP: (1) Preparedness....205(c). (d) The basic framework for the response management structure is a system (e.g., a unified...
A unified and efficient framework for court-net sports video analysis using 3D camera modeling
NASA Astrophysics Data System (ADS)
Han, Jungong; de With, Peter H. N.
2007-01-01
The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type in order to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes that are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e., players), which can track up to four players simultaneously. The complete system contributes to summarization through various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it for a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and the event detection performance is about 90%.
Generic-distributed framework for cloud services marketplace based on unified ontology.
Hasan, Samer; Valli Kumari, V
2017-11-01
Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant-and-recessive-attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant-and-recessive-attributes approach reduced the execution time by 57% but showed a lower value for recall.
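Semantic similarity matching over a unified ontology can be sketched with a standard tree-based measure. The snippet below uses Wu-Palmer similarity over a tiny, hypothetical slice of a cloud-services ontology; the paper's actual ontology and similarity algorithm may differ, so both the node names and the choice of Wu-Palmer are assumptions for illustration.

```python
def depth(node, parent):
    """Depth of a node in the ontology tree (the root has depth 1)."""
    d = 1
    while node in parent:
        node = parent[node]
        d += 1
    return d

def ancestors(node, parent):
    """Node plus its chain of ancestors, ordered from node to root."""
    chain = [node]
    while node in parent:
        node = parent[node]
        chain.append(node)
    return chain

def wu_palmer(a, b, parent):
    """Wu-Palmer similarity: 2*depth(lcs) / (depth(a) + depth(b)),
    where lcs is the lowest common subsumer of a and b."""
    anc_b = set(ancestors(b, parent))
    lcs = next(n for n in ancestors(a, parent) if n in anc_b)
    return 2 * depth(lcs, parent) / (depth(a, parent) + depth(b, parent))

# A tiny hypothetical slice of a cloud-services ontology (child -> parent).
parent = {
    "Compute": "CloudService", "Storage": "CloudService",
    "VM": "Compute", "Container": "Compute", "ObjectStore": "Storage",
}
```

Sibling concepts under the same parent (VM and Container) score higher than concepts whose only common subsumer is the root (VM and ObjectStore), which is the behavior a marketplace matcher needs to rank candidate services.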
A unified framework for group independent component analysis for multi-subject fMRI data
Guo, Ying; Pagnoni, Giuseppe
2008-01-01
Independent component analysis (ICA) is becoming increasingly popular for analyzing functional magnetic resonance imaging (fMRI) data. While ICA has been successfully applied to single-subject analysis, the extension of ICA to group inferences is not straightforward and remains an active topic of research. Current group ICA models, such as the GIFT (Calhoun et al., 2001) and tensor PICA (Beckmann and Smith, 2005), make different assumptions about the underlying structure of the group spatio-temporal processes and are thus estimated using algorithms tailored for the assumed structure, potentially leading to diverging results. To our knowledge, there are currently no methods for assessing the validity of different model structures in real fMRI data and selecting the most appropriate one among various choices. In this paper, we propose a unified framework for estimating and comparing group ICA models with varying spatio-temporal structures. We consider a class of group ICA models that can accommodate different group structures and include existing models, such as the GIFT and tensor PICA, as special cases. We propose a maximum likelihood (ML) approach with a modified Expectation-Maximization (EM) algorithm for the estimation of the proposed class of models. Likelihood ratio tests (LRT) are presented to compare between different group ICA models. The LRT can be used to perform model comparison and selection, to assess the goodness-of-fit of a model in a particular data set, and to test group differences in the fMRI signal time courses between subject subgroups. Simulation studies are conducted to evaluate the performance of the proposed method under varying structures of group spatio-temporal processes. We illustrate our group ICA method using data from an fMRI study that investigates changes in neural processing associated with the regular practice of Zen meditation. PMID:18650105
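The likelihood ratio test used above for comparing nested group ICA models can be sketched directly. The snippet below computes the LRT statistic and its asymptotic chi-square p-value for two nested models; the log-likelihood values and the choice of three extra free parameters are hypothetical, and the degrees of freedom are fixed at 3 only so that the closed-form chi-square survival function for odd df can be used without external libraries.

```python
import math

def chi2_sf_3df(x):
    """Survival function P(X > x) for a chi-square variable with 3
    degrees of freedom (closed form for odd df):
        P(X > x) = erfc(sqrt(x/2)) + sqrt(2x/pi) * exp(-x/2)."""
    return (math.erfc(math.sqrt(x / 2.0))
            + math.sqrt(2.0 * x / math.pi) * math.exp(-x / 2.0))

def likelihood_ratio_test(ll_restricted, ll_full):
    """LRT for nested models differing by 3 free parameters: under the
    restricted model, 2*(ll_full - ll_restricted) is asymptotically
    chi-square distributed with 3 degrees of freedom."""
    stat = 2.0 * (ll_full - ll_restricted)
    return stat, chi2_sf_3df(stat)

# Hypothetical maximized log-likelihoods for two nested group ICA models.
stat, p = likelihood_ratio_test(ll_restricted=-120.0, ll_full=-114.2)
```

A small p-value rejects the restricted spatio-temporal structure in favor of the richer one, which is how the framework selects among models such as GIFT-style and tensor-PICA-style structures.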
Beaber, Elisabeth F; Kim, Jane J; Schapira, Marilyn M; Tosteson, Anna N A; Zauber, Ann G; Geiger, Ann M; Kamineni, Aruna; Weaver, Donald L; Tiro, Jasmin A
2015-06-01
General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (e.g., primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights, across organ sites, similarities and differences in the steps, interfaces, and transitions of the screening process, and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute-funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Bavassi, M Luz; Tagliazucchi, Enzo; Laje, Rodrigo
2013-02-01
Time processing in the few hundred milliseconds range is involved in the human skill of sensorimotor synchronization, like playing music in an ensemble or finger tapping to an external beat. In finger tapping, a mechanistic explanation in biologically plausible terms of how the brain achieves synchronization is still missing despite considerable research. In this work we show that nonlinear effects are important for the recovery of synchronization following a perturbation (a step change in stimulus period), even for perturbation magnitudes smaller than 10% of the period, which is well below the amount of perturbation needed to evoke other nonlinear effects like saturation. We build a nonlinear mathematical model for the error correction mechanism and test its predictions, and further propose a framework that allows us to unify the description of the three common types of perturbations. While previous authors have used two different model mechanisms for fitting different perturbation types, or have fitted different parameter value sets for different perturbation magnitudes, we propose the first unified description of the behavior following all perturbation types and magnitudes as the dynamical response of a compound model with fixed terms and a single set of parameter values. Copyright © 2012 Elsevier B.V. All rights reserved.
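The error-correction dynamics described here can be sketched as an iterated map for the asynchrony after a step change in stimulus period. The cubic nonlinearity and all parameter values below are illustrative stand-ins, not the model fitted in the paper:

```python
import numpy as np

def resynchronization(e0, alpha=0.5, beta=1e-5, n_steps=50):
    """Iterate a toy nonlinear error-correction map for the asynchrony e_n (ms)
    following a step perturbation of size e0:
        e_{n+1} = e_n - alpha*e_n - beta*e_n**3
    The linear term models proportional correction; the cubic term is an
    illustrative nonlinearity that grows in importance for large perturbations."""
    e = np.empty(n_steps)
    e[0] = e0
    for n in range(n_steps - 1):
        e[n + 1] = e[n] - alpha * e[n] - beta * e[n] ** 3
    return e

errors = resynchronization(e0=50.0)   # recovery from a 50 ms perturbation
```

Fitting one fixed parameter set of such a compound map to all perturbation types and magnitudes is the unification the abstract describes.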
NASA Astrophysics Data System (ADS)
Yeo, Haram; Ki, Hyungson
2018-03-01
In this article, we present a novel numerical method for computing thermal residual stresses from a viewpoint of fluid-structure interaction (FSI). In a thermal processing of a material, residual stresses are developed as the material undergoes melting and solidification, and liquid, solid, and a mixture of liquid and solid (or mushy state) coexist and interact with each other during the process. In order to accurately account for the stress development during phase changes, we derived a unified momentum equation from the momentum equations of incompressible fluids and elastoplastic solids. In this approach, the whole fluid-structure system is treated as a single continuum, and the interaction between fluid and solid phases across the mushy zone is naturally taken into account in a monolithic way. For thermal analysis, an enthalpy-based method was employed. As a numerical example, a two-dimensional laser heating problem was considered, where a carbon steel sheet was heated by a Gaussian laser beam. Momentum and energy equations were discretized on a uniform Cartesian grid in a finite volume framework, and temperature-dependent material properties were used. The austenite-martensite phase transformation of carbon steel was also considered. In this study, the effects of solid strains, fluid flow, mushy zone size, and laser heating time on residual stress formation were investigated.
Are There Unifying Trends in the Psychologies of 1990?
ERIC Educational Resources Information Center
Anastasi, Anne
The complexity and rapid expansion of the entire field of psychology make it appropriate to speak of "psychologies" when acknowledging the need for specialization of training and expertise. Nevertheless, unifying trends (UTs) exist in psychology, even though there can be no single set of theoretical principles to account for all empirical…
Unified constitutive models for high-temperature structural applications
NASA Technical Reports Server (NTRS)
Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.
1988-01-01
Unified constitutive models are characterized by the use of a single inelastic strain rate term for treating all aspects of inelastic deformation, including plasticity, creep, and stress relaxation under monotonic or cyclic loading. The structure of this class of constitutive theory pertinent for high temperature structural applications is first outlined and discussed. The effectiveness of the unified approach for representing high temperature deformation of Ni-base alloys is then evaluated by extensive comparison of experimental data and predictions of the Bodner-Partom and the Walker models. The use of the unified approach for hot section structural component analyses is demonstrated by applying the Walker model in finite element analyses of a benchmark notch problem and a turbine blade problem.
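The defining feature of the unified approach, a single inelastic strain-rate term covering plasticity, creep, and relaxation, can be sketched with a toy power-law flow rule. The constants below are hypothetical, not the fitted Bodner-Partom or Walker parameters:

```python
import numpy as np

E = 200e3          # Young's modulus in MPa (illustrative value)
A, n = 1e-16, 5.0  # power-law flow constants (hypothetical)

def stress_relaxation(sigma0, dt, steps):
    """Hold total strain fixed; one inelastic strain-rate term
        edot_in = A * sigma**n
    then relaxes the stress via sigma_dot = -E * edot_in, with no separate
    creep or plasticity terms."""
    sigma = sigma0
    history = [sigma]
    for _ in range(steps):
        sigma -= E * A * sigma ** n * dt
        history.append(sigma)
    return np.array(history)

curve = stress_relaxation(sigma0=300.0, dt=0.01, steps=2000)
```

The same rate equation, driven by different loading histories, would reproduce monotonic plastic flow, creep, or the relaxation shown here.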
LIFE CYCLE ENGINEERING GUIDELINES
This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and useable as possible, a unifying LCE framework is presented. Subsequent topics ...
Value of Flexibility - Phase 1
2010-09-25
weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to... research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
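The Bayesian point-estimation example in this abstract hinges on the loss function: different losses yield different optimal actions from the same posterior. A minimal sketch, using a simulated right-skewed posterior rather than any data set from the paper:

```python
import numpy as np

def bayes_estimate(posterior_samples, loss="squared"):
    """Bayesian point estimation as a decision problem: choose the action
    minimizing posterior expected loss. Squared-error loss is minimized by
    the posterior mean; absolute-error loss by the posterior median."""
    s = np.asarray(posterior_samples)
    if loss == "squared":
        return float(s.mean())
    if loss == "absolute":
        return float(np.median(s))
    raise ValueError("unknown loss: " + loss)

rng = np.random.default_rng(0)
draws = rng.gamma(shape=2.0, scale=1.0, size=100_000)  # skewed posterior
```

For a skewed posterior the two estimates differ, which is exactly why SDT insists the loss be stated before the "best" estimate is declared.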
Nanothermodynamics in the strong coupling regime
NASA Astrophysics Data System (ADS)
Jarzynski, Christopher
In macroscopic thermodynamics, energy gained by a system is lost by its surroundings (or vice-versa), in accordance with the first law of thermodynamics. However, if the system-environment interaction energy cannot be neglected - as in the case of a microscopic system such as a single molecule in solution - then it is not immediately clear where to draw the line between the energy of the system and that of the environment. To which subsystem does the interaction energy belong? I will describe a microscopic formulation of both the first and second laws of thermodynamics that applies to this situation. In this framework, seven key thermodynamic quantities - internal energy, entropy, volume, enthalpy, Gibbs free energy, heat and work - are given precise microscopic definitions, and the first and second laws are preserved without requiring corrections due to finite system-environment coupling. Furthermore, these definitions reduce to the usual ones in the limit of macroscopic systems of interest. This condition establishes that a unifying framework can be developed, encompassing stochastic thermodynamics at one end and macroscopic thermodynamics at the other. A central element of this framework is a thermodynamic definition of the volume of the system of interest, which converges to the usual geometric definition when the system is large. This research was supported by the U.S. National Science Foundation through Grant No. DMR-1506969.
Unified approach to redshift in cosmological/black hole spacetimes and synchronous frame
NASA Astrophysics Data System (ADS)
Toporensky, A. V.; Zaslavskii, O. B.; Popov, S. B.
2018-01-01
Usually, interpretation of redshift in static spacetimes (for example, near black holes) is opposed to that in cosmology. In this methodological note, we show that both explanations are unified in a natural picture. This is achieved if, considering the static spacetime, one (i) makes a transition to a synchronous frame, and (ii) returns to the original frame by means of local Lorentz boost. To reach our goal, we consider a rather general class of spherically symmetric spacetimes. In doing so, we construct frames that generalize the well-known Lemaitre and Painlevé-Gullstand ones and elucidate the relation between them. This helps us to understand, in a unifying approach, how gravitation reveals itself in different branches of general relativity. This framework can be useful for general relativity university courses.
Impact of Beads and Drops on a Repellent Solid Surface: A Unified Description
NASA Astrophysics Data System (ADS)
Arora, S.; Fromental, J.-M.; Mora, S.; Phou, Ty; Ramos, L.; Ligoure, C.
2018-04-01
We investigate freely expanding sheets formed by ultrasoft gel beads, and liquid and viscoelastic drops, produced by the impact of the bead or drop on a silicon wafer covered with a thin layer of liquid nitrogen that suppresses viscous dissipation thanks to an inverse Leidenfrost effect. Our experiments show a unified behavior for the impact dynamics that holds for solids, liquids, and viscoelastic fluids and that we rationalize by properly taking into account elastocapillary effects. In this framework, the classical impact dynamics of solids and liquids, as far as viscous dissipation is negligible, appears as the asymptotic limits of a universal theoretical description. A novel material-dependent characteristic velocity that includes both capillary and bulk elasticity emerges from this unified description of the physics of impact.
SCIFIO: an extensible framework to support scientific image formats.
Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W
2016-12-07
No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.
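The modular format definition with dynamic discovery that SCIFIO provides can be sketched as a small plugin registry. This is a Python analogy for the idea only; SCIFIO itself is a Java library with a different API, and the format class below is invented:

```python
class Format:
    """Minimal stand-in for a pluggable image format: each format declares
    which file suffixes it claims and how to read them (hypothetical API,
    not SCIFIO's actual interfaces)."""
    suffixes = ()

    def read(self, path):
        raise NotImplementedError

REGISTRY = []

def register(cls):
    """Dynamic discovery: registering a class makes it visible to open_image,
    analogous to SciJava plugin discovery."""
    REGISTRY.append(cls())
    return cls

def open_image(path):
    """Domain-independent entry point: the first format claiming the file reads it."""
    for fmt in REGISTRY:
        if path.endswith(tuple(fmt.suffixes)):
            return fmt.read(path)
    raise ValueError("no registered format claims " + path)

@register
class FakePGM(Format):
    suffixes = (".pgm",)

    def read(self, path):
        return {"path": path, "format": "pgm"}  # stand-in for decoded pixels
```

New proprietary or open exchange formats coexist simply by registering further `Format` subclasses, mirroring the extensibility goal of the library.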
Zenni, Rafael Dudeque; Dickie, Ian A; Wingfield, Michael J; Hirsch, Heidi; Crous, Casparus J; Meyerson, Laura A; Burgess, Treena I; Zimmermann, Thalita G; Klock, Metha M; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J
2016-12-30
Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics, and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand, and manage biological invasions. Published by Oxford University Press on behalf of the Annals of Botany Company.
NASA Astrophysics Data System (ADS)
Pathirana, A.; Radhakrishnan, M.; Zevenbergen, C.; Quan, N. H.
2016-12-01
The need to address the shortcomings of urban systems ('adaptation deficit') and the shortcomings of responses to climate change ('adaptation gap') are both major challenges in maintaining the livability and sustainability of cities. However, adaptation actions defined in terms of type I (addressing adaptation deficits) and type II (addressing adaptation gaps) often compete and conflict with each other in the secondary cities of the global south. Extending the concept of the environmental Kuznets curve, this paper argues that a unified framework calling for synergistic action on type I and type II adaptation is essential if these cities are to maintain their livability, sustainability and resilience in the face of extreme rates of urbanization and the rapid onset of climate change. The proposed framework has been demonstrated in Can Tho, Vietnam, where there are significant adaptation deficits due to rapid urbanisation and adaptation gaps due to climate change and socio-economic changes. The analysis in Can Tho reveals a lack of integration between type I and type II measures that could be overcome by closer integration between the various stakeholders in planning, prioritising and implementing adaptation measures.
Unified framework for automated iris segmentation using distantly acquired face images.
Tan, Chun-Wei; Kumar, Ajay
2012-09-01
Remote human identification using iris biometrics has wide civilian and surveillance applications, and its success requires the development of a robust segmentation algorithm to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired under near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye-region pixels into iris and non-iris regions. Face and eye detection modules have been incorporated in the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop a robust post-processing algorithm to effectively mitigate the noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant improvement in average segmentation error over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.
Trajectory optimization for lunar soft landing with complex constraints
NASA Astrophysics Data System (ADS)
Chu, Huiping; Ma, Lin; Wang, Kexin; Shao, Zhijiang; Song, Zhengyu
2017-11-01
A unified trajectory optimization framework with initialization strategies is proposed in this paper for lunar soft landing across various missions with specific requirements. Two main missions of interest are Apollo-like landing from low lunar orbit and Vertical Takeoff Vertical Landing (a promising mobility method) on the lunar surface. The trajectory optimization is characterized by difficulties arising from discontinuous thrust, multi-phase connections, jumps in attitude angle, and obstacle avoidance. Here the R-function is applied to deal with the discontinuities of thrust, checkpoint constraints are introduced to connect multiple landing phases, the attitude angular rate is designed to avoid radical changes, and safeguards are imposed to avoid collision with obstacles. The resulting dynamic problems generally involve complex constraints. The unified framework, based on the Gauss Pseudospectral Method (GPM) and a Nonlinear Programming (NLP) solver, is designed to solve these problems efficiently. Advanced initialization strategies are developed to enhance both convergence and computational efficiency. Numerical results demonstrate the adaptability of the framework to various landing missions and its ability to solve difficult dynamic problems.
Modelling Trial-by-Trial Changes in the Mismatch Negativity
Lieder, Falk; Daunizeau, Jean; Garrido, Marta I.; Friston, Karl J.; Stephan, Klaas E.
2013-01-01
The mismatch negativity (MMN) is a differential brain response to violations of learned regularities. It has been used to demonstrate that the brain learns the statistical structure of its environment and predicts future sensory inputs. However, the algorithmic nature of these computations and the underlying neurobiological implementation remain controversial. This article introduces a mathematical framework with which competing ideas about the computational quantities indexed by MMN responses can be formalized and tested against single-trial EEG data. This framework was applied to five major theories of the MMN, comparing their ability to explain trial-by-trial changes in MMN amplitude. Three of these theories (predictive coding, model adjustment, and novelty detection) were formalized by linking the MMN to different manifestations of the same computational mechanism: approximate Bayesian inference according to the free-energy principle. We thereby propose a unifying view on three distinct theories of the MMN. The relative plausibility of each theory was assessed against empirical single-trial MMN amplitudes acquired from eight healthy volunteers in a roving oddball experiment. Models based on the free-energy principle provided more plausible explanations of trial-by-trial changes in MMN amplitude than models representing the two more traditional theories (change detection and adaptation). Our results suggest that the MMN reflects approximate Bayesian learning of sensory regularities, and that the MMN-generating process adjusts a probabilistic model of the environment according to prediction errors. PMID:23436989
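The idea of linking trial-by-trial MMN amplitude to Bayesian learning of regularities can be sketched with a toy observer. The beta-Bernoulli learner below is a simple illustrative stand-in, not one of the five models actually compared in the paper:

```python
import numpy as np

def trialwise_surprise(tones, a0=1.0, b0=1.0):
    """Beta-Bernoulli observer: before each tone it predicts p(deviant) from
    the counts seen so far; surprise is -log of the predicted probability of
    what actually occurred (1 = deviant, 0 = standard). Under a
    predictive-coding reading, larger surprise ~ larger MMN-like response."""
    a, b = a0, b0          # pseudo-counts for deviant / standard tones
    out = []
    for t in tones:
        p_dev = a / (a + b)
        out.append(-np.log(p_dev if t == 1 else 1.0 - p_dev))
        a += t
        b += 1 - t
    return np.array(out)

# a roving-oddball-like run: 20 standards, then one deviant
s = trialwise_surprise([0] * 20 + [1])
```

The observer's surprise decays over repeated standards and spikes at the deviant, which is the qualitative trial-by-trial pattern the formal model comparison adjudicates between.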
Albert, Philipp J.; Schwarz, Ulrich S.
2016-01-01
The collective dynamics of multicellular systems arise from the interplay of a few fundamental elements: growth, division and apoptosis of single cells; their mechanical and adhesive interactions with neighboring cells and the extracellular matrix; and the tendency of polarized cells to move. Micropatterned substrates are increasingly used to dissect the relative roles of these fundamental processes and to control the resulting dynamics. Here we show that a unifying computational framework based on the cellular Potts model can describe the experimentally observed cell dynamics over all relevant length scales. For single cells, the model correctly predicts the statistical distribution of the orientation of the cell division axis as well as the final organisation of the two daughters on a large range of micropatterns, including those situations in which a stable configuration is not achieved and rotation ensues. Large ensembles migrating in heterogeneous environments form non-adhesive regions of inward-curved arcs like in epithelial bridge formation. Collective migration leads to swirl formation with variations in cell area as observed experimentally. In each case, we also use our model to predict cell dynamics on patterns that have not been studied before. PMID:27054883
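The cellular Potts machinery underlying this framework can be sketched in a few lines: an energy with adhesion and area-constraint terms, sampled by Metropolis label-copy attempts. Grid size, energies, and parameters below are illustrative, far simpler than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

def hamiltonian(grid, target_area, J=0.5, lam=1.0):
    """Cellular Potts energy: adhesion cost J for each unlike nearest-neighbor
    pair, plus a quadratic area constraint per cell label."""
    H = J * np.sum(grid[:, 1:] != grid[:, :-1])   # horizontal unlike pairs
    H += J * np.sum(grid[1:, :] != grid[:-1, :])  # vertical unlike pairs
    for label, area in target_area.items():
        H += lam * (np.sum(grid == label) - area) ** 2
    return float(H)

def metropolis_step(grid, target_area, T=1.0):
    """Copy a random site's label into one of its 4 neighbors (periodic
    boundaries); accept with probability min(1, exp(-dH/T))."""
    n = grid.shape[0]
    i, j = rng.integers(n, size=2)
    di, dj = [(0, 1), (0, -1), (1, 0), (-1, 0)][rng.integers(4)]
    ti, tj = (i + di) % n, (j + dj) % n
    old = grid[ti, tj]
    if old == grid[i, j]:
        return grid
    H0 = hamiltonian(grid, target_area)
    grid[ti, tj] = grid[i, j]
    dH = hamiltonian(grid, target_area) - H0
    if dH > 0 and rng.random() >= np.exp(-dH / T):
        grid[ti, tj] = old   # reject the copy attempt
    return grid

grid = np.zeros((4, 4), dtype=int)
grid[1:3, 1:3] = 1                 # one 2x2 "cell" in medium (label 0)
targets = {0: 12, 1: 4}
```

Adding terms for growth, division, polarity, and micropattern adhesion to the same Hamiltonian is what lets a single model span the length scales discussed in the abstract.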
Semantically enabled image similarity search
NASA Astrophysics Data System (ADS)
Casterline, May V.; Emerick, Timothy; Sadeghi, Kolia; Gosse, C. A.; Bartlett, Brent; Casey, Jason
2015-05-01
Georeferenced data of various modalities are increasingly available for intelligence and commercial use; however, effectively exploiting these sources demands a unified data space capable of capturing the unique contribution of each input. This work presents a suite of software tools for representing geospatial vector data and overhead imagery in a shared high-dimensional vector or "embedding" space that supports fused learning and similarity search across dissimilar modalities. While the approach is suitable for fusing arbitrary input types, including free text, the present work exploits the obvious but computationally difficult relationship between GIS and overhead imagery. GIS provides temporally smoothed but information-limited content, while overhead imagery provides an information-rich but temporally limited perspective. This processing framework includes some important extensions of concepts in the literature but, more critically, presents a means to accomplish them as a unified framework at scale on commodity cloud architectures.
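Once heterogeneous inputs live in one embedding space, similarity search reduces to nearest neighbors under cosine similarity. A minimal sketch with toy two-dimensional "embeddings" (real systems would use learned high-dimensional vectors and an approximate index):

```python
import numpy as np

def cosine_search(query_vec, index_vecs, k=3):
    """Top-k nearest neighbors by cosine similarity in a shared embedding
    space; rows of index_vecs may come from any modality (imagery, GIS, text),
    since only their positions in the common space matter."""
    q = query_vec / np.linalg.norm(query_vec)
    X = index_vecs / np.linalg.norm(index_vecs, axis=1, keepdims=True)
    sims = X @ q                       # cosine similarity to every row
    order = np.argsort(-sims)[:k]      # indices of the k most similar rows
    return order, sims[order]

# toy index: row 2 points in the same direction as the query
index = np.array([[1.0, 0.0], [0.0, 1.0], [3.0, 4.0]])
order, sims = cosine_search(np.array([0.6, 0.8]), index, k=2)
```

At the scales the paper targets, the exhaustive dot product here would be replaced by an approximate nearest-neighbor index sharded across cloud nodes.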
Motor symptoms in Parkinson's disease: A unified framework.
Moustafa, Ahmed A; Chakravarthy, Srinivasa; Phillips, Joseph R; Gupta, Ankur; Keri, Szabolcs; Polner, Bertalan; Frank, Michael J; Jahanshahi, Marjan
2016-09-01
Parkinson's disease (PD) is characterized by a range of motor symptoms. Besides the cardinal symptoms (akinesia and bradykinesia, tremor and rigidity), PD patients show additional motor deficits, including: gait disturbance, impaired handwriting, grip force and speech deficits, among others. Some of these motor symptoms (e.g., deficits of gait, speech, and handwriting) have similar clinical profiles, neural substrates, and respond similarly to dopaminergic medication and deep brain stimulation (DBS). Here, we provide an extensive review of the clinical characteristics and neural substrates of each of these motor symptoms, to highlight precisely how PD and its medical and surgical treatments impact motor symptoms. In conclusion, we offer a unified framework for understanding the range of motor symptoms in PD. We argue that various motor symptoms in PD reflect dysfunction of neural structures responsible for action selection, motor sequencing, and coordination and execution of movement. Copyright © 2016 Elsevier Ltd. All rights reserved.
Robust nonlinear control of vectored thrust aircraft
NASA Technical Reports Server (NTRS)
Doyle, John C.; Murray, Richard; Morris, John
1993-01-01
An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.
Discrete shearlet transform: faithful digitization concept and its applications
NASA Astrophysics Data System (ADS)
Lim, Wang-Q.
2011-09-01
Over the past years, various representation systems which sparsely approximate functions governed by anisotropic features, such as edges in images, have been proposed. Alongside the theoretical development of these systems, algorithmic realizations of the associated transforms were provided. However, one of the most common shortcomings of these frameworks is the lack of a unified treatment of the continuum and digital worlds, i.e., allowing the digital theory to be a natural digitization of the continuum theory. Shearlets were introduced as a means to sparsely encode anisotropic singularities of multivariate data while providing a unified treatment of the continuous and digital realms. In this paper, we introduce a discrete framework which allows a faithful digitization of the continuum-domain shearlet transform based on compactly supported shearlets. Finally, we show numerical experiments demonstrating the potential of the discrete shearlet transform in several image processing applications.
Snoopy--a unifying Petri net framework to investigate biomolecular networks.
Rohr, Christian; Marwan, Wolfgang; Heiner, Monika
2010-04-01
To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
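The stochastic Petri net class mentioned here is typically simulated with Gillespie's algorithm. A minimal sketch of such a simulator, where the example net, rates, and mass-action propensity choice are illustrative and not Snoopy's implementation:

```python
import numpy as np

def gillespie(marking, pre, post, rates, t_max, rng):
    """Stochastic simulation of a Petri net. A transition is enabled when every
    input place holds at least as many tokens as its pre-arc weight; its
    propensity is rate * product of input-place token counts raised to the
    arc weights (a simple mass-action choice)."""
    m = np.array(marking, dtype=float)
    pre, post = np.asarray(pre), np.asarray(post)
    rates = np.asarray(rates, dtype=float)
    t = 0.0
    while t < t_max:
        enabled = np.all(m >= pre, axis=1)
        props = np.where(enabled, rates * np.prod(m ** pre, axis=1), 0.0)
        total = props.sum()
        if total == 0.0:
            break                                  # net is dead
        t += rng.exponential(1.0 / total)          # time to next firing
        j = rng.choice(len(rates), p=props / total)
        m += post[j] - pre[j]                      # fire transition j
    return m

# places (A, B); one transition A -> B with rate 1.0 per token
rng = np.random.default_rng(0)
final = gillespie([10, 0], [[1, 0]], [[0, 1]], [1.0], 1e6, rng)
```

The same pre/post incidence matrices also serve the qualitative and continuous (ODE) paradigms the tool converts between, with only the firing semantics changing.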
NASA Astrophysics Data System (ADS)
Sugimoto, M.; Song, Y.
2015-12-01
Scientific specialties have become highly segmented, and each natural hazard is studied by its own community of researchers. A wide variety of textbooks, each covering one or a few natural hazards, have been published by individual researchers around the world. Yet a single place may face several kinds of natural disaster, and people must learn about each hazard; no unified educational text covers them all. Disaster-mitigation education spans many fields, so there is a strong need for a single unified textbook. When I teach disaster education to children in kindergartens and schools, I find that students are confused by the differing instructions in such textbooks: "Doctor, which is right when an earthquake happens, cover my head or go out of the building?" I would like to discuss with the audience what the most valuable disaster textbook would be, drawing on my disaster handbook, which was published for developing countries. The UNESCO disaster handbook can be downloaded freely at the following URL: http://www.icharm.pwri.go.jp/publication/pdf/handbook_on_local_disaster_management_experiences.pdf
Models and methods in delay discounting.
Tesch, Aaron D; Sanfey, Alan G
2008-04-01
Delay discounting (DD) is a term typically used to describe the devaluation of rewards over time, and much research across a wide variety of domains has illustrated that people in general prefer a smaller reward delivered soon as opposed to a larger reward delivered at a later stage. Despite numerous attempts, a single unified model of DD that accounts for the varied pattern of results typically observed has been elusive. One of the difficulties in deriving a unified model is the presence of many framing and context effects, situations in which changing, apparently irrelevant, aspects of the choice scenarios lead to different selections. Additionally, different paradigms of DD research use quite different methodology, which poses challenges for a unified model. This chapter describes some of the difficulties in creating a single DD model and suggests some experiments that would help integrate different paradigms to create a clearer picture of DD.
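One framing effect that complicates a unified DD model, preference reversal under a common front-end delay, falls out of the two classic discount functions. A minimal sketch with arbitrary amounts, delays, and discount rate:

```python
import numpy as np

def exponential_value(amount, delay, k):
    """Exponential discounting: constant rate, so adding a common delay to
    both options never reverses a preference."""
    return amount * np.exp(-k * delay)

def hyperbolic_value(amount, delay, k):
    """Hyperbolic discounting: steep near the present, shallow far out,
    which permits preference reversals."""
    return amount / (1.0 + k * delay)

def prefers_smaller_sooner(model, k, small, d_small, large, d_large):
    return model(small, d_small, k) > model(large, d_large, k)

# 50 units now vs. 100 units at delay 10, with and without a common
# front-end delay of 20 time units (all values hypothetical)
near = prefers_smaller_sooner(hyperbolic_value, 0.2, 50, 0, 100, 10)
far = prefers_smaller_sooner(hyperbolic_value, 0.2, 50, 20, 100, 30)
```

Under the hyperbolic model the choice flips when both rewards are pushed into the future, while the exponential model keeps the same preference, one reason a single functional form has struggled to capture the full pattern of results.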
Loop transfer recovery for general nonminimum phase discrete time systems. I - Analysis
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Sannuti, Peddapullaiah; Shamash, Yacov
1992-01-01
A complete analysis of loop transfer recovery (LTR) for general nonstrictly proper, not necessarily minimum phase discrete time systems is presented. Three different observer-based controllers, namely, `prediction estimator' and full or reduced-order type `current estimator' based controllers, are used. The analysis corresponding to all these three controllers is unified into a single mathematical framework. The LTR analysis given here focuses on three fundamental issues: (1) the recoverability of a target loop when it is arbitrarily given, (2) the recoverability of a target loop while taking into account its specific characteristics, and (3) the establishment of necessary and sufficient conditions on the given system so that it has at least one recoverable target loop transfer function or sensitivity function. Various differences that arise in LTR analysis of continuous and discrete systems are pointed out.
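The first of the three observer types named above, the prediction estimator, has a simple discrete-time update. The system matrices and observer gain below are assumptions chosen only so the error dynamics are stable; this is a sketch of the estimator structure, not an LTR design.

```python
import numpy as np

# Illustrative 2-state discrete-time system (assumed matrices, not from the paper)
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
K = np.array([[0.5], [0.4]])   # observer gain; makes A - K C stable

x = np.array([[1.0], [-1.0]])  # true state
xh = np.zeros((2, 1))          # estimate
u = np.array([[0.0]])          # zero input for this sketch
for _ in range(50):
    y = C @ x
    xh = A @ xh + B @ u + K @ (y - C @ xh)  # prediction-estimator update
    x = A @ x + B @ u

err = float(np.linalg.norm(x - xh))
print(err)  # estimation error decays since A - K C has eigenvalues inside the unit circle
```

The estimation error obeys e(k+1) = (A - KC) e(k), so it converges whenever A - KC is Schur stable.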
Scale Space for Camera Invariant Features.
Puig, Luis; Guerrero, José J; Daniilidis, Kostas
2014-09-01
In this paper we propose a new approach to compute the scale space of any central projection system, such as catadioptric, fisheye or conventional cameras. Since these systems can be explained using a unified model, the single parameter that defines each type of system is used to automatically compute the corresponding Riemannian metric. This metric, combined with the partial differential equations framework on manifolds, allows us to compute the Laplace-Beltrami (LB) operator, enabling the computation of the scale space of any central projection system. Scale space is essential for the intrinsic scale selection and neighborhood description in features like SIFT. We perform experiments with synthetic and real images to validate the generalization of our approach to any central projection system. We compare our approach with the best existing methods, showing competitive results in all types of cameras: catadioptric, fisheye, and perspective.
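For a conventional perspective camera the Riemannian metric reduces to the Euclidean one and the Laplace-Beltrami operator to the ordinary Laplacian, so scale space degenerates to Gaussian smoothing, i.e., heat diffusion. A minimal explicit finite-difference sketch of that special case (periodic boundaries, illustrative step size):

```python
import numpy as np

def diffuse(img, steps, dt=0.2):
    """Forward-Euler heat-equation steps: each step advances the scale space."""
    img = img.astype(float).copy()
    for _ in range(steps):
        lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
        img += dt * lap  # dt <= 0.25 keeps the explicit scheme stable
    return img

rng = np.random.default_rng(0)
noisy = rng.random((32, 32))
coarse = diffuse(noisy, steps=50)
print(noisy.std(), coarse.std())  # contrast shrinks as scale increases
```

The paper's contribution is replacing the plain Laplacian above with the LB operator induced by the unified camera model's metric.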
The intrapsychics of gender: a model of self-socialization.
Tobin, Desiree D; Menon, Meenakshi; Menon, Madhavi; Spatta, Brooke C; Hodges, Ernest V E; Perry, David G
2010-04-01
This article outlines a model of the structure and the dynamics of gender cognition in childhood. The model incorporates 3 hypotheses featured in different contemporary theories of childhood gender cognition and unites them under a single theoretical framework. Adapted from Greenwald et al. (2002), the model distinguishes three constructs: gender identity, gender stereotypes, and attribute self-perceptions. The model specifies 3 causal processes among the constructs: Gender identity and stereotypes interactively influence attribute self-perceptions (stereotype emulation hypothesis); gender identity and attribute self-perceptions interactively influence gender stereotypes (stereotype construction hypothesis); and gender stereotypes and attribute self-perceptions interactively influence identity (identity construction hypothesis). The model resolves nagging ambiguities in terminology, organizes diverse hypotheses and empirical findings under a unifying conceptual umbrella, and stimulates many new research directions. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Unified framework for information integration based on information geometry
Oizumi, Masafumi; Amari, Shun-ichi
2016-01-01
Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
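The simplest instance of the "divergence from a disconnected model" idea in the abstract above is mutual information: the KL divergence between the actual joint distribution and the product of its marginals, i.e., the model with all links cut. The joint distribution below is a toy example, not data from the paper.

```python
import numpy as np

def kl(p, q):
    """KL divergence D(p || q) in bits, skipping zero-probability cells."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])           # actual joint p(x, y)
px = pxy.sum(axis=1, keepdims=True)    # marginal p(x)
py = pxy.sum(axis=0, keepdims=True)    # marginal p(y)
mi = kl(pxy, px * py)                  # = mutual information I(X; Y)
print(round(mi, 4))
```

Transfer entropy, stochastic interaction and integrated information arise in the same way, differing only in which influences the disconnected model severs.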
Ghosh, Avijit; Scott, Dennis O; Maurer, Tristan S
2014-02-14
In this work, we provide a unified theoretical framework describing how drug molecules can permeate across membranes in neutral and ionized forms for unstirred in vitro systems. The analysis provides a self-consistent basis for the origin of the unstirred water layer (UWL) within the Nernst-Planck framework in the fully unstirred limit and further provides an accounting mechanism based simply on the bulk aqueous solvent diffusion constant of the drug molecule. Our framework makes no new assumptions about the underlying physics of molecular permeation. We hold simply that Nernst-Planck is a reasonable approximation at low concentrations and all physical systems must conserve mass. The applicability of the derived framework has been examined both with respect to the effect of stirring and externally applied voltages to measured permeability. The analysis contains data for 9 compounds extracted from the literature representing a range of permeabilities and aqueous diffusion coefficients. Applicability with respect to ionized permeation is examined using literature data for the permanently charged cation, crystal violet, providing a basis for the underlying mechanism for ionized drug permeation for this molecule as being due to mobile counter-current flow. Copyright © 2013 Elsevier B.V. All rights reserved.
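In the fully unstirred limit described above, the membrane and the UWL act as barriers in series, with the UWL contribution set by the bulk aqueous diffusion constant. The values below are illustrative assumptions, not the paper's fitted parameters.

```python
def apparent_permeability(p_membrane, d_aq, h_uwl):
    """Series-barrier model: 1/P_app = 1/P_m + 1/P_uwl, with P_uwl = D_aq / h."""
    p_uwl = d_aq / h_uwl  # UWL permeability (cm/s)
    return 1.0 / (1.0 / p_membrane + 1.0 / p_uwl)

d_aq = 8e-6   # aqueous diffusion coefficient, cm^2/s (assumed)
h_uwl = 0.05  # UWL thickness, cm (assumed); stirring thins this layer

p_low = apparent_permeability(1e-6, d_aq, h_uwl)   # membrane-limited compound
p_high = apparent_permeability(1e-4, d_aq, h_uwl)  # UWL-limited compound
print(p_low, p_high)
```

For slowly permeating compounds P_app tracks the membrane; for rapidly permeating compounds it saturates at the UWL limit D_aq/h, which is why stirring matters only for the latter.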
A Unified Framework for Street-View Panorama Stitching
Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei
2016-01-01
In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured by cameras mounted on a mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Because the input images are not captured from a precisely common projection center, and the scene depths relative to the cameras vary to different extents, such images cannot be precisely aligned geometrically. Therefore, an efficient image warping method based on the dense optical flow field is first proposed to greatly suppress the influence of large geometric misalignments. Then, to lessen the influence of photometric inconsistencies caused by illumination variations and different exposure settings, we propose an efficient color correction algorithm that matches extreme points of histograms to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via the graph cut energy minimization framework. At last, the Laplacian pyramid blending algorithm is applied to further eliminate the stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
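As a hedged stand-in for the color-correction step above: rather than matching histogram extreme points (the paper's method), this sketch matches the mean and standard deviation of the overlap region between two warped images, a common baseline for exposure compensation.

```python
import numpy as np

def match_overlap(src, ref_overlap, src_overlap):
    """Fit a gain/offset on the overlap statistics and apply it to src."""
    gain = ref_overlap.std() / (src_overlap.std() + 1e-9)
    offset = ref_overlap.mean() - gain * src_overlap.mean()
    return np.clip(gain * src + offset, 0, 255)

rng = np.random.default_rng(1)
ref = rng.uniform(80, 160, (100, 100))   # reference image region
src = 0.7 * ref + 30                     # same scene, different exposure
corrected = match_overlap(src, ref[:, 50:], src[:, 50:])
print(abs(corrected.mean() - ref.mean()))  # near zero after correction
```

Because the exposure difference here is exactly linear, the gain/offset fit recovers the reference image; real overlaps need the more robust histogram-based matching the paper proposes.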
Brain-Mind Operational Architectonics Imaging: Technical and Methodological Aspects
Fingelkurts, Andrew A; Fingelkurts, Alexander A
2008-01-01
This review paper deals with the methodological and technical foundations of the Operational Architectonics framework of brain and mind functioning. This theory provides a framework for mapping and understanding important aspects of the brain mechanisms that constitute perception, cognition, and eventually consciousness. The methods utilized within the Operational Architectonics framework allow detailed analysis of the operational behavior of local neuronal assemblies and their joint activity in the form of unified and metastable operational modules, which constitute the whole hierarchy of brain operations, including operations of cognition and phenomenal consciousness. PMID:19526071
A Unified Nonlinear Adaptive Approach for Detection and Isolation of Engine Faults
NASA Technical Reports Server (NTRS)
Tang, Liang; DeCastro, Jonathan A.; Zhang, Xiaodong; Farfan-Ramos, Luis; Simon, Donald L.
2010-01-01
A challenging problem in aircraft engine health management (EHM) system development is to detect and isolate faults in system components (i.e., compressor, turbine), actuators, and sensors. Existing nonlinear EHM methods often deal with component faults, actuator faults, and sensor faults separately, which may potentially lead to incorrect diagnostic decisions and unnecessary maintenance. Therefore, it would be ideal to address sensor faults, actuator faults, and component faults under one unified framework. This paper presents a systematic and unified nonlinear adaptive framework for detecting and isolating sensor faults, actuator faults, and component faults for aircraft engines. The fault detection and isolation (FDI) architecture consists of a parallel bank of nonlinear adaptive estimators. Adaptive thresholds are appropriately designed such that, in the presence of a particular fault, all components of the residual generated by the adaptive estimator corresponding to the actual fault type remain below their thresholds. If the faults are sufficiently different, then at least one component of the residual generated by each remaining adaptive estimator should exceed its threshold. Therefore, based on the specific response of the residuals, sensor faults, actuator faults, and component faults can be isolated. The effectiveness of the approach was evaluated using the NASA C-MAPSS turbofan engine model, and simulation results are presented.
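The isolation logic described above can be sketched in a few lines: one adaptive estimator per hypothesized fault type, each producing a residual, with the fault isolated when every residual except one exceeds its threshold. The residual and threshold values below are made-up numbers for illustration, not C-MAPSS results.

```python
# One residual per hypothesized fault class (illustrative values only).
residuals = {"sensor": 0.9, "actuator": 0.2, "component": 1.4}
# Adaptive thresholds would be computed online; fixed here for the sketch.
thresholds = {"sensor": 0.5, "actuator": 0.5, "component": 0.5}

# The estimator matched to the actual fault keeps its residual below threshold;
# mismatched estimators exceed theirs, so a single survivor isolates the fault.
candidates = [f for f, r in residuals.items() if r <= thresholds[f]]
if len(candidates) == 1:
    print("isolated fault:", candidates[0])
else:
    print("fault not isolable from these residuals")
```

If the faults are not sufficiently different, several residuals may stay below threshold and the fault is detected but not isolated, which is exactly the distinguishability condition the paper formalizes.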
Reframing Information Literacy as a Metaliteracy
ERIC Educational Resources Information Center
Mackey, Thomas P.; Jacobson, Trudi E.
2011-01-01
Social media environments and online communities are innovative collaborative technologies that challenge traditional definitions of information literacy. Metaliteracy is an overarching and self-referential framework that integrates emerging technologies and unifies multiple literacy types. This redefinition of information literacy expands the…
A UML profile for framework modeling.
Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong
2004-01-01
The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extendability, owing to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.
SB 1082 -- Unified hazardous materials/waste program: Local implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, W.
California Senate Bill 1082 was signed into law in the fall of 1993 because business and industry believed there were too many hazardous materials inspectors asking the same questions, looking at the same items and requiring similar information on several variations of the same form. Industry was not happy with the large diversity of programs, each with its own inspectors, permits and fees, essentially doing what industry believed was the same inspection. SB 1082 will allow local city and county agencies to apply to the California Environmental Protection Agency to become a Certified Unified Program Agency (CUPA) or work with a CUPA as a Participating Agency (PA) to manage specific program elements. The CUPA will unify six regulatory programs including hazardous waste/tiered permitting, aboveground storage tanks, underground storage tanks, business and area plans/inventory or disclosure, acutely hazardous materials/risk management prevention and Uniform Fire Code programs related to hazardous materials inventory/plan requirements. The bill requires the CUPA to (1) implement a permit consolidation program; (2) implement a single fee system with a state surcharge; (3) consolidate, coordinate and make consistent any local or regional requirements or guidance documents; and (4) implement a single unified inspection and enforcement program.
Wapenaar, Kees
2017-06-01
A unified scalar wave equation is formulated, which covers three-dimensional (3D) acoustic waves, 2D horizontally-polarised shear waves, 2D transverse-electric EM waves, 2D transverse-magnetic EM waves, 3D quantum-mechanical waves and 2D flexural waves. The homogeneous Green's function of this wave equation is a combination of the causal Green's function and its time-reversal, such that their singularities at the source position cancel each other. A classical representation expresses this homogeneous Green's function as a closed boundary integral. This representation finds applications in holographic imaging, time-reversed wave propagation and Green's function retrieval by cross correlation. The main drawback of the classical representation in those applications is that it requires access to a closed boundary around the medium of interest, whereas in many practical situations the medium can be accessed from one side only. Therefore, a single-sided representation is derived for the homogeneous Green's function of the unified scalar wave equation. Like the classical representation, this single-sided representation fully accounts for multiple scattering. The single-sided representation has the same applications as the classical representation, but unlike the classical representation it is applicable in situations where the medium of interest is accessible from one side only.
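In standard frequency-domain notation for the scalar acoustic case, the homogeneous Green's function and its classical closed-boundary representation take the forms below. This is a hedged sketch in the common acoustic convention; signs and factors may differ from the paper's unified operator notation.

```latex
% Homogeneous Green's function: causal part plus its time-reversal
% (complex conjugation in the frequency domain), so the singularities
% at the source position cancel:
G_h(\mathbf{x},\mathbf{x}_A,\omega) = G(\mathbf{x},\mathbf{x}_A,\omega)
  + G^{*}(\mathbf{x},\mathbf{x}_A,\omega).

% Classical closed-boundary representation (sketch; \partial\mathbb{D}
% encloses the medium, \partial_n is the outward normal derivative,
% \rho the mass density):
G_h(\mathbf{x}_B,\mathbf{x}_A,\omega) =
\oint_{\partial\mathbb{D}} \frac{-1}{j\omega\rho(\mathbf{x})}
\Bigl( \partial_n G^{*}(\mathbf{x},\mathbf{x}_B,\omega)\,
        G(\mathbf{x},\mathbf{x}_A,\omega)
      - G^{*}(\mathbf{x},\mathbf{x}_B,\omega)\,
        \partial_n G(\mathbf{x},\mathbf{x}_A,\omega) \Bigr)\,
\mathrm{d}^{2}\mathbf{x}.
```

The single-sided representation derived in the paper replaces this closed boundary integral with an integral over the accessible boundary only, using focusing functions.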
Unified double- and single-sided homogeneous Green’s function representations
van der Neut, Joost; Slob, Evert
2016-01-01
In wave theory, the homogeneous Green’s function consists of the impulse response to a point source, minus its time-reversal. It can be represented by a closed boundary integral. In many practical situations, the closed boundary integral needs to be approximated by an open boundary integral because the medium of interest is often accessible from one side only. The inherent approximations are acceptable as long as the effects of multiple scattering are negligible. However, in case of strongly inhomogeneous media, the effects of multiple scattering can be severe. We derive double- and single-sided homogeneous Green’s function representations. The single-sided representation applies to situations where the medium can be accessed from one side only. It correctly handles multiple scattering. It employs a focusing function instead of the backward propagating Green’s function in the classical (double-sided) representation. When reflection measurements are available at the accessible boundary of the medium, the focusing function can be retrieved from these measurements. Throughout the paper, we use a unified notation which applies to acoustic, quantum-mechanical, electromagnetic and elastodynamic waves. We foresee many interesting applications of the unified single-sided homogeneous Green’s function representation in holographic imaging and inverse scattering, time-reversed wave field propagation and interferometric Green’s function retrieval. PMID:27436983
Creating a library holding group: an approach to large system integration.
Huffman, Isaac R; Martin, Heather J; Delawska-Elliott, Basia
2016-10-01
Faced with resource constraints, many hospital libraries have considered joint operations. This case study describes how Providence Health & Services created a single group to provide library services. Using a holding group model, staff worked to unify more than 6,100 nonlibrary subscriptions and 14 internal library sites. Our library services grew by unifying 2,138 nonlibrary subscriptions and 11 library sites and hiring more library staff. We expanded access to 26,018 more patrons. A model with built-in flexibility allowed successful library expansion. Although challenges remain, this success points to a viable model of unified operations.
Self-Efficacy: Toward a Unifying Theory of Behavioral Change
ERIC Educational Resources Information Center
Bandura, Albert
1977-01-01
This research presents an integrative theoretical framework to explain and to predict psychological changes achieved by different modes of treatment. This theory states that psychological procedures, whatever their form, alter the level and strength of "self-efficacy". (Editor/RK)
COMPLEMENTARITY OF ECOLOGICAL GOAL FUNCTIONS
This paper summarizes, in the framework of network environ analysis, a set of analyses of energy-matter flow and storage in steady state systems. The network perspective is used to codify and unify ten ecological orientors or external principles: maximum power (Lotka), maximum st...
Chimaera simulation of complex states of flowing matter
2016-01-01
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031
Pinto, Rogério M; da Silva, Sueli Bulhões; Soriano, Rafaela
2012-03-01
Community health workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis - how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed community-based participatory research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008-10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies--i.e., empathic communication and perseverance--to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machnes, S.; Institute for Theoretical Physics, University of Ulm, D-89069 Ulm; Sander, U.
2011-08-15
For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods which update all controls concurrently, and Krotov-type methods which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient matlab-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
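A minimal GRAPE-style sketch in the spirit of the concurrent-update methods compared above: one qubit, piecewise-constant control of a sigma_x field on top of a sigma_z drift, and finite-difference gradient ascent on state-transfer fidelity. All amplitudes and step sizes are illustrative assumptions; this is not the DYNAMO implementation.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], complex)
sz = np.array([[1, 0], [0, -1]], complex)

def step_unitary(h):
    w, v = np.linalg.eigh(h)  # exact expm(-i h) for a Hermitian matrix
    return v @ np.diag(np.exp(-1j * w)) @ v.conj().T

def fidelity(u):
    psi = np.array([1, 0], complex)            # start in |0>
    for a in u:                                # one unitary per time slice
        psi = step_unitary(0.5 * sz + a * sx) @ psi
    return abs(psi[1]) ** 2                    # population transferred to |1>

rng = np.random.default_rng(0)
u = 0.1 * rng.standard_normal(10)              # 10 piecewise-constant amplitudes
f_start = fidelity(u)
for _ in range(150):                           # concurrent (GRAPE-like) updates
    grad = np.array([(fidelity(u + 0.01 * e) - fidelity(u - 0.01 * e)) / 0.02
                     for e in np.eye(10)])
    u = u + 0.3 * grad
print(f_start, fidelity(u))                    # fidelity improves markedly
```

Production codes replace the finite-difference gradient with the analytic GRAPE gradient, which is what makes the concurrent update scale to large systems.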
Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.
2014-01-01
The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579
NASA Technical Reports Server (NTRS)
2005-01-01
A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature air frame and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials are assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.
Energy efficiency as a unifying principle for human, environmental, and global health
Fontana, Luigi; Atella, Vincenzo; Kammen, Daniel M
2013-01-01
A strong analogy exists between over/under consumption of energy at the level of the human body and of the industrial metabolism of humanity. Both forms of energy consumption have profound implications for human, environmental, and global health. Globally, excessive fossil-fuel consumption, and individually, excessive food energy consumption are both responsible for a series of interrelated detrimental effects, including global warming, extreme weather conditions, damage to ecosystems, loss of biodiversity, widespread pollution, obesity, cancer, chronic respiratory disease, and other lethal chronic diseases. In contrast, data show that the efficient use of energy—in the form of food as well as fossil fuels and other resources—is vital for promoting human, environmental, and planetary health and sustainable economic development. While it is not new to highlight how efficient use of energy and food can address some of the key problems our world is facing, little research and no unifying framework exists to harmonize these concepts of sustainable system management across diverse scientific fields into a single theoretical body. Insights beyond reductionist views of efficiency are needed to encourage integrated changes in the use of the world’s natural resources, with the aim of achieving a wiser use of energy, better farming systems, and healthier dietary habits. This perspective highlights a range of scientific-based opportunities for cost-effective pro-growth and pro-health policies while using less energy and natural resources. PMID:24555053
Browning, Brian L.; Browning, Sharon R.
2009-01-01
We present methods for imputing data for ungenotyped markers and for inferring haplotype phase in large data sets of unrelated individuals and parent-offspring trios. Our methods make use of known haplotype phase when it is available, and our methods are computationally efficient so that the full information in large reference panels with thousands of individuals is utilized. We demonstrate that substantial gains in imputation accuracy accrue with increasingly large reference panel sizes, particularly when imputing low-frequency variants, and that unphased reference panels can provide highly accurate genotype imputation. We place our methodology in a unified framework that enables the simultaneous use of unphased and phased data from trios and unrelated individuals in a single analysis. For unrelated individuals, our imputation methods produce well-calibrated posterior genotype probabilities and highly accurate allele-frequency estimates. For trios, our haplotype-inference method is four orders of magnitude faster than the gold-standard PHASE program and has excellent accuracy. Our methods enable genotype imputation to be performed with unphased trio or unrelated reference panels, thus accounting for haplotype-phase uncertainty in the reference panel. We present a useful measure of imputation accuracy, allelic R2, and show that this measure can be estimated accurately from posterior genotype probabilities. Our methods are implemented in version 3.0 of the BEAGLE software package. PMID:19200528
Classical Markov Chains: A Unifying Framework for Understanding Avian Reproductive Success
Traditional methods for monitoring and analysis of avian nesting success have several important shortcomings, including 1) inability to handle multiple classes of nest failure, and 2) inability to provide estimates of annual reproductive success (because birds can, and typically ...
Do changes in connectivity explain desertification?
USDA-ARS?s Scientific Manuscript database
Desertification, broad-scale land degradation in drylands, is a major environmental hazard facing inhabitants of the world’s deserts as well as an important component of global change. There is no unifying framework that simply and effectively explains different forms of desertification. Here we arg...
Landscape approach to the formation of the ecological frame of Moscow
NASA Astrophysics Data System (ADS)
Nizovtsev, Vyacheslav; Natalia, Erman
2015-04-01
The territory of Moscow, particularly within its former borders, is characterized by a strong transformation of the natural properties of virtually all types of landscape complexes. The modern landscape structure is characterized by fragmentation of natural land cover. Natural and quasi-natural (natural-anthropogenic) landscape complexes with a preserved natural structure are represented by isolated patches and occupy only small areas. During recent years landscape diversity in general, and biodiversity in particular, have been rapidly declining, and many of the natural landscape complexes are undergoing ever-increasing degradation. The ecological balance is broken, and the preserved natural landscapes are not able to maintain it. Effective territorial organization of Moscow and the rational use of its territory are impossible without taking into account the natural component of the city as well as the properties and potential of the landscape complexes that integrate all natural features in specific areas. The formation of the ecological framework of the city is particularly important. It should be a single system of interrelated and complementary components that make up a single environmental space: habitat-forming cores (junctions), ecological corridors and elements of environmental infrastructure. The systemic unity of the ecological framework can support territorial ecological compensation, whereby a loss of ecological functions in one part of the system is compensated by maintaining or restoring them in another part, and can contribute to the polarization of incompatible types of land use. Habitat-forming cores should include as mandatory parts all the specially protected natural areas (SPNAs) and particularly valuable landscape complexes, as well as preserved adjacent forest areas. Their most important function should be to maintain the resource- and area-reproducing abilities of landscapes, landscape diversity and biodiversity.
Ecological corridors, which perform environmental and transit functions, should include the unified landscape systems of river valleys, their hollow-beam upstreams and drained lows. The most important elements of the environmental infrastructure include the most valuable forest and wetland complexes, springs and other landscape and aquatic complexes, cultural and historical landscape complexes, landscape complexes with a high concentration of cultural heritage sites, sites of natural and green areas with great potential for natural and recreational resources, natural and recreational parks, and natural monuments. They can serve as centers of landscape and biological diversity and perform partial transit (migration) and buffer functions. The territory of the ecological framework can be used for strictly regulated or limited recreation (tourism, short leisure). The adjacent natural and green spaces and natural parks may play a buffer role for the SPNAs and valuable landscape complexes. The spatial pattern of the landscape complexes of Moscow makes it possible to create a single ecological framework based on the landscape, with interrelated and complementary components. Its basis may consist of uniform landscape complexes of valley outwash plains and river valleys, their hollow-beam upstreams and drained lows, which perform system-forming, environmental and transit functions. In plan view, river valleys and small erosional forms appear enclosed within the gullies and constitute single paradynamic systems unified by lateral flows. Therefore not only the edges of river valleys, but also the rear seams of the valley outwash plains should become important natural boundaries limiting urban development of the area. Their most important functional feature is that they serve as local collectors and channels of surface water runoff. These landscape complexes are distinguished by the most dynamic natural processes and thus by negative exogenous processes.
The authors have mapped the indigenous (conditionally restored) and modern landscapes of Moscow at a scale of 1:50,000 and, on this basis, compiled an ecological framework map of Moscow. These maps are an important natural-science basis for the analysis of conditions and the identification of limiting factors in the urban development of the big city.
Cho, Kwang-Hyun; Choo, Sang-Mok; Wellstead, Peter; Wolkenhauer, Olaf
2005-08-15
We propose a unified framework for the identification of functional interaction structures of biomolecular networks in a way that leads to a new experimental design procedure. In developing our approach, we have built upon previous work. Thus we begin by pointing out some of the restrictions associated with existing structure identification methods and indicate how these restrictions may be eased. In particular, existing methods use specific forms of experimental algebraic equations with which to identify the functional interaction structure of a biomolecular network. In our work, we employ an extended form of these experimental algebraic equations which, while retaining their merits, also overcomes some of their disadvantages. Experimental data are required in order to estimate the coefficients of the experimental algebraic equation set associated with the structure identification task. However, experimentalists are rarely provided with guidance on which parameters to perturb, and to what extent to perturb them. When a model of network dynamics is required, there is also the vexed question of sample rate and sample time selection to be resolved. Supplying some answers to these questions is the main motivation of this paper. The approach is based on stationary and/or temporal data obtained from parameter perturbations, and unifies the previous approaches of Kholodenko et al. (PNAS 99 (2002) 12841-12846) and Sontag et al. (Bioinformatics 20 (2004) 1877-1886). By way of demonstration, we apply our unified approach to a network model which cannot be properly identified by existing methods. Finally, we propose an experiment design methodology, which is not limited by the number of parameter perturbations, and illustrate its use with an in numero example.
ATP3 Unified Field Study Data
Wolfrum, Ed (ORCID:0000000273618931); Knoshug, Eric (ORCID:000000025709914X); Laurens, Lieve (ORCID:0000000349303267); Harmon, Valerie; Dempster, Thomas (ORCID:000000029550488X); McGowan, John (ORCID:0000000266920518); Rosov, Theresa; Cardello, David; Arrowsmith, Sarah; Kempkes, Sarah; Bautista, Maria; Lundquist, Tryg; Crowe, Brandon; Murawsky, Garrett; Nicolai, Eric; Rowe, Egan; Knurek, Emily; Javar, Reyna; Saracco Alvarez, Marcela; Schlosser, Steve; Riddle, Mary; Withstandley, Chris; Chen, Yongsheng; Van Ginkel, Steven; Igou, Thomas; Xu, Chunyan; Hu, Zixuan
2017-10-20
The Algae Testbed Public-Private Partnership (ATP3) was established with the goal of investigating open pond algae cultivation across different geographic, climatic, seasonal, and operational conditions while setting the benchmark for quality data collection, analysis, and dissemination. Identical algae cultivation systems and data analysis methodologies were established at testbed sites across the continental United States and Hawaii. Within this framework, the Unified Field Studies (UFS) were designed to characterize the cultivation of different algal strains during all 4 seasons across this testbed network. The dataset presented here is the complete, curated, climatic, cultivation, harvest, and biomass composition data for each season at each site. These data enable others to do in-depth cultivation, harvest, techno-economic, life cycle, resource, and predictive growth modeling analysis, as well as develop crop protection strategies for the nascent algae industry. NREL Sub award Number: DE-AC36-08-GO28308
A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.
Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D
2014-02-01
In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose from studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the usage of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of color moment invariants.
Multilayer network of language: A unified framework for structural analysis of linguistic subsystems
NASA Astrophysics Data System (ADS)
Martinčić-Ipšić, Sanda; Margan, Domagoj; Meštrović, Ana
2016-09-01
Recently, the focus of complex network research has shifted from the analysis of isolated properties of a system toward a more realistic modeling of multiple phenomena - multilayer networks. Motivated by the success of the multilayer approach in social, transport and trade systems, we introduce multilayer networks for language. The multilayer network of language is a unified framework for modeling linguistic subsystems and their structural properties, enabling the exploration of their mutual interactions. Various aspects of natural language systems can be represented as complex networks, whose vertices depict linguistic units, while links model their relations. The multilayer network of language is defined by three aspects: the network construction principle, the linguistic subsystem and the language of interest. More precisely, we construct word-level (syntax and co-occurrence) and subword-level (syllable and grapheme) network layers from four variations of an original text (in the modeled language). The analysis and comparison of layers at the word and subword levels are employed in order to determine the mechanism of the structural influences between linguistic units and subsystems. The obtained results suggest that there are substantial differences between the network structures of different language subsystems, which remain hidden during the exploration of an isolated layer. The word-level layers share structural properties regardless of the language (e.g. Croatian or English), while the syllabic subword-level layer expresses more language-dependent structural properties. The preserved weighted overlap quantifies the similarity of word-level layers in weighted and directed networks. Moreover, the analysis of motifs reveals a close topological structure of the syntactic and syllabic layers for both languages.
The findings corroborate that the multilayer network framework is a powerful, consistent and systematic approach to model several linguistic subsystems simultaneously and hence to provide a more unified view on language.
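A single layer of the kind described above is easy to sketch. The snippet below builds a word-level co-occurrence layer as an adjacency dictionary; the sentence, window size, and dictionary representation are illustrative assumptions, not the authors' implementation, and a subword (syllable or grapheme) layer could be built the same way from a different token stream.

```python
from collections import defaultdict

def cooccurrence_layer(tokens, window=1):
    """Undirected, weighted co-occurrence layer: link tokens that appear
    within `window` positions of each other; weights count co-occurrences."""
    adj = defaultdict(lambda: defaultdict(int))
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + 1 + window, len(tokens))):
            adj[w][tokens[j]] += 1
            adj[tokens[j]][w] += 1
    return adj

tokens = "the cat saw the dog".split()
layer = cooccurrence_layer(tokens)
# 'the' is the best-connected vertex: it neighbours 'cat', 'saw' and 'dog'.
print(sorted(layer["the"]))  # ['cat', 'dog', 'saw']
```

Running the same constructor over a syllabified version of the text would give the syllabic layer of the multilayer network, with the two layers sharing the same modeling conventions.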
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found that the phenotype robustness criterion for synthetic gene networks is the following: if intrinsic robustness + genetic robustness + environmental robustness ≤ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through corresponding phenotype robustness criteria from the systematic point of view.
Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances, is also proposed, together with a simulation example. PMID:23515190
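The phenotype robustness criterion quoted in this abstract reduces to a simple additive inequality, which can be illustrated directly; the numeric robustness values below are hypothetical and serve only to show how the criterion partitions cases.

```python
# Phenotype robustness criterion from the abstract: the phenotype is
# maintained when the summed robustness demands of intrinsic parameter
# fluctuations, genetic variations, and environmental disturbances do not
# exceed the network robustness. All numbers here are hypothetical.

def phenotype_maintained(intrinsic, genetic, environmental, network):
    """True when intrinsic + genetic + environmental <= network."""
    return intrinsic + genetic + environmental <= network

# Ample robustness margin: phenotype is maintained.
print(phenotype_maintained(0.2, 0.3, 0.1, 0.8))  # True  (0.6 <= 0.8)
# Combined perturbations exceed network robustness: phenotype may be lost.
print(phenotype_maintained(0.4, 0.3, 0.3, 0.8))  # False (1.0 > 0.8)
```

The inequality also makes the stated trade-off explicit: raising any one robustness demand shrinks the margin available to the other two for a fixed network robustness.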
Reconciling intuitive physics and Newtonian mechanics for colliding objects.
Sanborn, Adam N; Mansinghka, Vikash K; Griffiths, Thomas L
2013-04-01
People have strong intuitions about the influence objects exert upon one another when they collide. Because people's judgments appear to deviate from Newtonian mechanics, psychologists have suggested that people depend on a variety of task-specific heuristics. This leaves open the question of how these heuristics could be chosen, and how to integrate them into a unified model that can explain human judgments across a wide range of physical reasoning tasks. We propose an alternative framework, in which people's judgments are based on optimal statistical inference over a Newtonian physical model that incorporates sensory noise and intrinsic uncertainty about the physical properties of the objects being viewed. This noisy Newton framework can be applied to a multitude of judgments, with people's answers determined by the uncertainty they have for physical variables and the constraints of Newtonian mechanics. We investigate a range of effects in mass judgments that have been taken as strong evidence for heuristic use and show that they are well explained by the interplay between Newtonian constraints and sensory uncertainty. We also consider an extended model that handles causality judgments, and obtain good quantitative agreement with human judgments across tasks that involve different judgment types with a single consistent set of parameters.
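The noisy Newton idea can be sketched concretely for a mass judgment: infer the mass ratio of two colliding objects from noisy velocity percepts, constrained by momentum conservation. The collision numbers, noise level, and grid inference below are illustrative assumptions, not the authors' model.

```python
# True setup (hypothetical): m1 = 2, m2 = 1, elastic 1D collision.
u1, u2 = 1.0, 0.0          # pre-collision velocities
v1, v2 = 1.0 / 3.0, 4.0 / 3.0  # post-collision velocities (elastic formulas)

sigma = 0.1                # assumed sensory noise scale on each velocity

# Noisy percepts (fixed offsets stand in for random sensory noise).
obs = {"u1": u1 + 0.05, "v1": v1 - 0.03, "u2": u2 + 0.02, "v2": v2 + 0.04}

# Momentum conservation: m1*(u1 - v1) = m2*(v2 - u2), so the mass ratio
# r = m1/m2 must satisfy r*(u1 - v1) - (v2 - u2) = 0. Score candidate
# ratios by how well the noisy percepts satisfy this Newtonian constraint.
def log_posterior(r):
    residual = r * (obs["u1"] - obs["v1"]) - (obs["v2"] - obs["u2"])
    return -residual ** 2 / (2 * sigma ** 2)  # flat prior over the grid

grid = [0.1 * k for k in range(1, 51)]  # candidate ratios 0.1 .. 5.0
best = max(grid, key=log_posterior)
print(f"inferred mass ratio m1/m2 ~ {best:.1f}")  # near the true ratio of 2
```

Even this toy version shows the framework's signature behavior: the inferred ratio is pulled away from the true value by sensory noise, yet stays anchored to Newtonian mechanics rather than to any task-specific heuristic.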
Cancer and intercellular cooperation
Dieli, Anna Maria
2017-01-01
The major transitions approach in evolutionary biology has shown that the intercellular cooperation that characterizes multicellular organisms would never have emerged without some kind of multilevel selection. Relying on this view, the Evolutionary Somatic view of cancer considers cancer as a breakdown of intercellular cooperation and as a loss of the balance between selection processes that take place at different levels of organization (particularly the single cell and the individual organism). This seems an elegant unifying framework for the healthy organism, carcinogenesis, tumour proliferation, metastasis and other phenomena such as ageing. However, the gene-centric version of Darwinian evolution, which is often adopted in cancer research, runs into empirical problems: proto-tumoural and tumoural features in precancerous cells that would undergo ‘natural selection’ have proved hard to demonstrate; cells are radically context-dependent, and some stages of cancer are poorly related to genetic change. Recent perspectives propose that the breakdown of intercellular cooperation could depend on ‘fields’ and other higher-level phenomena, and could even be mutation-independent. Indeed, the field would be the context allowing (or preventing) genetic mutations to undergo an intra-organism process analogous to natural selection. The complexities surrounding somatic evolution call for integration between multiple incomplete frameworks for interpreting intercellular cooperation and its pathologies. PMID:29134064
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
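The link between loss functions and point estimation mentioned above has a classic concrete case: under squared-error loss, the Bayes decision is the posterior mean. The discrete posterior below is a made-up example, not drawn from the paper's grassland-bird application.

```python
# Under squared-error loss L(theta, a) = (theta - a)^2, the action that
# minimizes posterior expected loss is the posterior mean. Hypothetical
# discrete posterior over a parameter theta:
posterior = {0.2: 0.1, 0.4: 0.3, 0.6: 0.4, 0.8: 0.2}

def expected_loss(action):
    """Posterior expected squared-error loss of the point estimate `action`."""
    return sum(p * (theta - action) ** 2 for theta, p in posterior.items())

posterior_mean = sum(theta * p for theta, p in posterior.items())

# Scan candidate actions: the minimizer coincides with the posterior mean.
candidates = [k / 100 for k in range(0, 101)]
best = min(candidates, key=expected_loss)
print(posterior_mean, best)  # both 0.54
```

Swapping in absolute-error loss would instead select the posterior median, which is exactly the sense in which the loss function, central to SDT, determines the decision.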
Wang, Minghuai; Larson, Vincent E.; Ghan, Steven; ...
2015-04-18
In this study, a higher-order turbulence closure scheme, called Cloud Layers Unified by Binormals (CLUBB), is implemented into a Multi-scale Modeling Framework (MMF) model to improve low cloud simulations. The performance of CLUBB in MMF simulations with two different microphysics configurations (one-moment cloud microphysics without aerosol treatment and two-moment cloud microphysics coupled with aerosol treatment) is evaluated against observations and further compared with results from the Community Atmosphere Model, Version 5 (CAM5) with conventional cloud parameterizations. CLUBB is found to improve low cloud simulations in the MMF, and the improvement is particularly evident in the stratocumulus-to-cumulus transition regions. Compared to the single-moment cloud microphysics, CLUBB with two-moment microphysics produces clouds that are closer to the coast, and agrees better with observations. In the stratocumulus-to-cumulus transition regions, CLUBB with two-moment cloud microphysics produces shortwave cloud forcing in better agreement with observations, while CLUBB with single-moment cloud microphysics overestimates shortwave cloud forcing. CLUBB is further found to produce quantitatively similar improvements in the MMF and CAM5, with slightly better performance in the MMF simulations (e.g., MMF with CLUBB generally produces low clouds that are closer to the coast than CAM5 with CLUBB). As a result, improved low cloud simulations in the MMF make it an even more attractive tool for studying aerosol-cloud-precipitation interactions.
NASA Technical Reports Server (NTRS)
1978-01-01
A unified framework for comparing intercity passenger and freight transportation systems is presented. Composite measures for cost, service/demand, energy, and environmental impact were determined. A set of 14 basic measures was articulated to form the foundation for computing the composite measures. A parameter dependency diagram, constructed to explicitly interrelate the composite and basic measures, is discussed. Ground rules and methodology for developing the values of the basic measures are provided, and the use of the framework with existing cost and service data is illustrated for various freight systems.
Tavazoie, Saeed
2013-01-01
Here we explore the possibility that a core function of sensory cortex is the generation of an internal simulation of sensory environment in real-time. A logical elaboration of this idea leads to a dynamical neural architecture that oscillates between two fundamental network states, one driven by external input, and the other by recurrent synaptic drive in the absence of sensory input. Synaptic strength is modified by a proposed synaptic state matching (SSM) process that ensures equivalence of spike statistics between the two network states. Remarkably, SSM, operating locally at individual synapses, generates accurate and stable network-level predictive internal representations, enabling pattern completion and unsupervised feature detection from noisy sensory input. SSM is a biologically plausible substrate for learning and memory because it brings together sequence learning, feature detection, synaptic homeostasis, and network oscillations under a single unifying computational framework. PMID:23991161
Understanding trends in C-H bond activation in heterogeneous catalysis.
Latimer, Allegra A; Kulkarni, Ambarish R; Aljama, Hassan; Montoya, Joseph H; Yoo, Jong Suk; Tsai, Charlie; Abild-Pedersen, Frank; Studt, Felix; Nørskov, Jens K
2017-02-01
While the search for catalysts capable of directly converting methane to higher value commodity chemicals and liquid fuels has been active for over a century, a viable industrial process for selective methane activation has yet to be developed. Electronic structure calculations are playing an increasingly relevant role in this search, but large-scale materials screening efforts are hindered by computationally expensive transition state barrier calculations. The purpose of the present letter is twofold. First, we show that, for the wide range of catalysts that proceed via a radical intermediate, a unifying framework for predicting C-H activation barriers using a single universal descriptor can be established. Second, we combine this scaling approach with a thermodynamic analysis of active site formation to provide a map of methane activation rates. Our model successfully rationalizes the available empirical data and lays the foundation for future catalyst design strategies that transcend different catalyst classes.
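A single-descriptor scaling relation of the kind this letter describes is, at its simplest, a linear map from one computed site property to a predicted activation barrier. The sketch below illustrates that form only; the slope, intercept, descriptor values, and catalyst labels are all invented for illustration and are not the fitted parameters from the paper.

```python
# Hypothetical single-descriptor scaling relation for C-H activation:
# the barrier Ea is predicted linearly from one descriptor of the active
# site (e.g. its hydrogen affinity). Coefficients below are made up.
def predicted_barrier(descriptor_eV, slope=0.75, intercept=1.1):
    """Linear scaling: Ea = slope * descriptor + intercept (in eV)."""
    return slope * descriptor_eV + intercept

# Screening many candidate sites then needs only the cheap descriptor,
# not an expensive transition-state calculation per material.
for catalyst, x in [("oxide A", -0.8), ("zeolite B", -0.2), ("metal C", 0.4)]:
    print(f"{catalyst}: Ea ~ {predicted_barrier(x):.2f} eV")
```

The practical point is the one the abstract makes: once a universal scaling holds across catalyst classes, ranking materials reduces to evaluating one descriptor per candidate.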
Cluster Correspondence Analysis.
van de Velden, M; D'Enza, A Iodice; Palumbo, F
2017-03-01
A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between-cluster variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided, and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
Hinterleitner, Gernot; Leopold-Wildburger, Ulrike; Mestel, Roland; Palan, Stefan
2015-01-01
This paper deals with the market structure at the opening of the trading day and its influence on subsequent trading. We compare a single continuous double auction and two complement markets with different call auction designs as opening mechanisms in a unified experimental framework. The call auctions differ with respect to their levels of transparency. We find that a call auction not only improves market efficiency and liquidity at the beginning of the trading day when compared to the stand-alone continuous double auction, but also causes positive spillover effects on subsequent trading. Concerning the design of the opening call auction, we find no significant differences between the transparent and nontransparent specification with respect to opening prices and liquidity. In the course of subsequent continuous trading, however, market quality is slightly higher after a nontransparent call auction. PMID:26351653
Observational constraints on black hole accretion disks
NASA Technical Reports Server (NTRS)
Liang, Edison P.
1994-01-01
We review the empirical constraints on accretion disk models of stellar-mass black holes based on recent multiwavelength observational results. In addition to time-averaged emission spectra, the time evolutions of the intensity and spectrum provide critical information about the structure, stability, and dynamics of the disk. Using the basic thermal Keplerian disk paradigm, we consider in particular generalizations of the standard optically thin disk models needed to accommodate the extremely rich variety of dynamical phenomena exhibited by black hole candidates ranging from flares of electron-positron annihilations and quasiperiodic oscillations in the X-ray intensity to X-ray novae activity. These in turn provide probes of the disk structure and global geometry. The goal is to construct a single unified framework to interpret a large variety of black hole phenomena. This paper will concentrate on the interface between basic theory and observational data modeling.
Physiological utility theory and the neuroeconomics of choice
Glimcher, Paul W.; Dorris, Michael C.; Bayer, Hannah M.
2006-01-01
Over the past half century, economists have responded to the challenges that Allais [Econometrica (1953) 53], Ellsberg [Quart. J. Econ. (1961) 643], and others raised to neoclassicism either by bounding the reach of economic theory or by turning to descriptive approaches. While both of these strategies have been enormously fruitful, neither has provided a clear programmatic approach that aspires to a complete understanding of human decision making, as neoclassicism did. There is, however, growing evidence that economists and neurobiologists are now beginning to reveal the physical mechanisms by which the human neuroarchitecture accomplishes decision making. Although in their infancy, these studies suggest both a single unified framework for understanding human decision making and a methodology for constraining the scope and structure of economic theory. Indeed, there is already evidence that these studies place mathematical constraints on existing economic models. This article reviews some of those constraints and suggests the outline of a neuroeconomic theory of decision. PMID:16845435
Unifying Complexity and Information
NASA Astrophysics Data System (ADS)
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure that plays the fundamental role the Shannon entropy H plays in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict, and of this adaptability, have remained unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of H for random processes, clearly redefining the boundary between the different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics, and real-world living organisms.
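The complexity-randomness tension the abstract describes is easy to see with two crude measures: Shannon entropy rates a periodic bit string just as "random" as a scrambled one with the same symbol frequencies, while a Lempel-Ziv-style phrase count distinguishes them. A rough sketch (this simple history-based parse is only one variant of the LZ76 idea, not the paper's measure):

```python
from math import log2

def shannon_entropy(s):
    """Per-symbol Shannon entropy H in bits, from symbol frequencies only;
    blind to the ordering of symbols."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * log2(p) for p in probs)

def lz_phrases(s):
    """Phrase count of a simple history-based Lempel-Ziv parse: each phrase
    is the shortest block not already seen in the preceding string. Low for
    periodic strings, higher for irregular ones of the same composition."""
    i, phrases = 0, 0
    while i < len(s):
        length = 1
        # extend the phrase while it already occurs in the prefix s[:i]
        while i + length <= len(s) and s[i:i + length] in s[:i]:
            length += 1
        phrases += 1
        i += length
    return phrases
```

Two strings with identical entropy can thus differ in this structural measure, which is the kind of conflict between criteria the paper traces back to different underlying generating models.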
Physics of Alfvén waves and energetic particles in burning plasmas
NASA Astrophysics Data System (ADS)
Chen, Liu; Zonca, Fulvio
2016-01-01
Dynamics of shear Alfvén waves and energetic particles are crucial to the performance of burning fusion plasmas. This article reviews linear as well as nonlinear physics of shear Alfvén waves and their self-consistent interaction with energetic particles in tokamak fusion devices. More specifically, the review on the linear physics deals with wave spectral properties and collective excitations by energetic particles via wave-particle resonances. The nonlinear physics deals with nonlinear wave-wave interactions as well as nonlinear wave-energetic particle interactions. Both linear as well as nonlinear physics demonstrate the qualitatively important roles played by realistic equilibrium nonuniformities, magnetic field geometries, and the specific radial mode structures in determining the instability evolution, saturation, and, ultimately, energetic-particle transport. These topics are presented within a single unified theoretical framework, where experimental observations and numerical simulation results are referred to elucidate concepts and physics processes.
NASA Astrophysics Data System (ADS)
Perfors, Amy
2014-09-01
There is much to approve of in this provocative and interesting paper. I strongly agree with many of its points, especially that dichotomies like nature/nurture are actively detrimental to the field. I also appreciate the idea that cognitive scientists should take the "biological wetware" of the cell (rather than the network) more seriously.
Chimaera simulation of complex states of flowing matter.
Succi, S
2016-11-13
We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro-meso-micro levels through suitable 'mutations' of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
A unified framework for gesture recognition and spatiotemporal gesture segmentation.
Alon, Jonathan; Athitsos, Vassilis; Yuan, Quan; Sclaroff, Stan
2009-09-01
Within the context of hand gesture recognition, spatiotemporal gesture segmentation is the task of determining, in a video sequence, where the gesturing hand is located and when the gesture starts and ends. Existing gesture recognition methods typically assume either known spatial segmentation or known temporal segmentation, or both. This paper introduces a unified framework for simultaneously performing spatial segmentation, temporal segmentation, and recognition. In the proposed framework, information flows both bottom-up and top-down. A gesture can be recognized even when the hand location is highly ambiguous and when information about when the gesture begins and ends is unavailable. Thus, the method can be applied to continuous image streams where gestures are performed in front of moving, cluttered backgrounds. The proposed method consists of three novel contributions: a spatiotemporal matching algorithm that can accommodate multiple candidate hand detections in every frame, a classifier-based pruning framework that enables accurate and early rejection of poor matches to gesture models, and a subgesture reasoning algorithm that learns which gesture models can falsely match parts of other longer gestures. The performance of the approach is evaluated on two challenging applications: recognition of hand-signed digits gestured by users wearing short-sleeved shirts, in front of a cluttered background, and retrieval of occurrences of signs of interest in a video database containing continuous, unsegmented signing in American Sign Language (ASL).
Stochastic thermodynamics, fluctuation theorems and molecular machines.
Seifert, Udo
2012-12-01
Stochastic thermodynamics as reviewed here systematically provides a framework for extending the notions of classical thermodynamics such as work, heat and entropy production to the level of individual trajectories of well-defined non-equilibrium ensembles. It applies whenever a non-equilibrium process is still coupled to one (or several) heat bath(s) of constant temperature. Paradigmatic systems are single colloidal particles in time-dependent laser traps, polymers in external flow, enzymes and molecular motors in single molecule assays, small biochemical networks and thermoelectric devices involving single electron transport. For such systems, a first-law like energy balance can be identified along fluctuating trajectories. For a basic Markovian dynamics implemented either on the continuum level with Langevin equations or on a discrete set of states as a master equation, thermodynamic consistency imposes a local-detailed balance constraint on noise and rates, respectively. Various integral and detailed fluctuation theorems, which are derived here in a unifying approach from one master theorem, constrain the probability distributions for work, heat and entropy production depending on the nature of the system and the choice of non-equilibrium conditions. For non-equilibrium steady states, particularly strong results hold like a generalized fluctuation-dissipation theorem involving entropy production. Ramifications and applications of these concepts include optimal driving between specified states in finite time, the role of measurement-based feedback processes and the relation between dissipation and irreversibility. Efficiency and, in particular, efficiency at maximum power can be discussed systematically beyond the linear response regime for two classes of molecular machines, isothermal ones such as molecular motors, and heat engines such as thermoelectric devices, using a common framework based on a cycle decomposition of entropy production.
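The "first-law like energy balance along fluctuating trajectories" can be made concrete for the paradigmatic dragged colloidal particle. The sketch below uses our own toy parameters (not from the review) and a Sekimoto-style splitting, in which the work increment is the potential-energy change at fixed position and the heat increment is the change at fixed trap position, so the balance dU = W + Q holds step by step along a single noisy trajectory:

```python
import numpy as np

def langevin_work_heat(k=1.0, D=0.5, dt=1e-3, steps=2000, vmax=1.0, seed=1):
    """Overdamped particle in a harmonic trap U(x, l) = k/2 (x - l)^2 whose
    center l moves at speed vmax. Returns (dU, W, Q) accumulated along one
    fluctuating Euler-Maruyama trajectory. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    U = lambda x, l: 0.5 * k * (x - l) ** 2
    x, l = 0.0, 0.0
    U0, W, Q = U(x, l), 0.0, 0.0
    for _ in range(steps):
        # work: energy injected by moving the trap at fixed particle position
        l_new = l + vmax * dt
        W += U(x, l_new) - U(x, l)
        l = l_new
        # heat: energy exchanged with the bath as the particle moves at fixed l
        x_new = x - k * (x - l) * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
        Q += U(x_new, l) - U(x, l)
        x = x_new
    return U(x, l) - U0, W, Q
```

By construction the two increments telescope, so the trajectory-level first law is satisfied exactly (up to floating-point error), while W and Q individually fluctuate from run to run as the review's fluctuation theorems describe.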
Sun, Xiankai; Yariv, Amnon
2008-06-09
We have developed a theory that unifies the analysis of the modal properties of surface-emitting chirped circular grating lasers. This theory is based on solving the resonance conditions, which involve two types of reflectivities of chirped circular gratings. This approach is shown to agree with previous derivations that use the characteristic equations. Utilizing this unified analysis, we obtain the modal properties of circular DFB lasers, disk Bragg resonator lasers, and ring Bragg resonator lasers. We also compare the threshold gain, single-mode range, quality factor, emission efficiency, and modal area of these types of circular grating lasers. It is demonstrated that, under similar conditions, disk Bragg resonator lasers have the highest quality factor, the highest emission efficiency, and the smallest modal area, indicating their suitability for low-threshold, high-efficiency, ultracompact laser design, while ring Bragg resonator lasers have a large single-mode range, high emission efficiency, and large modal area, indicating their suitability for high-efficiency, large-area, high-power applications.
A unified account of tilt illusions, association fields, and contour detection based on elastica.
Keemink, Sander W; van Rossum, Mark C W
2016-09-01
As expressed in the Gestalt law of good continuation, human perception tends to associate stimuli that form smooth continuations. Contextual modulation in primary visual cortex, in the form of association fields, is believed to play an important role in this process. Yet a unified and principled account of the good continuation law on the neural level is lacking. In this study we introduce a population model of primary visual cortex. Its contextual interactions depend on the elastica curvature energy of the smoothest contour connecting oriented bars. As expected, this model leads to association fields consistent with data. However, in addition the model displays tilt illusions for stimulus configurations with gratings and single bars that closely match psychophysics. Furthermore, the model explains not only pop-out of contours amid a variety of backgrounds, but also pop-out of single targets amid a uniform background. We thus propose that elastica is a unifying principle of the visual cortical network. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as a waterfall or Rational Unified Process, as the framework for their development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…
The semiotics of medical image Segmentation.
Baxter, John S H; Gibson, Eli; Eagleson, Roy; Peters, Terry M
2018-02-01
As the interaction between clinicians and computational processes increases in complexity, more nuanced mechanisms are required to describe how their communication is mediated. Medical image segmentation in particular affords a large number of distinct loci for interaction which can act on a deep, knowledge-driven level which complicates the naive interpretation of the computer as a symbol processing machine. Using the perspective of the computer as dialogue partner, we can motivate the semiotic understanding of medical image segmentation. Taking advantage of Peircean semiotic traditions and new philosophical inquiry into the structure and quality of metaphors, we can construct a unified framework for the interpretation of medical image segmentation as a sign exchange in which each sign acts as an interface metaphor. This allows for a notion of finite semiosis, described through a schematic medium, that can rigorously describe how clinicians and computers interpret the signs mediating their interaction. Altogether, this framework provides a unified approach to the understanding and development of medical image segmentation interfaces. Copyright © 2017 Elsevier B.V. All rights reserved.
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
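A minimal version of the construction — nodes as grid points, edges wherever two series correlate strongly — fits in a few lines. The threshold and toy data below are our choices for illustration, not the paper's:

```python
import numpy as np

def correlation_network(series, threshold=0.5):
    """Build an unweighted 'climate network': rows of `series` are nodes
    (e.g. grid-point time series); edges link pairs whose absolute Pearson
    correlation exceeds `threshold`. Toy stand-in for the paper's method."""
    c = np.corrcoef(series)
    adj = (np.abs(c) > threshold) & ~np.eye(len(c), dtype=bool)
    return adj

def degree_field(adj):
    """Node degrees: one of the structural properties whose spatial pattern
    the paper interprets in climate terms."""
    return adj.sum(axis=1)
```

Clusters of such a network (densely connected node groups) can then be averaged into candidate climate indices, which is the predictive use the abstract describes.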
The thermodynamics of dense granular flow and jamming
NASA Astrophysics Data System (ADS)
Lu, Shih Yu
The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework can apply to grains, foams and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation of state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy paves a crucial road map for a unifying theoretical framework in condensed matter, ranging, for example, from sand to fire retardants to toothpaste.
Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve
2017-12-01
In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure, so that the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag, (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification, and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of the M²IL. Experiments and analyses in many practical applications prove the effectiveness of the M²IL.
Chen, Bor-Sen; Lin, Ying-Po
2013-01-01
In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear partial differential stochastic systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found that the phenotype robustness criterion for ecological networks is: if intrinsic robustness + environmental robustness ≤ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results for robust ecological networks are similar to those for robust gene regulatory networks and evolutionary networks, even though they have different spatiotemporal scales. PMID:23515112
Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data
Yang, Yan; Simpson, Douglas
2010-01-01
Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
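The simplest member of the inflated-mixture class is the zero-inflated Poisson, and its EM fit shows the shared computational skeleton the article builds on. This is our own minimal sketch (no covariates, no GEE extension, invented simulation parameters), not the authors' software:

```python
import numpy as np

def zip_em(y, iters=200):
    """EM for a zero-inflated Poisson: with probability pi an observation is
    a structural zero, otherwise it is Poisson(lam). Returns (pi, lam)."""
    y = np.asarray(y, dtype=float)
    pi, lam = 0.5, max(y.mean(), 0.1)
    for _ in range(iters):
        # E-step: posterior probability that each observed zero is structural
        p0 = pi / (pi + (1 - pi) * np.exp(-lam))
        z = np.where(y == 0, p0, 0.0)
        # M-step: update mixing weight and Poisson mean
        pi = z.mean()
        lam = (y * (1 - z)).sum() / (1 - z).sum()
    return pi, lam
```

Replacing the Poisson with another count or semi-continuous component changes only the E-step density, which is the sense in which a single framework covers the whole model class.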
Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.
Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda
2014-05-01
We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies, and suggest effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.
Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader
2014-01-01
Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345
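For the Poisson-likelihood case that links NMF to PLSI (the limiting member of the Rényi-divergence family studied here), the classical multiplicative updates fit in a few lines, and the monotonicity the paper proves in general can be checked numerically. A sketch on our own toy data, not the authors' implementation:

```python
import numpy as np

def nmf_kl(V, r, iters=500, seed=0):
    """Multiplicative updates for NMF under the generalized KL
    (Poisson-likelihood) divergence D(V || WH); each update is guaranteed
    not to increase the objective."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 0.1
    H = rng.random((r, m)) + 0.1
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H

def kl_div(V, WH, eps=1e-12):
    """Generalized KL divergence between nonnegative matrices."""
    return float((V * np.log((V + eps) / (WH + eps)) - V + WH).sum())
```

Because the updates are deterministic for a fixed seed, running more iterations continues the same monotone sequence, which is the property the paper establishes rigorously for the whole divergence family.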
Fallah, Parisa Nicole; Bernstein, Mark
2017-09-07
Access to adequate surgical care is limited globally, particularly in low- and middle-income countries (LMICs). To address this issue, surgeons are becoming increasingly involved in international surgical teaching collaborations (ISTCs), which include educational partnerships between surgical teams in high-income countries and those in LMICs. The purpose of this study is to determine a framework for unifying, systematizing, and improving the quality of ISTCs so that they can better address the global surgical need. A convenience sample of 68 surgeons, anesthesiologists, physicians, residents, nurses, academics, and administrators from the U.S., Canada, and Norway was used for the study. Participants all had some involvement in ISTCs and came from multiple specialties and institutions. Qualitative methodology was used, and participants were interviewed using a pre-determined set of open-ended questions. Data was gathered over two months either in-person, over the phone, or on Skype. Data was evaluated using thematic content analysis. To organize and systematize ISTCs, participants reported a need for a centralized/systematized process with designated leaders, a universal data bank of current efforts/progress, communication amongst involved parties, full-time administrative staff, dedicated funds, a scholarly approach, increased use of technology, and more research on needs and outcomes. By taking steps towards unifying and systematizing ISTCs, the quality of ISTCs can be improved. This could lead to an advancement in efforts to increase access to surgical care worldwide.
Left Handed Materials Based on Magnetic Nanocomposites
2006-10-18
…developed a theory that unifies DNMs and SNMs as a function of two fundamental material parameters: the quality factors for permittivity (Qε = ε′/ε″) and permeability (Qμ = μ′/μ″) … simultaneously negative effective permeability μeff and permittivity εeff to form LHM, or only a single negative parameter (SNM) to form negative-indexed …
Unified Engineering Software System
NASA Technical Reports Server (NTRS)
Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.
1989-01-01
Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.
Multi-scale graph-cut algorithm for efficient water-fat separation.
Berglund, Johan; Skorpil, Mikael
2017-09-01
To improve the accuracy and robustness to noise in water-fat separation by unifying the multiscale and graph-cut based approaches to B0 correction. A previously proposed water-fat separation algorithm that corrects for B0 field inhomogeneity in 3D by a single quadratic pseudo-Boolean optimization (QPBO) graph cut was incorporated into a multi-scale framework, where field map solutions are propagated from coarse to fine scales for voxels that are not resolved by the graph cut. The accuracy of the single-scale and multi-scale QPBO algorithms was evaluated against benchmark reference datasets. The robustness to noise was evaluated by adding noise to the input data prior to water-fat separation. Both algorithms achieved the highest accuracy when compared with seven previously published methods, while computation times were acceptable for implementation in clinical routine. The multi-scale algorithm was more robust to noise than the single-scale algorithm, while causing only a small increase (+10%) of the reconstruction time. The proposed 3D multi-scale QPBO algorithm offers accurate water-fat separation, robustness to noise, and fast reconstruction. The software implementation is freely available to the research community. Magn Reson Med 78:941-949, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
Hilltop supernatural inflation and SUSY unified models
NASA Astrophysics Data System (ADS)
Kohri, Kazunori; Lim, C. S.; Lin, Chia-Min; Mimura, Yukihiro
2014-01-01
In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including a right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is ns = 0.96, which fits very well with recent data from the Planck satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.
Huang, Zhen
2017-01-01
This paper uses experimental investigation and theoretical derivation to study the unified failure mechanism and ultimate capacity model of reinforced concrete (RC) members under combined axial, bending, shear and torsion loading. Fifteen RC members are tested under different combinations of compressive axial force, bending, shear and torsion using experimental equipment designed by the authors. The failure mechanism and ultimate strength data for the four groups of tested RC members under different combined loading conditions are investigated and discussed in detail. The experimental research seeks to determine how the ultimate strength of RC members changes with changing combined loads. Based on this research, a unified theoretical model is established by determining the shape of the warped failure surface, assuming an appropriate stress distribution on the failure surface, and considering the equilibrium conditions. This unified failure model reduces reasonably and systematically to well-known failure theories of concrete members under single or combined loading. The unified calculation model could be easily used in design applications with some assumptions and simplifications. Finally, the accuracy of this theoretical unified model is verified by comparison with experimental results. PMID:28414777
Emotion and the prefrontal cortex: An integrative review.
Dixon, Matthew L; Thiruchselvam, Ravi; Todd, Rebecca; Christoff, Kalina
2017-10-01
The prefrontal cortex (PFC) plays a critical role in the generation and regulation of emotion. However, we lack an integrative framework for understanding how different emotion-related functions are organized across the entire expanse of the PFC, as prior reviews have generally focused on specific emotional processes (e.g., decision making) or specific anatomical regions (e.g., orbitofrontal cortex). Additionally, psychological theories and neuroscientific investigations have proceeded largely independently because of the lack of a common framework. Here, we provide a comprehensive review of functional neuroimaging, electrophysiological, lesion, and structural connectivity studies on the emotion-related functions of 8 subregions spanning the entire PFC. We introduce the appraisal-by-content model, which provides a new framework for integrating the diverse range of empirical findings. Within this framework, appraisal serves as a unifying principle for understanding the PFC's role in emotion, while relative content-specialization serves as a differentiating principle for understanding the role of each subregion. A synthesis of data from affective, social, and cognitive neuroscience studies suggests that different PFC subregions are preferentially involved in assigning value to specific types of inputs: exteroceptive sensations, episodic memories and imagined future events, viscero-sensory signals, viscero-motor signals, actions, others' mental states (e.g., intentions), self-related information, and ongoing emotions. We discuss the implications of this integrative framework for understanding emotion regulation, value-based decision making, emotional salience, and refining theoretical models of emotion. This framework provides a unified understanding of how emotional processes are organized across PFC subregions and generates new hypotheses about the mechanisms underlying adaptive and maladaptive emotional functioning. 
(PsycINFO Database Record (c) 2017 APA, all rights reserved).
Interprofessional Care and Collaborative Practice.
ERIC Educational Resources Information Center
Casto, R. Michael; And Others
This book provides materials for those learning about the dynamics, techniques, and potential of interprofessional collaboration in health care and human services professions. Eight case studies thread their way through most chapters to unify and illustrate the text. Part 1 addresses the theoretical framework that forms the basis for…
Mean Comparison: Manifest Variable versus Latent Variable
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Bentler, Peter M.
2006-01-01
Unified Framework for Deriving Simultaneous Equation Algorithms for Water Distribution Networks
The known formulations for steady state hydraulics within looped water distribution networks are re-derived in terms of linear and non-linear transformations of the original set of partly linear and partly non-linear equations that express conservation of mass and energy. All of ...
Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension
ERIC Educational Resources Information Center
Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias
2013-01-01
We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…
NASA Astrophysics Data System (ADS)
Honing, Henkjan; Zuidema, Willem
2014-09-01
The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.
RosettaRemodel: A Generalized Framework for Flexible Backbone Protein Design
Huang, Po-Ssu; Ban, Yih-En Andrew; Richter, Florian; Andre, Ingemar; Vernon, Robert; Schief, William R.; Baker, David
2011-01-01
We describe RosettaRemodel, a generalized framework for flexible protein design that provides a versatile and convenient interface to the Rosetta modeling suite. RosettaRemodel employs a unified interface, called a blueprint, which allows detailed control over many aspects of flexible backbone protein design calculations. RosettaRemodel allows the construction and elaboration of customized protocols for a wide range of design problems ranging from loop insertion and deletion, disulfide engineering, domain assembly, loop remodeling, motif grafting, symmetrical units, to de novo structure modeling. PMID:21909381
Pricing foreign equity option with stochastic volatility
NASA Astrophysics Data System (ADS)
Sun, Qi; Xu, Weidong
2015-11-01
In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of the foreign equity option, and the closed-form pricing formula is obtained through characteristic function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on foreign equity option prices.
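The paper's time-changed Lévy machinery is not reproduced here; as a minimal sketch of the characteristic-function pricing idea, the following prices a vanilla call under plain Black-Scholes by Gil-Pelaez inversion and cross-checks it against the closed form (all parameter values are illustrative):

```python
import cmath, math

# Hypothetical parameters; Black-Scholes stands in for the paper's
# time-changed Levy dynamics to keep the sketch self-contained.
S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.25, 1.0

def cf(u):
    """Characteristic function of ln(S_T) under Black-Scholes."""
    mean = math.log(S0) + (r - 0.5 * sigma**2) * T
    return cmath.exp(1j * u * mean - 0.5 * sigma**2 * u**2 * T)

def probability(j, n=20000, du=0.01):
    """Gil-Pelaez inversion for the exercise probabilities P1 (j=1), P2 (j=2)."""
    lnK = math.log(K)
    total = 0.0
    for k in range(n):                      # midpoint rule on (0, n*du)
        u = (k + 0.5) * du
        f = cf(u - 1j) / cf(-1j) if j == 1 else cf(u)
        total += (cmath.exp(-1j * u * lnK) * f / (1j * u)).real * du
    return 0.5 + total / math.pi

call = S0 * probability(1) - K * math.exp(-r * T) * probability(2)

# Cross-check against the closed-form Black-Scholes price.
N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
bs = S0 * N(d1) - K * math.exp(-r * T) * N(d1 - sigma * math.sqrt(T))
```

The same inversion carries over to any model whose characteristic function is known in closed form, which is the point of the framework.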
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2014-05-01
To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based framework model for distributed application processing and workflow execution is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Messaging Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistent-messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk to each other and to ActiveMQ easily. To use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the implementation of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, one and only one dedicated zookeeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used in configuration and in communication between machines and programs. The framework is platform independent. Although the framework is built in Python, the actual workflow programs or jobs can be implemented in any programming language.
The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It is also possible to extend the framework to monitoring the IDC pipeline. The detailed design, implementation, conclusions and future work of the proposed framework will be presented.
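The record does not publish its actual message schema; a hypothetical sketch of the idea, a JSON job description wrapped in a raw STOMP 1.2 SEND frame, might look like this (all field names, destinations and paths are invented for illustration):

```python
import json

def stomp_send_frame(destination, payload):
    """Build a raw STOMP 1.2 SEND frame carrying a JSON body."""
    body = json.dumps(payload)
    headers = [
        f"destination:{destination}",
        "content-type:application/json",
        f"content-length:{len(body.encode('utf-8'))}",
    ]
    # STOMP frame: COMMAND, header lines, blank line, body, NUL terminator
    return "SEND\n" + "\n".join(headers) + "\n\n" + body + "\x00"

# Hypothetical workflow-job message; the schema is illustrative only.
job = {
    "workflow": "seismic-daily",
    "task": "filter",
    "machine": "node-03",
    "args": {"input": "/data/run42.sac", "band": [0.5, 5.0]},
}
frame = stomp_send_frame("/queue/node-03.tasks", job)
```

In practice such a frame would be sent through a STOMP client connected to the ActiveMQ broker; building it by hand merely shows how little wire-format machinery the protocol requires.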
RANZCR Body Systems Framework of diagnostic imaging examination descriptors.
Pitman, Alexander G; Penlington, Lisa; Doromal, Darren; Slater, Gregory; Vukolova, Natalia
2014-08-01
A unified and logical system of descriptors for diagnostic imaging examinations and procedures is a desirable resource for radiology in Australia and New Zealand and is needed to support core activities of RANZCR. Existing descriptor systems available in Australia and New Zealand (including the Medicare DIST and the ACC Schedule) have significant limitations and are inappropriate for broader clinical application. An anatomically based grid was constructed, with anatomical structures arranged in rows and diagnostic imaging modalities arranged in columns (including nuclear medicine and positron emission tomography). The grid was segregated into five body systems. The cells at the intersection of an anatomical structure row and an imaging modality column were populated with short, formulaic descriptors of the applicable diagnostic imaging examinations. Clinically illogical or physically impossible combinations were 'greyed out'. Where the same examination applied to different anatomical structures, the descriptor was kept identical for the purposes of streamlining. The resulting Body Systems Framework of diagnostic imaging examination descriptors lists all the reasonably common diagnostic imaging examinations currently performed in Australia and New Zealand using a unified grid structure allowing navigation by both referrers and radiologists. The Framework has been placed on the RANZCR website and is available for access free of charge by registered users. The Body Systems Framework of diagnostic imaging examination descriptors is a system of descriptors based on relationships between anatomical structures and imaging modalities. The Framework is now available as a resource and reference point for the radiology profession and to support core College activities. © 2014 The Royal Australian and New Zealand College of Radiologists.
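The grid itself is not reproduced in the record; a minimal, hypothetical sketch of the anatomy-by-modality structure (all anatomy names, modality codes and descriptors are invented for illustration) could look like:

```python
# Hypothetical anatomy-by-modality grid; None marks a clinically
# illogical or physically impossible combination ("greyed out").
GRID = {
    "Chest":   {"XR": "XR chest", "US": None,         "CT": "CT chest", "MRI": "MRI chest"},
    "Thyroid": {"XR": None,       "US": "US thyroid", "CT": "CT neck",  "MRI": "MRI neck"},
}

def descriptor(anatomy, modality):
    """Look up the short formulaic descriptor for an examination."""
    d = GRID[anatomy][modality]
    if d is None:
        raise ValueError(f"{modality} of {anatomy} is not a valid examination")
    return d
```

The two-key lookup mirrors how a referrer would navigate the published grid: pick the anatomical row, then the modality column.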
Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin
2014-10-06
We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.
A stochastically fully connected conditional random field framework for super resolution OCT
NASA Astrophysics Data System (ADS)
Boroomand, A.; Tan, B.; Wong, A.; Bizheva, K.
2017-02-01
A number of factors can degrade the resolution and contrast of OCT images, such as: (1) changes of the OCT point-spread function (PSF) resulting from wavelength-dependent scattering and absorption of light along the imaging depth; (2) speckle noise; and (3) motion artifacts. We propose a new Super Resolution OCT (SR OCT) imaging framework that takes advantage of a Stochastically Fully Connected Conditional Random Field (SF-CRF) model to generate a Super Resolved OCT (SR OCT) image of higher quality from a set of Low-Resolution OCT (LR OCT) images. The proposed SF-CRF SR OCT imaging is able to compensate simultaneously for all of the image-degrading factors mentioned above using a unified computational framework. The proposed SF-CRF SR OCT imaging framework was tested on a set of simulated LR human retinal OCT images generated from a high resolution, high contrast retinal image, and on a set of in-vivo, high resolution, high contrast rat retinal OCT images. The reconstructed SR OCT images show considerably higher spatial resolution, less speckle noise and higher contrast compared to other tested methods. Visual assessment of the results demonstrated the usefulness of the proposed approach in better preservation of fine details and structures of the imaged sample, retaining biological tissue boundaries while reducing speckle noise using a unified computational framework. Quantitative evaluation using both the Contrast to Noise Ratio (CNR) and the Edge Preservation (EP) parameter also showed superior performance of the proposed SF-CRF SR OCT approach compared to other image processing approaches.
Buetow, S; Adair, V; Coster, G; Hight, M; Gribben, B; Mitchell, E
2002-12-01
Different sets of literature suggest how aspects of practice time management can limit access to general practitioner (GP) care. Researchers have not organised this knowledge into a unified framework that can enhance understanding of barriers to, and opportunities for, improved access. To suggest a framework conceptualising how differences in professional and cultural understanding of practice time management in Auckland, New Zealand, influence access to GP care for children with chronic asthma. A qualitative study involving selective sampling, semi-structured interviews on barriers to access, and a general inductive approach. Twenty-nine key informants and ten mothers of children with chronic, moderate to severe asthma and poor access to GP care in Auckland. Development of a framework from themes describing barriers associated with, and needs for, practice time management. The themes were independently identified by two authors from transcribed interviews and confirmed through informant checking. Themes from key informant and patient interviews were triangulated with each other and with published literature. The framework distinguishes 'practice-centred time' from 'patient-centred time.' A predominance of 'practice-centred time' and an unmet opportunity for 'patient-centred time' are suggested by the persistence of five barriers to accessing GP care: limited hours of opening; traditional appointment systems; practice intolerance of missed appointments; long waiting times in the practice; and inadequate consultation lengths. None of the barriers is specific to asthmatic children. A unified framework was suggested for understanding how the organisation of practice work time can influence access to GP care by groups including asthmatic children.
Mechanic: The MPI/HDF code framework for dynamical astronomy
NASA Astrophysics Data System (ADS)
Słonina, Mariusz; Goździewski, Krzysztof; Migaszewski, Cezary
2015-01-01
We introduce the Mechanic, a new open-source code framework. It is designed to reduce the development effort of scientific applications by providing a unified API (Application Programming Interface) for configuration, data storage and task management. The communication layer is based on the well-established Message Passing Interface (MPI) standard, which is widely used on a variety of parallel computers and CPU clusters. The data storage is performed within the Hierarchical Data Format (HDF5). The design of the code follows a core-module approach which reduces the user's codebase and makes it portable to single- and multi-CPU environments. The framework may be used in a local user's environment, without administrative access to the cluster, under the PBS or Slurm job schedulers. It may become a helper tool for a wide range of astronomical applications, particularly those focused on processing large data sets, such as dynamical studies of the long-term orbital evolution of planetary systems with Monte Carlo methods, dynamical maps or evolutionary algorithms. It has already been applied in numerical experiments conducted for the Kepler-11 (Migaszewski et al., 2012) and ν Octantis (Goździewski et al., 2013) planetary systems. In this paper we describe the basics of the framework, including code listings for the implementation of a sample user's module. The code is illustrated on a model Hamiltonian introduced by Froeschlé et al. (2000) presenting the Arnold diffusion. The Arnold web is shown with the help of the MEGNO (Mean Exponential Growth of Nearby Orbits) fast indicator (Goździewski et al., 2008a) applied to the symplectic SABAn family of integrators (Laskar and Robutel, 2001).
A Unified Matrix Polynomial Approach to Modal Identification
NASA Astrophysics Data System (ADS)
Allemang, R. J.; Brown, D. L.
1998-04-01
One important current focus of modal identification is a reformulation of modal parameter estimation algorithms into a single, consistent mathematical formulation with a corresponding set of definitions and unifying concepts. In particular, a matrix polynomial approach is used to unify the presentation with respect to current algorithms such as the least-squares complex exponential (LSCE), polyreference time domain (PTD), Ibrahim time domain (ITD), eigensystem realization algorithm (ERA), rational fraction polynomial (RFP), polyreference frequency domain (PFD) and complex mode indication function (CMIF) methods. Using this unified matrix polynomial approach (UMPA) allows a discussion of the similarities and differences of the commonly used methods. The use of least squares (LS), total least squares (TLS), double least squares (DLS) and singular value decomposition (SVD) methods is discussed in order to take advantage of redundant measurement data. Eigenvalue and SVD transformation methods are utilized to reduce the effective size of the resulting eigenvalue-eigenvector problem as well.
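As a toy illustration of the polynomial idea behind LSCE-type methods (not the full multi-reference UMPA formulation), the following fits the characteristic polynomial of a single synthetic mode by least squares and recovers its frequency from the polynomial roots; all numeric values are illustrative:

```python
import numpy as np

# Synthesize a noiseless free decay for one mode: 50 Hz, light damping.
dt = 1e-3
n = np.arange(200)
s_true = -5.0 + 2j * np.pi * 50.0                    # continuous-time pole
h = (np.exp(s_true * n * dt) + np.exp(np.conj(s_true) * n * dt)).real

# Least-squares fit of the order-2 characteristic polynomial:
#   h[k+2] + c1*h[k+1] + c0*h[k] = 0
A = np.column_stack([h[:-2], h[1:-1]])
c0, c1 = np.linalg.lstsq(A, -h[2:], rcond=None)[0]

# Polynomial roots -> discrete poles -> continuous poles -> modal frequency.
z = np.roots([1.0, c1, c0])
poles = np.log(z) / dt
f_est = float(np.abs(poles.imag).max()) / (2 * np.pi)
```

Higher polynomial orders and matrix-valued coefficients extend the same root-finding step to multiple modes and multiple references, which is the unification the paper describes.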
Explanatory pluralism: An unrewarding prediction error for free energy theorists.
Colombo, Matteo; Wright, Cory
2017-03-01
Courtesy of its free energy formulation, the hierarchical predictive processing theory of the brain (PTB) is often claimed to be a grand unifying theory. To test this claim, we examine a central case: activity of mesocorticolimbic dopaminergic (DA) systems. After reviewing the three most prominent hypotheses of DA activity-the anhedonia, incentive salience, and reward prediction error hypotheses-we conclude that the evidence currently vindicates explanatory pluralism. This vindication implies that the grand unifying claims of advocates of PTB are unwarranted. More generally, we suggest that the form of scientific progress in the cognitive sciences is unlikely to be a single overarching grand unifying theory. Copyright © 2016 Elsevier Inc. All rights reserved.
Teacher Preparation for Vocational Education and Training in Germany: A Potential Model for Canada?
ERIC Educational Resources Information Center
Barabasch, Antje; Watt-Malcolm, Bonnie
2013-01-01
Germany's vocational education and training (VET) and corresponding teacher-education programmes are known worldwide for their integrated framework. Government legislation unifies companies, unions and vocational schools, and specifies the education and training required for students as well as vocational teachers. Changing from the Diplom…
The Unified Plant Growth Model (UPGM): software framework overview and model application
USDA-ARS?s Scientific Manuscript database
Since the Environmental Policy Integrated Climate (EPIC) model was developed in 1989, the EPIC plant growth component has been incorporated into other erosion and crop management models (e.g., WEPS, WEPP, SWAT, ALMANAC, and APEX) and modified to meet model developer research objectives. This has re...
The Importance of Culture for Developmental Science
ERIC Educational Resources Information Center
Keller, Heidi
2012-01-01
In this essay, it is argued that a general understanding of human development needs a unified framework based on evolutionary theorizing and cross-cultural and cultural anthropological approaches. An eco-social model of development has been proposed that defines cultural milieus as adaptations to specific socio-demographic contexts. Ontogenetic…
2009-08-19
SSDS Ship Self Defense System; TSTS Total Ship Training System; UDDI Universal Description, Discovery, and Integration; UML Unified Modeling… "ContractorOrganization" type="ContractorOrganizationType"> <xs:annotation> <xs:documentation>Identifies a contractor organization responsible for the
ERIC Educational Resources Information Center
Hwang, Heungsun; Montreal, Hec; Dillon, William R.; Takane, Yoshio
2006-01-01
An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…
USDA-ARS?s Scientific Manuscript database
Biological diversity is a key concept in the life sciences and plays a fundamental role in many ecological and evolutionary processes. Although biodiversity is inherently a hierarchical concept covering different levels of organization (genes, population, species, ecological communities and ecosyst...
The Theory behind the Theory in DCT and SCDT: A Response to Rigazio-DiGilio.
ERIC Educational Resources Information Center
Terry, Linda L.
1994-01-01
Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Discusses hidden complexities in cognitive-developmental ecosystemic integration and…
Potential of DCT/SCDT in Addressing Two Elusive Themes of Mental Health Counseling.
ERIC Educational Resources Information Center
Borders, L. DiAnne
1994-01-01
Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Considers extent to which model breaks impasse in integrating development into counseling…
Converging Instructional Technology and Critical Intercultural Pedagogy in Teacher Education
ERIC Educational Resources Information Center
Pittman, Joyce
2007-01-01
Purpose: This paper aims to postulate an emerging unified cultural-convergence framework to converge the delivery of instructional technology and intercultural education (ICE) that extends beyond web-learning technologies to inculcate inclusive pedagogy in teacher education. Design/methodology/approach: The paper explores the literature and a…
Spending on School Infrastructure: Does Money Matter?
ERIC Educational Resources Information Center
Crampton, Faith E.
2009-01-01
Purpose: The purpose of this study is to further develop an emerging thread of quantitative research that grounds investment in school infrastructure in a unified theoretical framework of investment in human, social, and physical capital. Design/methodology/approach: To answer the research question, what is the impact of investment in human,…
Simultaneous Two-Way Clustering of Multiple Correspondence Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Dillon, William R.
2010-01-01
A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…
ERIC Educational Resources Information Center
Arnhart, Larry
2006-01-01
Be it metaphysics, theology, or some other unifying framework, humans have long sought to determine "first principles" underlying knowledge. Larry Arnhart continues in this vein, positing a Darwinian web of genetic, cultural, and cognitive evolution to explain our social behavior in terms of human nature as governed by biology. He leaves it to us…
Unified, Insular, Firmly Policed, or Fractured, Porous, Contested, Gifted Education?
ERIC Educational Resources Information Center
Ambrose, Don; VanTassel-Baska, Joyce; Coleman, Laurence J.; Cross, Tracy L.
2010-01-01
Much like medieval, feudal nations, professional fields such as gifted education can take shape as centralized kingdoms with strong armies controlling their compliant populations and protecting closed borders, or as loose collections of conflict-prone principalities with borders open to invaders. Using an investigative framework borrowed from an…
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
PERCH: A Unified Framework for Disease Gene Prioritization.
Feng, Bing-Jian
2017-03-01
To interpret genetic variants discovered from next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for a reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by the simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of uncertain significance (VUS) by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.
Fatima, Iram; Fahim, Muhammad; Lee, Young-Koo; Lee, Sungyoung
2013-01-01
In recent years, activity recognition in smart homes has been an active research area due to its applicability in many applications, such as assistive living and healthcare. Besides activity recognition, the information collected from smart homes has great potential for other application domains like lifestyle analysis, security and surveillance, and interaction monitoring. Therefore, discovery of users' common behaviors and prediction of future actions from past behaviors becomes an important step towards allowing an environment to provide personalized service. In this paper, we develop a unified framework for activity recognition-based behavior analysis and action prediction. For this purpose, we first propose a kernel fusion method for accurate activity recognition and then identify the significant sequential behaviors of inhabitants from the recognized activities of their daily routines. Moreover, behavior patterns are further utilized to predict future actions from past activities. To evaluate the proposed framework, we performed experiments on two real datasets. The results show an average improvement of 13.82% in the accuracy of recognized activities, along with the extraction of significant behavioral patterns and precise activity predictions with a 6.76% increase in F-measure. All this collectively helps in understanding users' actions to gain knowledge about their habits and preferences. PMID:23435057
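The paper's kernel-fusion recognizer is not reproduced here; as a minimal stand-in for the behavior-mining and prediction steps, a first-order Markov model over a toy activity log already predicts the most likely next action (the log and activity names are invented):

```python
from collections import Counter, defaultdict

# Toy daily-routine log; real input would be the recognized activity stream.
log = ["wake", "coffee", "work", "lunch", "work", "coffee", "work", "dinner",
       "tv", "sleep", "wake", "coffee", "work", "lunch", "work", "dinner"]

# Count observed transitions between consecutive activities.
counts = defaultdict(Counter)
for prev, nxt in zip(log, log[1:]):
    counts[prev][nxt] += 1

def predict(activity):
    """Return the most frequent follower of the given activity."""
    return counts[activity].most_common(1)[0][0]
```

The paper's sequential-pattern mining is considerably richer, but the same idea, conditioning the next action on recognized history, underlies the prediction step.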
Ghanbari, Yasser; Smith, Alex R.; Schultz, Robert T.; Verma, Ragini
2014-01-01
Diffusion tensor imaging (DTI) offers rich insights into the physical characteristics of white matter (WM) fiber tracts and their development in the brain, facilitating a network representation of the brain's traffic pathways. Such a network representation of brain connectivity has provided a novel means of investigating brain changes arising from pathology, development or aging. The high dimensionality of these connectivity networks necessitates the development of methods that identify the connectivity building blocks or sub-network components that characterize the underlying variation in the population. In addition, the projection of the subject networks onto the basis set provides a low-dimensional representation that teases apart the different sources of variation in the sample, facilitating variation-specific statistical analysis. We propose a unified framework of non-negative matrix factorization and graph embedding for learning sub-network patterns of connectivity by their projective non-negative decomposition into a reconstructive basis set, as well as additional basis sets representing variational sources in the population, like age and pathology. The proposed framework is applied to a study of diffusion-based connectivity in subjects with autism and shows localized sparse sub-networks which mostly capture the changes related to pathology and developmental variations. PMID:25037933
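The non-negative decomposition at the core of such a framework can be sketched with plain Lee-Seung multiplicative updates under a Frobenius loss; the paper's projective variant and graph-embedding regularization terms are omitted here, and all dimensions and data are illustrative.

```python
import numpy as np

def nmf(X, k, n_iter=200, seed=0, eps=1e-9):
    """Basic non-negative matrix factorization X ~ W @ H via Lee-Seung
    multiplicative updates (Frobenius loss). Columns of W play the role of
    non-negative basis 'sub-networks'; H holds per-subject loadings."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k))
    H = rng.random((k, m))
    for _ in range(n_iter):
        # Multiplicative updates keep W and H entrywise non-negative.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy 'connectivity' matrix: each column is one subject's vectorized network.
rng = np.random.default_rng(1)
X = rng.random((20, 10))
W, H = nmf(X, k=3)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)  # relative reconstruction error
```

In the full framework, additional basis sets and graph-embedding penalties would be added to the objective so that some components track variables such as age or diagnosis.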
Inferring fitness landscapes and selection on phenotypic states from single-cell genealogical data
Kussell, Edo
2017-01-01
Recent advances in single-cell time-lapse microscopy have revealed non-genetic heterogeneity and temporal fluctuations of cellular phenotypes. While different phenotypic traits such as abundance of growth-related proteins in single cells may have differential effects on the reproductive success of cells, rigorous experimental quantification of this process has remained elusive due to the complexity of single cell physiology within the context of a proliferating population. We introduce and apply a practical empirical method to quantify the fitness landscapes of arbitrary phenotypic traits, using genealogical data in the form of population lineage trees which can include phenotypic data of various kinds. Our inference methodology for fitness landscapes determines how reproductive success is correlated with cellular phenotypes, and provides a natural generalization of bulk growth rate measures for single-cell histories. Using this technique, we quantify the strength of selection acting on different cellular phenotypic traits within populations, which allows us to determine whether a change in population growth is caused by individual cells' response, selection within a population, or by a mixture of these two processes. By applying these methods to single-cell time-lapse data of growing bacterial populations that express a resistance-conferring protein under antibiotic stress, we show how the distributions, fitness landscapes, and selection strength of single-cell phenotypes are affected by the drug. Our work provides a unified and practical framework for quantitative measurements of fitness landscapes and selection strength for any statistical quantities definable on lineages, and thus elucidates the adaptive significance of phenotypic states in time series data. The method is applicable in diverse fields, from single cell biology to stem cell differentiation and viral evolution. PMID:28267748
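The kind of lineage-based inference described above rests on comparing two ways of weighting single-cell histories: a "chronological" weighting, in which a lineage is traced forward by picking one daughter at random at each division (weight 2^-d for d divisions), and a "retrospective" weighting, uniform over the cells actually observed at the end. A shift of a trait's mean between the two distributions signals selection on that trait. The toy numbers below are invented for illustration; the full method's fitness landscapes and time-resolved statistics are not reproduced.

```python
import numpy as np

# Each recorded lineage i ends at the observation time with division count d_i
# and a phenotypic trait value x_i (hypothetical values, correlated by design).
d = np.array([3, 3, 2, 4, 3, 2, 3, 4])                   # divisions per lineage
x = np.array([1.1, 0.9, 0.4, 1.5, 1.0, 0.5, 1.2, 1.6])   # trait values

p_chr = 2.0 ** (-d)
p_chr /= p_chr.sum()                  # chronological weights (coin flip at each division)
p_ret = np.full(len(d), 1.0 / len(d))  # retrospective weights (uniform over observed cells)

# Positive shift: the retrospective distribution over-represents fast dividers,
# so a trait correlated with division rate is under positive selection.
shift = (p_ret * x).sum() - (p_chr * x).sum()
```

Here the trait is deliberately correlated with division count, so the retrospective mean exceeds the chronological mean and `shift` is positive.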
Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S
2016-12-01
We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly, taking into consideration advances in multi-resolution analysis and model-based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A movement ecology paradigm for unifying organismal movement research
Nathan, Ran; Getz, Wayne M.; Revilla, Eloy; Holyoak, Marcel; Kadmon, Ronen; Saltz, David; Smouse, Peter E.
2008-01-01
Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. "Now we must consider in general the common reason for moving with any movement whatever." (Aristotle, De Motu Animalium, 4th century B.C.) PMID:19060196
War-gaming application for future space systems acquisition
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.
2016-05-01
Recently the U.S. Department of Defense (DOD) released the Defense Innovation Initiative (DII) [1] to focus DOD on five key aspects: Aspect #1, recruit talented and innovative people; Aspect #2, reinvigorate war-gaming; Aspect #3, initiate long-range research and development programs; Aspect #4, make DOD practices more innovative; and Aspect #5, advance technology and new operational concepts. In line with the DII, this paper concentrates on Aspect #2 and Aspect #4 by reinvigorating the war-gaming effort with a focus on an innovative approach for developing the optimum Program and Technical Baselines (PTBs) and their corresponding optimum acquisition strategies for acquiring future space systems. The paper describes a unified approach for applying the war-gaming concept to future DOD acquisition of space systems. The proposed approach includes a Unified Game-based Acquisition Framework (UGAF) and an Advanced Game-Based Mathematical Framework (AGMF) using Bayesian war-gaming engines to optimize PTB solutions and select the corresponding optimum acquisition strategies for acquiring a space system. The framework defines the action space for all players with a complete description of the elements associated with the games, including the Department of Defense Acquisition Authority (DAA), stakeholders, warfighters, and potential contractors; War-Gaming Engines (WGEs) played by the DAA; WGEs played by the Contractor (KTR); and the players' Payoff and Cost functions (PCFs). The AGMF presented here addresses both complete and incomplete information cases. The proposed framework provides a recipe for the DAA and the USAF Space and Missile Systems Center (SMC) to acquire future space systems optimally.
NASA Astrophysics Data System (ADS)
Lin, S. J.
2015-12-01
The NOAA Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions (e.g., tornado outbreak events and category-5 hurricanes) and ultra-high-resolution (1-km) regional climate simulations within a consistent global modeling framework. The foundation of this flexible regional-global modeling system is the non-hydrostatic extension of the vertically Lagrangian dynamical core (Lin 2004, Monthly Weather Review), known in the community as FV3 (finite-volume on the cubed-sphere). Because of its flexibility and computational efficiency, FV3 is one of the final candidates for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched (single) grid capability, a two-way (regional-global) multiple nested grid capability, and the combination of the stretched grid and two-way nests, so as to make convection-resolving regional climate simulation within a consistent global modeling system feasible on today's high-performance computing systems. One of our main scientific goals is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, previously regarded as impossible. In this presentation I will demonstrate that it is computationally feasible to simulate not only supercell thunderstorms but also the subsequent genesis of tornadoes using a global model that was originally designed for century-long climate simulations. As a unified weather-climate modeling system, we evaluated the performance of the model at horizontal resolutions ranging from 1 km to as low as 200 km.
In particular, for downscaling studies, we have developed various tests to ensure that the large-scale circulation within the global variable-resolution system is well simulated while, at the same time, small-scale features can be accurately captured within the targeted high-resolution region.
Evaluation of maternal and neonatal hospital care: quality index of completeness
da Silva, Ana Lúcia Andrade; Mendes, Antonio da Cruz Gouveia; Miranda, Gabriella Morais Duarte; de Sá, Domicio Aurélio; de Souza, Wayner Vieira; Lyra, Tereza Maciel
2014-01-01
OBJECTIVE Develop an index to evaluate the maternal and neonatal hospital care of the Brazilian Unified Health System. METHODS This descriptive cross-sectional study of national scope was based on the structure-process-outcome framework proposed by Donabedian and on comprehensive health care. Data from the Hospital Information System and the National Registry of Health Establishments were used. The maternal and neonatal network of the Brazilian Unified Health System consisted of 3,400 hospitals that performed at least 12 deliveries in 2009 or whose number of deliveries represented 10.0% or more of the total admissions in 2009. Relevance and reliability were defined as criteria for the selection of variables. Simple and composite indicators and the index of completeness were constructed and evaluated, and the distribution of maternal and neonatal hospital care was assessed in different regions of the country. RESULTS A total of 40 variables were selected, from which 27 single indicators, five composite indicators, and the index of completeness of care were built. Composite indicators were constructed by grouping simple indicators and included the following variables: hospital size, level of complexity, delivery care practice, recommended hospital practice, and epidemiological practice. The index of completeness of care grouped the five variables and classified them in ascending order, thereby yielding five levels of completeness of maternal and neonatal hospital care: very low, low, intermediate, high, and very high. The hospital network was predominantly of small size and low complexity, with inadequate child delivery care and poor development of recommended and epidemiological practices. The index showed that more than 80.0% of hospitals had a low index of completeness of care and that the most qualified health care services were concentrated in the more developed regions of the country. 
CONCLUSIONS The index of completeness proved to be of great value for monitoring the maternal and neonatal hospital care of Brazilian Unified Health System and indicated that the quality of health care was unsatisfactory. However, its application does not replace specific evaluations. PMID:25210827
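The grouping of composite indicators into an ordinal five-level index can be sketched as follows. The composite scores, equal weighting, and quintile cut-points here are assumptions for illustration; they are not the published scoring rules, which aggregate 27 simple indicators into the five composites.

```python
import numpy as np

# Hypothetical composite scores (0-1 scale) per hospital for the five
# composite indicators: size, complexity, delivery care, recommended
# practice, epidemiological practice.
rng = np.random.default_rng(2)
composites = rng.random((100, 5))   # 100 hospitals x 5 composite indicators

# Index of completeness: here a simple unweighted mean of the composites.
index = composites.mean(axis=1)

# Classify hospitals into five ordered levels using quintile cut-points.
labels = np.array(["very low", "low", "intermediate", "high", "very high"])
cuts = np.quantile(index, [0.2, 0.4, 0.6, 0.8])
level = labels[np.searchsorted(cuts, index)]
```

With real data, the weighting of composites and the cut-points would come from the study's methodology rather than from quintiles of the sample.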
The free-energy principle: a unified brain theory?
Friston, Karl
2010-02-01
A free-energy principle has been proposed recently that accounts for action, perception and learning. This Review looks at some key brain theories in the biological (for example, neural Darwinism) and physical (for example, information theory and optimal control theory) sciences from the free-energy perspective. Crucially, one key theme runs through each of these theories - optimization. Furthermore, if we look closely at what is optimized, the same quantity keeps emerging, namely value (expected reward, expected utility) or its complement, surprise (prediction error, expected cost). This is the quantity that is optimized under the free-energy principle, which suggests that several global brain theories might be unified within a free-energy framework.
Hilltop supernatural inflation and SUSY unified models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohri, Kazunori; Lim, C.S.; Lin, Chia-Min
2014-01-01
In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including a right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is n_s = 0.96, which fits very well with recent data from the PLANCK satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.
Customer-experienced rapid prototyping
NASA Astrophysics Data System (ADS)
Zhang, Lijuan; Zhang, Fu; Li, Anbo
2008-12-01
In order to describe the complete GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, proposes the idea of Customer-Experienced Rapid Prototyping (CE-RP), and describes its process and framework in detail from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles: customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models between customer and software developer.
Unified reduction principle for the evolution of mutation, migration, and recombination
Altenberg, Lee; Liberman, Uri; Feldman, Marcus W.
2017-01-01
Modifier-gene models for the evolution of genetic information transmission between generations of organisms exhibit the reduction principle: Selection favors reduction in the rate of variation production in populations near equilibrium under a balance of constant viability selection and variation production. Whereas this outcome has been proven for a variety of genetic models, it has not been proven in general for multiallelic genetic models of mutation, migration, and recombination modification with arbitrary linkage between the modifier and major genes under viability selection. We show that the reduction principle holds for all of these cases by developing a unifying mathematical framework that characterizes all of these evolutionary models. PMID:28265103
A Markovian state-space framework for integrating flexibility into space system design decisions
NASA Astrophysics Data System (ADS)
Lafleur, Jarret M.
The past decades have seen the state of the art in aerospace system design progress from a scope of simple optimization to one including robustness, with the objective of permitting a single system to perform well even in off-nominal future environments. Integrating flexibility, or the capability to easily modify a system after it has been fielded in response to changing environments, into system design represents a further step forward. One challenge in accomplishing this is that the decision-maker must consider not only the present system design decision, but also sequential future design and operation decisions. Despite extensive interest in the topic, the state of the art in designing flexibility into aerospace systems, and particularly space systems, tends to be limited to analyses that are qualitative, deterministic, single-objective, and/or restricted to a single future time period. To address these gaps, this thesis develops a stochastic, multi-objective, and multi-period framework for integrating flexibility into space system design decisions. Central to the framework are five steps. First, system configuration options are identified and costs of switching from one configuration to another are compiled into a cost transition matrix. Second, probabilities that demand on the system will transition from one mission to another are compiled into a mission demand Markov chain. Third, one performance matrix for each design objective is populated to describe how well the identified system configurations perform in each of the identified mission demand environments. The fourth step employs multi-period decision analysis techniques, including Markov decision processes from the field of operations research, to find efficient paths and policies a decision-maker may follow. The final step examines the implications of these paths and policies for the primary goal of informing initial system selection. 
Overall, this thesis unifies state-centric concepts of flexibility from economics and engineering literature with sequential decision-making techniques from operations research. The end objective of this thesis’ framework and its supporting tools is to enable selection of the next-generation space systems today, tailored to decision-maker budget and performance preferences, that will be best able to adapt and perform in a future of changing environments and requirements. Following extensive theoretical development, the framework and its steps are applied to space system planning problems of (1) DARPA-motivated multiple- or distributed-payload satellite selection and (2) NASA human space exploration architecture selection.
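The five steps above can be sketched end to end with a small value-iteration example over a joint (configuration, mission-demand) state space. The matrices, discount factor, and dimensions below are illustrative assumptions, not values from the thesis, and only a single performance objective is shown.

```python
import numpy as np

# Step 1: cost transition matrix C[i, j] = cost to switch config i -> j.
C = np.array([[0., 4., 9.],
              [4., 0., 5.],
              [9., 5., 0.]])
# Step 2: mission demand Markov chain P[m, m'] = P(next demand m' | demand m).
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
# Step 3: performance matrix R[i, m] = performance of config i under demand m.
R = np.array([[10.,  2.],
              [ 6.,  6.],
              [ 2., 12.]])

gamma = 0.9                    # discount factor for future periods
nc, nm = R.shape
V = np.zeros((nc, nm))         # V[i, m]: value of holding config i under demand m

# Step 4: value iteration on the Markov decision process.
for _ in range(500):
    EV = P @ V.T               # EV[m, j] = expected next-period value of config j
    # Q[i, m, j]: switch from i to j now, earn R[j, m], then demand evolves by P.
    Q = -C[:, None, :] + R.T[None, :, :] + gamma * EV[None, :, :]
    V = Q.max(axis=2)

# Step 5: the resulting policy informs which configuration to select/switch to.
policy = Q.argmax(axis=2)      # policy[i, m]: best next config from (i, m)
```

With several objectives, one such performance matrix per objective would be used and the efficient (Pareto) set of policies examined rather than a single argmax.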
Baines, Darrin L
2018-05-04
This paper proposes a new conceptual framework for jointly analysing the production of staff and patient welfare in health systems. Research to date has identified a direct link between staff and patient well-being. However, until now, no one has produced a unified framework for analysing them concurrently. In response, this paper introduces the "Frontier Framework". The new conceptual framework is applicable to all health systems regardless of their structure or financing. To demonstrate the benefits of its use, an empirical example of the Frontier Framework is constructed using data from the UK's National Health Service. This paper also introduces eight "Frontier Archetypes", which represent common patterns of welfare generation observable in health organisations involved in programmes of change. These archetypes may be used in planning, monitoring or creating narratives about organisational journeys. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; and iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share online datasets and models. 
The downloadable web application allows researchers to setup an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing or in a fully independent way, according to the specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups, end user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.
Impact of environmental colored noise in single-species population dynamics
NASA Astrophysics Data System (ADS)
Spanio, Tommaso; Hidalgo, Jorge; Muñoz, Miguel A.
2017-10-01
Variability in external conditions has important consequences for the dynamics and the organization of biological systems. In many cases, the characteristic timescale of environmental changes, as well as their correlations, plays a fundamental role in the way living systems adapt and respond to them. A proper mathematical approach to understanding population dynamics thus requires approaches more refined than, e.g., simple white-noise approximations. To shed further light onto this problem, in this paper we propose a unifying framework based on different analytical and numerical tools available to deal with "colored" environmental noise. In particular, we employ a "unified colored noise approximation" to map the original problem into an effective one with white noise, and then we apply a standard path integral approach to gain analytical understanding. For the sake of specificity, we present our approach using as a guideline a variation of the contact process, which can also be seen as a birth-death process of the Malthus-Verhulst class, where the propagation or birth rate varies stochastically in time. Our approach allows us to tackle in a systematic manner some of the relevant questions concerning population dynamics under environmental variability, such as determining the stationary population density, establishing the conditions under which a population may become extinct, and estimating extinction times. We focus on the emerging phase diagram and its possible phase transitions, highlighting how these are affected by the presence of environmental noise time correlations.
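A minimal simulation of the kind of model described, logistic (Malthus-Verhulst) population dynamics whose birth rate fluctuates as Ornstein-Uhlenbeck ("colored") noise with correlation time tau, can be sketched as follows. All parameters are illustrative, and the paper's analytical unified-colored-noise treatment is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T = 0.01, 200.0
n = int(T / dt)
b0, d0, tau, sigma = 1.5, 1.0, 1.0, 0.3  # mean birth rate, death rate, noise timescale/strength

rho = np.empty(n)   # population density
eta = np.empty(n)   # colored fluctuation of the birth rate: b(t) = b0 + eta(t)
rho[0], eta[0] = 0.5, 0.0
for t in range(n - 1):
    # Ornstein-Uhlenbeck update: eta relaxes on timescale tau, driven by white noise.
    eta[t + 1] = eta[t] - (eta[t] / tau) * dt + (sigma / tau) * np.sqrt(dt) * rng.normal()
    # Logistic (Malthus-Verhulst) dynamics with the stochastic birth rate.
    drho = ((b0 + eta[t]) * rho[t] * (1.0 - rho[t]) - d0 * rho[t]) * dt
    rho[t + 1] = max(rho[t] + drho, 0.0)

# Crude stationary-density estimate from the second half of the run; the
# noise-free fixed point would be rho* = 1 - d0/b0 = 1/3.
mean_density = rho[n // 2:].mean()
```

Varying `tau` and `sigma` in such a simulation shows how noise correlations shift the stationary density and bring the system closer to, or further from, extinction.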
A unified account of perceptual layering and surface appearance in terms of gamut relativity.
Vladusich, Tony; McDonnell, Mark D
2014-01-01
When we look at the world--or a graphical depiction of the world--we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance--based on a broader theoretical framework called gamut relativity--that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications.
Townsend, James T; Eidels, Ami
2011-08-01
Increasing the number of available sources of information may impair or facilitate performance, depending on the capacity of the processing system. Tests performed on response time distributions are proving to be useful tools in determining the workload capacity (as well as other properties) of cognitive systems. In this article, we develop a framework and relevant mathematical formulae that represent different capacity assays (Miller's race model bound, Grice's bound, and Townsend's capacity coefficient) in the same space. The new space allows a direct comparison between the distinct bounds and the capacity coefficient values and helps explicate the relationships among the different measures. An analogous common space is proposed for the AND paradigm, relating the capacity index to the Colonius-Vorberg bounds. We illustrate the effectiveness of the unified spaces by presenting data from two simulated models (standard parallel, coactive) and a prototypical visual detection experiment. A conversion table for the unified spaces is provided.
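The three capacity assays named above can be computed directly from empirical response-time CDFs: Miller's race model bound F_A(t) + F_B(t) is an upper bound on the redundant-target CDF, Grice's bound max(F_A(t), F_B(t)) is a lower bound, and Townsend's OR-paradigm capacity coefficient compares integrated hazards, C(t) = H_AB(t) / (H_A(t) + H_B(t)), with C(t) = 1 indicating unlimited capacity. The simulated RTs below assume an independent exponential race, which is the benchmark unlimited-capacity case, not data from the article.

```python
import numpy as np

def ecdf(rt, t):
    # Empirical CDF of response times rt evaluated at the time points t.
    return np.searchsorted(np.sort(rt), t, side="right") / len(rt)

# Simulated single-target RTs (A alone, B alone) and redundant-target RTs (AB);
# in this independent race, the redundant RT is the minimum of two channels.
rng = np.random.default_rng(4)
a = rng.exponential(300, 5000) + 200
b = rng.exponential(300, 5000) + 200
ab = np.minimum(rng.exponential(300, 5000) + 200,
                rng.exponential(300, 5000) + 200)

t = np.linspace(250, 1500, 100)
Fa, Fb, Fab = ecdf(a, t), ecdf(b, t), ecdf(ab, t)

miller = np.minimum(Fa + Fb, 1.0)   # race model (upper) bound on F_AB
grice = np.maximum(Fa, Fb)          # Grice (lower) bound on F_AB

# Capacity coefficient from integrated hazards H(t) = -log S(t).
eps = 1e-12
Ha = -np.log(np.clip(1 - Fa, eps, 1.0))
Hb = -np.log(np.clip(1 - Fb, eps, 1.0))
Hab = -np.log(np.clip(1 - Fab, eps, 1.0))
C = Hab / (Ha + Hb)                 # ~1 for an unlimited-capacity parallel race
```

Plotting `miller`, `grice`, `Fab`, and `C` against `t` places the distinct bounds and the capacity coefficient in one common space, which is the kind of comparison the unified space is designed to support.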
UUI: Reusable Spatial Data Services in Unified User Interface at NASA GES DISC
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Hegde, Mahabaleshwa; Bryant, Keith; Pham, Long B.
2016-01-01
Unified User Interface (UUI) is a next-generation operational data access tool that has been developed at the Goddard Earth Sciences Data and Information Services Center (GES DISC) to provide a simple, unified, and intuitive one-stop shop experience for the key data services available at GES DISC, including subsetting (Simple Subset Wizard, SSW), granule file search (Mirador), plotting (Giovanni), and other legacy spatial data services. UUI has been built on a flexible infrastructure of reusable web services: self-contained building blocks that can easily be plugged into spatial applications, including third-party clients or services, to enable new functionality as new datasets and services become available. In this presentation, we will discuss our experience in designing UUI services based on open industry standards. We will also explain how the resulting framework can be used for rapid development, deployment, and integration of spatial data services, facilitating efficient access and dissemination of spatial data sets.
Microphysics in Multi-scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA Unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactions are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance for the multi-scale modeling system will be presented.
A unified data representation theory for network visualization, ordering and coarse-graining
Kovács, István A.; Mizsei, Réka; Csermely, Péter
2015-01-01
Representation of large data sets became a key question of many scientific disciplines in the last decade. Several approaches for network visualization, data ordering and coarse-graining accomplished this goal. However, there was no underlying theoretical framework linking these problems. Here we show an elegant, information theoretic data representation approach as a unified solution of network visualization, data ordering and coarse-graining. The optimal representation is the hardest to distinguish from the original data matrix, measured by the relative entropy. The representation of network nodes as probability distributions provides an efficient visualization method and, in one dimension, an ordering of network nodes and edges. Coarse-grained representations of the input network enable both efficient data compression and hierarchical visualization to achieve high quality representations of larger data sets. Our unified data representation theory will help the analysis of extensive data sets, by revealing the large-scale structure of complex networks in a comprehensible form. PMID:26348923
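The paper's selection principle — the best representation is the one hardest to distinguish from the original data matrix under relative entropy — can be illustrated numerically. A toy sketch, with simple block-averaging standing in for the paper's coarse-graining procedure (the function names are invented):

```python
import numpy as np

def rel_entropy(P, Q, eps=1e-12):
    """Relative entropy (Kullback-Leibler divergence) between two
    non-negative matrices, each normalized to a probability distribution."""
    P = P / P.sum()
    Q = Q / Q.sum()
    return float(np.sum(P * np.log((P + eps) / (Q + eps))))

def coarse_grain(A, k):
    """Block-average an (n x n) matrix into k x k blocks, then expand the
    blocks back to n x n so it can be compared with the original."""
    n = A.shape[0]
    s = n // k
    B = A[:k * s, :k * s].reshape(k, s, k, s).mean(axis=(1, 3))
    return np.kron(B, np.ones((s, s)))

rng = np.random.default_rng(1)
A = rng.random((8, 8))
coarse = coarse_grain(A, 4)   # lossy 4 x 4 block representation
finer = coarse_grain(A, 8)    # k = n: leaves the matrix unchanged
print(rel_entropy(A, finer) <= rel_entropy(A, coarse))  # True
```

The finer representation has lower relative entropy to the original, i.e. it is harder to distinguish from it, matching the paper's optimality criterion.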
Modelling Participatory Geographic Information System for Customary Land Conflict Resolution
NASA Astrophysics Data System (ADS)
Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.
2017-11-01
Since land contributes about 73 % of most countries' Gross Domestic Product (GDP), attention to land rights has increased tremendously worldwide. Conflicts over land have therefore become one of the major problems associated with land administration. However, conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants, owing to various factors. This study sought to develop a framework for using a Participatory Geographic Information System (PGIS) for customary land conflict resolution. The framework was modelled using the Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units: a Social Unit (SU), a Technical Unit (TU) and a Decision Making Unit (DMU). The name butterfly model was adopted based on the framework's features and properties. The framework is therefore recommended for adoption in customary land conflict resolution.
Xue, Alexander T; Hickerson, Michael J
2017-11-01
Population genetic data from multiple taxa can address comparative phylogeographic questions about community-scale response to environmental shifts, and a useful strategy to this end is to employ hierarchical co-demographic models that directly test multi-taxa hypotheses within a single, unified analysis. This approach has been applied to classical phylogeographic data sets such as mitochondrial barcodes as well as reduced-genome polymorphism data sets that can yield 10,000s of SNPs, produced by emergent technologies such as RAD-seq and GBS. A strategy for the latter has been accomplished by adapting the site frequency spectrum to a novel summarization of population genomic data across multiple taxa called the aggregate site frequency spectrum (aSFS), which potentially can be deployed under various inferential frameworks including approximate Bayesian computation, random forest and composite likelihood optimization. Here, we introduce the R package multi-dice, a wrapper program that exploits existing simulation software for flexible execution of hierarchical model-based inference using the aSFS, which is derived from reduced genome data, as well as mitochondrial data. We validate several novel software features such as applying alternative inferential frameworks, enforcing a minimal threshold of time surrounding co-demographic pulses and specifying flexible hyperprior distributions. In sum, multi-dice provides comparative analysis within the familiar R environment while allowing a high degree of user customization, and will thus serve as a tool for comparative phylogeography and population genomics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
Anssari-Benam, Afshin; Bucchi, Andrea; Bader, Dan L
2015-09-18
Discrete element models have often been the primary tool in investigating and characterising the viscoelastic behaviour of soft tissues. However, studies have employed varied configurations of these models, based on the choice of the number of elements and the utilised formation, for different subject tissues. This approach has yielded a diverse array of viscoelastic models in the literature, each seemingly resulting in different descriptions of viscoelastic constitutive behaviour and/or stress-relaxation and creep functions. Moreover, most studies do not apply a single discrete element model to characterise both stress-relaxation and creep behaviours of tissues. The underlying assumption for this disparity is the implicit perception that the viscoelasticity of soft tissues cannot be described by a universal behaviour or law, resulting in the lack of a unified approach in the literature based on discrete element representations. This paper derives the constitutive equation for different viscoelastic models applicable to soft tissues with two characteristic times. It demonstrates that all possible configurations exhibit a unified and universal behaviour, captured by a single constitutive relationship between stress, strain and time as: σ + Aσ̇ + Bσ̈ = Pε̇ + Qε̈. The ensuing stress-relaxation G(t) and creep J(t) functions are also unified and universal, derived as [Formula: see text] and J(t) = c₂ + (ε₀ − c₂)e^(−(P/Q)t) + (σ₀/P)t, respectively. Application of these relationships to experimental data is illustrated for various tissues including the aortic valve, ligament and cerebral artery. The unified model presented in this paper may be applied to all tissues with two characteristic times, obviating the need for employing varied configurations of discrete element models in preliminary investigation of the viscoelastic behaviour of soft tissues. Copyright © 2015 Elsevier Ltd. All rights reserved.
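The creep branch of this unified law is easy to check numerically: under a constant applied stress σ₀ the stress derivatives vanish and the law reduces to Pε̇ + Qε̈ = σ₀, whose strain rate relaxes to σ₀/P. A sketch with illustrative, non-tissue parameter values:

```python
import numpy as np

# Creep under a constant applied stress sigma0: sigma_dot = sigma_ddot = 0,
# so the unified law  sigma + A*sigma_dot + B*sigma_ddot = P*eps_dot + Q*eps_ddot
# reduces to  P*eps_dot + Q*eps_ddot = sigma0.
P, Q, sigma0 = 2.0, 0.5, 1.0        # illustrative values, not tissue-specific
dt, T = 1e-3, 10.0

eps, eps_dot = 0.0, 0.0             # start unstrained and at rest
strain = []
for _ in range(int(T / dt)):
    eps_ddot = (sigma0 - P * eps_dot) / Q    # reduced creep equation
    eps_dot += dt * eps_ddot                 # explicit Euler step
    eps += dt * eps_dot
    strain.append(eps)

# After the exponential transient (time constant Q/P), strain grows linearly
# with slope sigma0/P, matching the +(sigma0/P)t term of the creep function.
late_slope = (strain[-1] - strain[-2]) / dt
print(round(late_slope, 3))  # → 0.5, i.e. sigma0 / P
```

The two regimes visible here — an exponential transient with time constant Q/P followed by linear flow at rate σ₀/P — are exactly the two terms of the creep function quoted in the abstract.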
ERIC Educational Resources Information Center
Dannhauser, Walter
1980-01-01
Described is an experiment designed to provide an experimental basis for a unifying point of view (utilizing theoretical framework and chemistry laboratory experiments) for physical chemistry students. Three experiments are described: phase equilibrium, chemical equilibrium, and a test of the third law of thermodynamics. (Author/DS)
Persuasive Writing, A Curriculum Design: K-12.
ERIC Educational Resources Information Center
Bennett, Susan G., Ed.
In the spirit of the Texas Hill Country Writing Project and in response to the requirements of the Texas Assessment of Basic Skills, this guide presents writing assignments reflecting a commitment to a unified writing program for kindergarten through grade twelve. The framework for the assignments is adopted from the discourse theory of James…
Practical Issues in Estimating Classification Accuracy and Consistency with R Package cacIRT
ERIC Educational Resources Information Center
Lathrop, Quinn N.
2015-01-01
There are two main lines of research in estimating classification accuracy (CA) and classification consistency (CC) under Item Response Theory (IRT). The R package cacIRT provides computer implementations of both approaches in an accessible and unified framework. Even with available implementations, there remain decisions a researcher faces when…
The Reliability of Setting Grade Boundaries Using Comparative Judgement
ERIC Educational Resources Information Center
Benton, Tom; Elliott, Gill
2016-01-01
In recent years the use of expert judgement to set and maintain examination standards has been increasingly criticised in favour of approaches based on statistical modelling. This paper reviews existing research on this controversy and attempts to unify the evidence within a framework where expertise is utilised in the form of comparative…
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Keywords: Optimization, Complex Networks, Social Network Analysis, Computational
Conceptualizing the Suicide-Alcohol Relationship.
ERIC Educational Resources Information Center
Rogers, James R.
Despite the strong empirical evidence linking alcohol use across varying levels to suicidal behavior, the field is lacking a unifying theoretical framework in this area. The concept of alcohol induced myopia to explain the varied effects of alcohol on the behaviors of individuals who drink has been proposed. The term "alcohol myopia" refers to its…
Diffusion of Innovations Theory: A Unifying Framework for HIV Peer Education
ERIC Educational Resources Information Center
Ramseyer Winter, Virginia
2013-01-01
Peer education programs are a popular approach to preventing HIV infection among adolescents. While the programs show promise for effectively preventing HIV among the peers who are provided education, little evaluation research has been conducted to determine if the peer educators themselves experience knowledge, attitude, and behavior changes. A…
Utilizing Emergency Departments as Learning Spaces through a Post-Occupancy Evaluation
ERIC Educational Resources Information Center
Guinther, Lindsey Lawry; Carll-White, Allison
2014-01-01
This case study describes the use of an emergency department as a learning space for interior design students. Kolb's (1984; 2005) framework identifies the characteristics of experiential learning and learning spaces, serving as the bridge to unify learning styles and the learning environment. A post-occupancy evaluation was conducted with…
Professionalization in Universities and European Convergence
ERIC Educational Resources Information Center
Vivas, Amparo Jimenez; Hevia, David Menendez Alvarez
2009-01-01
The constant assessment of the quality of higher education within the framework of European convergence is a challenge for all those universities that wish their degrees and diplomas to reflect a unified Europe. As is the case in any assessment, change and review process, the quest to improve quality implies measuring achievement of the objectives…
Reaching and Remediating "Grey-Area" Middle School Students
ERIC Educational Resources Information Center
Jorgenson, Olaf; Smolkovich, Greg E.
2004-01-01
This article presents a framework for school administrators, developed by the Mesa Unified School District, for identifying and assisting subtly struggling adolescents. Mesa's "safety net" approach targets middle grades students in the midst of their formative, pre-high school experience. Here, it is stated that the first step to identify a…
Evolution of Students' Ideas about Natural Selection through a Constructivist Framework
ERIC Educational Resources Information Center
Baumgartner, Erin; Duncan, Kanesa
2009-01-01
Educating students about the process of evolution through natural selection is vitally important because not only is it the unifying theory of biological science, it is also widely regarded as difficult for students to fully comprehend. Anderson and colleagues (2002) describe alternative ideas and misconceptions about natural selection as highly…
ERIC Educational Resources Information Center
DeBray, Elizabeth; Houck, Eric A.
2011-01-01
This article uses an institutional framework to analyze the political context of the next reauthorization of the Elementary and Secondary Education Act. The authors analyze three relevant factors in the institutional environment: the role of traditional party politics, including theories of divided versus unified party government; the entrance of…
Understanding Early Childhood Student Teachers' Acceptance and Use of Interactive Whiteboard
ERIC Educational Resources Information Center
Wong, Kung-Teck; Russo, Sharon; McDowall, Janet
2013-01-01
Purpose: The purpose of this paper is to understand early childhood student teachers' self-reported acceptance and use of interactive whiteboard (IWB), by employing the Unified Theory of Acceptance and Use of Technology (UTAUT) as the research framework. Design/methodology/approach: A total of 112 student teachers enrolled in science-related…
Factors Influencing Students' Adoption of E-Learning: A Structural Equation Modeling Approach
ERIC Educational Resources Information Center
Tarhini, Ali; Masa'deh, Ra'ed; Al-Busaidi, Kamla Ali; Mohammed, Ashraf Bany; Maqableh, Mahmoud
2017-01-01
Purpose: This research aims to examine the factors that may hinder or enable the adoption of e-learning systems by university students. Design/methodology/approach: A conceptual framework was developed through extending the unified theory of acceptance and use of technology (performance expectancy, effort expectancy, hedonic motivation, habit,…
The Long Way towards Abandoning ECEC Dichotomy in Greece
ERIC Educational Resources Information Center
Rentzou, Konstantina
2018-01-01
Although Greece has a dichotomous system both in terms of Early Childhood Education and Care (ECEC) services and in terms of ECEC workers' preparation programmes, in 2016 Greek government's Organization for ECEC organized an open colloquy about the adoption of a 'Unified National Framework for Early Childhood Education and Care', causing a heated…
"A Unified Poet Alliance": The Personal and Social Outcomes of Youth Spoken Word Poetry Programming
ERIC Educational Resources Information Center
Weinstein, Susan
2010-01-01
This article places youth spoken word (YSW) poetry programming within the larger framework of arts education. Drawing primarily on transcripts of interviews with teen poets and adult teaching artists and program administrators, the article identifies specific benefits that participants ascribe to youth spoken word, including the development of…
Countering the Pedagogy of Extremism: Reflective Narratives and Critiques of Problem-Based Learning
ERIC Educational Resources Information Center
Woo, Chris W. H.; Laxman, Kumar
2013-01-01
This paper is a critique against "purist" pedagogies found in the literature of student-centred learning. The article reproves extremism in education and questions the absolutism and teleological truths expounded in exclusive problem-based learning. The paper articulates the framework of a unifying pedagogical practice through Eve…
The Four Elementary Forms of Sociality: Framework for a Unified Theory of Social Relations.
ERIC Educational Resources Information Center
Fiske, Alan Page
1992-01-01
A theory is presented that postulates that people in all cultures use four relational models to generate most kinds of social interaction, evaluation, and affect. Ethnographic and field studies (n=19) have supported cultural variations on communal sharing; authority ranking; equality matching; and market pricing. (SLD)
A Unifying Framework for Teaching Nonparametric Statistical Tests
ERIC Educational Resources Information Center
Bargagliotti, Anna E.; Orrison, Michael E.
2014-01-01
Increased importance is being placed on statistics at both the K-12 and undergraduate level. Research divulging effective methods to teach specific statistical concepts is still widely sought after. In this paper, we focus on best practices for teaching topics in nonparametric statistics at the undergraduate level. To motivate the work, we…
The road against fatalities: infrastructure spending vs. regulation?
Albalate, Daniel; Fernández, Laura; Yarygina, Anastasiya
2013-10-01
The road safety literature is typified by a high degree of compartmentalization between studies that focus on infrastructure and traffic conditions and those devoted to the evaluation of public policies and regulations. As a result, few studies adopt a unified empirical framework in their attempts at evaluating the road safety performance of public interventions, thus limiting our understanding of successful strategies in this regard. This paper considers both types of determinants in an analysis of a European country that has enjoyed considerable success in reducing road fatalities. After constructing a panel data set with road safety outcomes for all Spanish provinces between 1990 and 2009, we evaluate the role of the technical characteristics of infrastructure and recent infrastructure spending together with the main regulatory changes introduced. Our results show the importance of considering both types of determinants in a unified framework. Moreover, we highlight the importance of maintenance spending given its effectiveness in reducing fatalities and casualties in the current economic context of austerity that is having such a marked impact on investment efforts in Spain. Copyright © 2013 Elsevier Ltd. All rights reserved.
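The "unified empirical framework" described here is, in essence, a panel regression in which infrastructure and regulatory determinants enter jointly, with province fixed effects absorbing unobserved heterogeneity. A minimal within-estimator sketch on simulated data (all variable names and coefficients are invented, not the paper's estimates):

```python
import numpy as np

# Within (fixed-effects) estimator for a province-year panel in which
# spending and regulation determinants enter one unified regression.
rng = np.random.default_rng(4)
n_prov, n_years = 50, 20
alpha = rng.normal(0.0, 1.0, n_prov)            # province fixed effects
spend = rng.random((n_prov, n_years))           # maintenance spending
reg = rng.integers(0, 2, (n_prov, n_years))     # regulatory-reform dummy
beta_spend, beta_reg = -0.8, -0.5               # assumed "true" effects
y = (alpha[:, None] + beta_spend * spend + beta_reg * reg
     + rng.normal(0.0, 0.1, (n_prov, n_years)))  # fatality outcome

def demean(x):
    """Subtract each province's time mean, sweeping out fixed effects."""
    return x - x.mean(axis=1, keepdims=True)

X = np.column_stack([demean(spend).ravel(), demean(reg).ravel()])
b, *_ = np.linalg.lstsq(X, demean(y).ravel(), rcond=None)
print(np.round(b, 1))  # close to the true (-0.8, -0.5)
```

Estimating both determinants in one regression, rather than in separate compartmentalized studies, is what lets their effects be compared on a common footing.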
NASA Astrophysics Data System (ADS)
de Albuquerque, Douglas F.; Fittipaldi, I. P.
1994-05-01
A unified effective-field renormalization-group framework (EFRG) for both quenched bond- and site-diluted Ising models is herein developed by extending recent works. The method, as in the previous works, follows the same strategy as the mean-field renormalization-group scheme (MFRG), and is achieved by introducing an alternative way of constructing classical effective-field equations of state, based on rigorous Ising spin identities. The concentration dependence of the critical temperature, Tc(p), and the critical concentrations of magnetic atoms, pc, at which the transition temperature goes to zero, are evaluated for several two- and three-dimensional lattice structures. The obtained values of Tc and pc and the resulting phase diagrams for both bond and site cases are much more accurate than those estimated by the standard MFRG approach. Although preserving the same level of simplicity as the MFRG, it is shown that the present EFRG method, even in its simplest size-cluster version, provides results that correctly distinguish those lattices that have the same coordination number but differ in dimensionality or geometry.
Buetow, S; Adair, V; Coster, G; Hight, M; Gribben, B; Mitchell, E
2002-01-01
BACKGROUND: Different sets of literature suggest how aspects of practice time management can limit access to general practitioner (GP) care. Researchers have not organised this knowledge into a unified framework that can enhance understanding of barriers to, and opportunities for, improved access. AIM: To suggest a framework conceptualising how differences in professional and cultural understanding of practice time management in Auckland, New Zealand, influence access to GP care for children with chronic asthma. DESIGN OF STUDY: A qualitative study involving selective sampling, semi-structured interviews on barriers to access, and a general inductive approach. SETTING: Twenty-nine key informants and ten mothers of children with chronic, moderate to severe asthma and poor access to GP care in Auckland. METHOD: Development of a framework from themes describing barriers associated with, and needs for, practice time management. The themes were independently identified by two authors from transcribed interviews and confirmed through informant checking. Themes from key informant and patient interviews were triangulated with each other and with published literature. RESULTS: The framework distinguishes 'practice-centred time' from 'patient-centred time.' A predominance of 'practice-centred time' and an unmet opportunity for 'patient-centred time' are suggested by the persistence of five barriers to accessing GP care: limited hours of opening; traditional appointment systems; practice intolerance of missed appointments; long waiting times in the practice; and inadequate consultation lengths. None of the barriers is specific to asthmatic children. CONCLUSION: A unified framework was suggested for understanding how the organisation of practice work time can influence access to GP care by groups including asthmatic children. PMID:12528583
How Users Search the Library from a Single Search Box
ERIC Educational Resources Information Center
Lown, Cory; Sierra, Tito; Boyer, Josh
2013-01-01
Academic libraries are turning increasingly to unified search solutions to simplify search and discovery of library resources. Unfortunately, very little research has been published on library user search behavior in single search box environments. This study examines how users search a large public university library using a prominent, single…
Models for evaluating the performability of degradable computing systems
NASA Technical Reports Server (NTRS)
Wu, L. T.
1982-01-01
Recent advances in multiprocessor technology have established the need for unified methods to evaluate computing system performance and reliability. In response to this modeling need, a general modeling framework that permits the modeling, analysis and evaluation of degradable computing systems is considered. Within this framework, several user-oriented performance variables are identified and shown to be proper generalizations of the traditional notions of system performance and reliability. Furthermore, a time-varying version of the model is developed to generalize the traditional fault tree reliability evaluation methods of phased missions.
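The performance variables described can be illustrated with a tiny degradable-system model: a continuous-time Markov chain whose states carry performance rewards, so that expected performance generalizes both throughput and reliability. The states, rates, and rewards below are assumptions for illustration, not the report's model:

```python
import numpy as np

# States: 0 = both processors up, 1 = one up (degraded), 2 = system failed.
# Generator of a small continuous-time Markov chain with per-processor
# failure rate lam and no repair; rates and rewards are assumed values.
lam = 0.1
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [0.0,      -lam,    lam],
              [0.0,       0.0,    0.0]])
reward = np.array([1.0, 0.5, 0.0])       # performance level in each state

def performance_at(t, dt=1e-3):
    """Expected performance E[reward] at time t, integrating dp/dt = p Q
    with small explicit Euler steps from the fully-operational state."""
    p = np.array([1.0, 0.0, 0.0])
    for _ in range(int(t / dt)):
        p = p + dt * (p @ Q)
    return float(p @ reward)

print(performance_at(0.0))  # 1.0: fully operational at t = 0
```

For this particular reward vector the expected performance works out analytically to e^(−lam·t), which provides a convenient check on the numerics; pure reliability is recovered by setting the degraded-state reward to 1, pure failure probability by rewards (1, 1, 0).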
Einstein-Yang-Mills-Dirac systems from the discretized Kaluza-Klein theory
NASA Astrophysics Data System (ADS)
Wali, Kameshwar; Viet, Nguyen Ali
2017-01-01
A unified theory of the non-Abelian gauge interactions with gravity in the framework of a discretized Kaluza-Klein theory is constructed with a modified Dirac operator and wedge product. All the couplings of chiral spinors to the non-Abelian gauge fields emerge naturally as components of the coupling of the chiral spinors in the generalized gravity together with some new interactions. In particular, the currently prevailing gravity-QCD quark and gravity-electroweak-quark and lepton models are shown to follow as special cases of the general framework.
Pinsard, Basile; Boutin, Arnaud; Doyon, Julien; Benali, Habib
2018-01-01
Functional MRI acquisition is sensitive to subjects' motion that cannot be fully constrained. Therefore, signal corrections have to be applied a posteriori in order to mitigate the complex interactions between changing tissue localization and magnetic fields, gradients and readouts. To circumvent the limitations of current preprocessing strategies, we developed an integrated method that corrects motion and spatial low-frequency intensity fluctuations at the level of each slice, in order to better fit the acquisition processes. The registration of single or multiple simultaneously acquired slices is achieved online by an Iterated Extended Kalman Filter, favoring the robust estimation of continuous motion, while an intensity bias field is non-parametrically fitted. The proposed extraction of gray-matter BOLD activity from the acquisition space to an anatomical group template space, taking into account distortions, better preserves fine-scale patterns of activity. Importantly, the proposed unified framework generalizes to high-resolution multi-slice techniques. When tested on simulated and real data, the latter shows a reduction in motion-explained variance and signal variability when compared to the conventional preprocessing approach. These improvements provide more stable patterns of activity, facilitating investigation of cerebral information representation in healthy and/or clinical populations where motion is known to impact fine-scale data. PMID:29755312
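The registration step above relies on Kalman filtering; the study uses an Iterated Extended Kalman Filter over rigid-body motion parameters. The following is only a minimal linear 1-D analogue, assumed for illustration, tracking a single translation parameter from noisy per-slice estimates:

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=1e-2):
    """Minimal linear 1-D Kalman filter: a constant-position state model
    with process-noise variance q and measurement-noise variance r."""
    x, p = 0.0, 1.0                  # state estimate and its variance
    estimates = []
    for z in measurements:
        p = p + q                    # predict
        gain = p / (p + r)           # Kalman gain
        x = x + gain * (z - x)       # update with slice-wise measurement z
        p = (1.0 - gain) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(2)
true_shift = 0.3                     # an assumed head translation, in mm
z = true_shift + rng.normal(0.0, 0.1, 200)   # noisy per-slice estimates
est = kalman_1d(z)
print(round(float(est[-1]), 2))      # close to 0.3
```

Filtering slice-wise estimates rather than volume-wise ones is what lets continuous motion be tracked at the acquisition's native temporal granularity; the full method additionally linearizes a rigid-body observation model, which the sketch omits.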
Unified framework to evaluate panmixia and migration direction among multiple sampling locations.
Beerli, Peter; Palczewski, Michal
2010-05-01
For many biological investigations, groups of individuals are genetically sampled from several geographic locations. These sampling locations often do not reflect the genetic population structure. We describe a framework using marginal likelihoods to compare and order structured population models, such as testing whether the sampling locations belong to the same randomly mating population or comparing unidirectional and multidirectional gene flow models. In the context of inferences employing Markov chain Monte Carlo methods, the accuracy of the marginal likelihoods depends heavily on the approximation method used to calculate the marginal likelihood. Two methods, modified thermodynamic integration and a stabilized harmonic mean estimator, are compared. With finite Markov chain Monte Carlo run lengths, the harmonic mean estimator may not be consistent. Thermodynamic integration, in contrast, delivers considerably better estimates of the marginal likelihood. The choice of prior distributions does not influence the order and choice of the better models when the marginal likelihood is estimated using thermodynamic integration, whereas with the harmonic mean estimator the influence of the prior is pronounced and the order of the models changes. The approximation of marginal likelihood using thermodynamic integration in MIGRATE allows the evaluation of complex population genetic models, not only of whether sampling locations belong to a single panmictic population, but also of competing complex structured population models.
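The contrast the authors draw between thermodynamic integration and the harmonic mean estimator can be reproduced on a toy model whose marginal likelihood is known exactly. A beta-binomial sketch (an assumed example, not MIGRATE's structured population models):

```python
import math
import numpy as np

# Toy model with a known answer: k successes in n Bernoulli trials with
# theta ~ Uniform(0,1); the exact log marginal likelihood is log B(k+1, n-k+1).
# Sample sizes and the temperature grid are illustrative choices.
n, k = 20, 14
exact = math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)

rng = np.random.default_rng(3)

def loglik(theta):
    theta = np.clip(theta, 1e-12, 1.0 - 1e-12)
    return k * np.log(theta) + (n - k) * np.log(1.0 - theta)

# Thermodynamic integration: log m = integral over beta in [0, 1] of
# E_beta[log L], where the power posterior at temperature beta is the
# conjugate Beta(beta*k + 1, beta*(n-k) + 1), so it can be sampled directly.
betas = np.linspace(0.0, 1.0, 21)
means = np.array([loglik(rng.beta(b * k + 1, b * (n - k) + 1, 50_000)).mean()
                  for b in betas])
ti = float(np.sum((means[1:] + means[:-1]) / 2 * np.diff(betas)))

# Harmonic mean estimator from ordinary posterior samples, stabilized in
# log space; its infinite variance is what makes it unreliable.
ll = loglik(rng.beta(k + 1, n - k + 1, 50_000))
lo = ll.min()
hm = float(lo - np.log(np.mean(np.exp(lo - ll))))

print(f"TI error {ti - exact:+.3f}, HM error {hm - exact:+.3f}")
```

Because the power posterior is conjugate here, each temperature can be sampled directly; in MIGRATE-style problems each E_β[log L] term would itself require an MCMC run. The harmonic mean average has infinite variance, which is why its estimates, and hence the resulting model orderings, drift with the prior and run length, exactly as the abstract reports.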
Programming chemistry in DNA-addressable bioreactors
Fellermann, Harold; Cardelli, Luca
2014-01-01
We present a formal calculus, termed the chemtainer calculus, able to capture the complexity of compartmentalized reaction systems such as populations of possibly nested vesicular compartments. Compartments contain molecular cargo as well as surface markers in the form of DNA single strands. These markers serve as compartment addresses and allow for their targeted transport and fusion, thereby enabling reactions of previously separated chemicals. The overall system organization allows for the set-up of programmable chemistry in microfluidic or other automated environments. We introduce a simple sequential programming language whose instructions are motivated by state-of-the-art microfluidic technology. Our approach integrates electronic control, chemical computing and material production in a unified formal framework that is able to mimic the integrated computational and constructive capabilities of the subcellular matrix. We provide a non-deterministic semantics of our programming language that enables us to analytically derive the computational and constructive power of our machinery. This semantics is used to derive the sets of all constructable chemicals and supermolecular structures that emerge from different underlying instruction sets. Because our proofs are constructive, they can be used to automatically infer control programs for the construction of target structures from a limited set of resource molecules. Finally, we present an example of our framework from the area of oligosaccharide synthesis. PMID:25121647
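The core mechanism, compartments whose DNA surface markers act as addresses licensing fusion and hence reaction, can be sketched as a toy data structure. The class, the tag alphabet, and the complementarity rule below are illustrative inventions, not the chemtainer calculus itself:

```python
from dataclasses import dataclass, field
from typing import Optional

# Toy sketch of addressable compartments: vesicles carry chemical cargo and
# DNA single-strand surface tags; compartments bearing complementary tags
# may fuse, pooling their contents so previously separated chemicals react.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(tag: str) -> str:
    """Reverse complement of a DNA single-strand tag."""
    return "".join(PAIR[base] for base in reversed(tag))

@dataclass
class Compartment:
    tags: set = field(default_factory=set)
    cargo: set = field(default_factory=set)

def fuse(x: Compartment, y: Compartment) -> Optional[Compartment]:
    """Fuse two compartments if any tag on x is complementary to one on y."""
    if any(complement(t) in y.tags for t in x.tags):
        return Compartment(x.tags | y.tags, x.cargo | y.cargo)
    return None

a = Compartment({"ACG"}, {"substrate"})
b = Compartment({"CGT"}, {"enzyme"})   # "CGT" is the reverse complement of "ACG"
merged = fuse(a, b)
print(sorted(merged.cargo))  # ['enzyme', 'substrate']
```

The formal calculus goes much further — nesting, targeted transport, and a non-deterministic semantics used in proofs — but the addressing-by-hybridization idea is the part this sketch captures.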
Heffernan, Eithne; Coulson, Neil S.; Henshaw, Helen; Barry, Johanna G.; Ferguson, Melanie A
2017-01-01
Objective: This study explored the psychosocial experiences of adults with hearing loss using the self-regulatory model as a theoretical framework. The primary components of the model, namely cognitive representations, emotional representations, and coping responses, were examined. Design: Individual semi-structured interviews were conducted. The data were analysed using an established thematic analysis procedure. Study sample: Twenty-five adults with mild-moderate hearing loss from the UK and nine hearing healthcare professionals from the UK, USA, and Canada were recruited via maximum variation sampling. Results: Cognitive representations: most participants described their hearing loss as having negative connotations and consequences, although they were not particularly concerned about the progression or controllability/curability of the condition. Opinions differed regarding the benefits of understanding the causes of one's hearing loss in detail. Emotional representations: negative emotions dominated, although some experienced positive emotions or muted emotions. Coping responses: engaged coping (e.g. hearing aids, communication tactics) and disengaged coping (e.g. withdrawal from situations, withdrawal within situations) both had perceived advantages and disadvantages. Conclusions: This novel application of the self-regulatory model demonstrates that it can be used to capture the key psychosocial experiences (i.e. perceptions, emotions, and coping responses) of adults with mild-moderate hearing loss within a single, unifying framework. PMID:26754550
Chivukula, V; Mousel, J; Lu, J; Vigmostad, S
2014-12-01
The current research presents a novel method in which blood particulates (biconcave red blood cells (RBCs) and spherical cells) are modeled using isogeometric analysis, specifically Non-Uniform Rational B-Splines (NURBS) in 3-D. The use of NURBS ensures that even with a coarse representation, the geometry of the blood particulates maintains an accurate description when subjected to large deformations. The fundamental advantage of this method is the coupling of the geometrical description and the stress analysis of the cell membrane into a single, unified framework. Details on the modeling approach, implementation of boundary conditions and the membrane mechanics analysis using isogeometric modeling are presented, along with validation cases for spherical and biconcave cells. Using NURBS-based isogeometric analysis, the behavior of individual cells in fluid flow is presented and analyzed in different flow regimes using as few as 176 elements for a spherical cell and 220 elements for a biconcave RBC. This work provides a framework for modeling a large number of 3-D deformable biological cells, each with its own geometric description and membrane properties. To the best knowledge of the authors, this is the first application of NURBS-based isogeometric analysis to model and simulate blood particulates in flow in 3D. Copyright © 2014 John Wiley & Sons, Ltd.
Inference in the Wild: A Framework for Human Situation Assessment and a Case Study of Air Combat.
McAnally, Ken; Davey, Catherine; White, Daniel; Stimson, Murray; Mascaro, Steven; Korb, Kevin
2018-06-24
Situation awareness is a key construct in human factors and arises from a process of situation assessment (SA). SA comprises the perception of information, its integration with existing knowledge, the search for new information, and the prediction of the future state of the world, including the consequences of planned actions. Causal models implemented as Bayesian networks (BNs) are attractive for modeling all of these processes within a single, unified framework. We elicited declarative knowledge from two Royal Australian Air Force (RAAF) fighter pilots about the information sources used in the identification (ID) of airborne entities and the causal relationships between these sources. This knowledge was represented in a BN (the declarative model) that was evaluated against the performance of 19 RAAF fighter pilots in a low-fidelity simulation. Pilot behavior was well predicted by a simple associative model (the behavioral model) with only three attributes of ID. Search for information by pilots was largely compensatory and was near-optimal with respect to the behavioral model. The average revision of beliefs in response to evidence was close to Bayesian, but there was substantial variability. Together, these results demonstrate the value of BNs for modeling human SA. Copyright © 2018 Cognitive Science Society, Inc.
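The near-Bayesian belief revision reported above can be illustrated with a small hypothetical example (the prior and cue likelihoods below are invented for illustration, not the elicited RAAF values): a prior belief that a contact is hostile is revised by successive evidence cues via Bayes' rule.

```python
def bayes_update(prior, p_cue_hostile, p_cue_friendly):
    """Posterior P(hostile | cue) from prior P(hostile) and the cue likelihoods."""
    num = prior * p_cue_hostile
    return num / (num + (1.0 - prior) * p_cue_friendly)

belief = 0.5  # uninformative prior over {hostile, friendly}
# Each cue: (P(cue | hostile), P(cue | friendly)) -- made-up numbers.
cues = [(0.8, 0.3), (0.6, 0.4), (0.9, 0.2)]
for ph, pf in cues:
    belief = bayes_update(belief, ph, pf)
print(round(belief, 3))  # → 0.947
```

Each cue multiplies the prior odds by its likelihood ratio; the observed departures from this ideal are the "substantial variability" the study measured.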
De Ridder, Dirk; Vanneste, Sven; Weisz, Nathan; Londero, Alain; Schlee, Winnie; Elgoyhen, Ana Belen; Langguth, Berthold
2014-07-01
Tinnitus is considered to be an auditory phantom phenomenon, a persistent conscious percept of a salient memory trace, externally attributed, in the absence of a sound source. It is perceived as a phenomenologically unified coherent percept, binding multiple separable clinical characteristics, such as its loudness, sidedness, type (pure tone or noise), associated distress, and so on. A theoretical pathophysiological framework capable of explaining all these aspects in one model is highly needed. The model must incorporate both the deafferentation-based neurophysiological models and the dysfunctional noise-canceling model, and propose a 'tinnitus core' subnetwork. The tinnitus core can be defined as the minimal set of brain areas that needs to be jointly activated (=subnetwork) for tinnitus to be consciously perceived, devoid of its affective components. The brain areas involved in the other separable characteristics of tinnitus can be retrieved by studies on spontaneous resting state magnetic and electrical activity in people with tinnitus, evaluated for the specific aspect investigated and controlled for other factors. By combining these functional imaging studies with neuromodulation techniques, some of the correlations are turned into causal relationships. From these causal relationships, a heuristic pathophysiological framework is constructed, integrating the tinnitus perceptual core with the other tinnitus-related aspects. This phenomenologically unified percept of tinnitus can be considered an emergent property of multiple, parallel, dynamically changing and partially overlapping subnetworks, each with a specific spontaneous oscillatory pattern and functional connectivity signature. Communication between these different subnetworks is proposed to occur at hubs, brain areas that are involved in multiple subnetworks simultaneously. These hubs can take part in each separable subnetwork at different frequencies.
Communication between the subnetworks is proposed to occur at discrete oscillatory frequencies. As such, the brain uses multiple nonspecific networks in parallel, each with their own oscillatory signature, that adapt to the context to construct a unified percept possibly by synchronized activation integrated at hubs at discrete oscillatory frequencies. Copyright © 2013 Elsevier Ltd. All rights reserved.
An OpenACC-Based Unified Programming Model for Multi-accelerator Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S
2015-01-01
This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.
NASA Astrophysics Data System (ADS)
Tallapragada, V.
2017-12-01
NOAA's Next Generation Global Prediction System (NGGPS) has provided the unique opportunity to develop and implement a non-hydrostatic global model based on the Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) dynamic core at the National Centers for Environmental Prediction (NCEP), a major advance in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on the Earth System Modeling Framework (ESMF). A more sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual unification of global and regional models will enable operational global models operating at convection-resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on the creation of a Common Community Physics Package (CCPP) and the Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established with specific roles/responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resource allocation, and prioritization.
This talk presents the current and future plans of unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications with special emphasis on implementation of NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.
A Semantic Grid Oriented to E-Tourism
NASA Astrophysics Data System (ADS)
Zhang, Xiao Ming
With the increasing complexity of tourism business models and tasks, there is a clear need for a next-generation e-Tourism infrastructure to support flexible automation, integration, computation, storage, and collaboration. Currently, several enabling technologies such as the semantic Web, Web services, agents, and grid computing have been applied in different e-Tourism applications; however, there is no unified framework able to integrate all of them. This paper therefore presents a promising e-Tourism framework based on the emerging semantic grid, in which a number of key design issues are discussed, including architecture, ontology structure, semantic reconciliation, service and resource discovery, role-based authorization, and intelligent agents. The paper finally provides the implementation of the framework.
Deontological coherence: A framework for commonsense moral reasoning.
Holyoak, Keith J; Powell, Derek
2016-11-01
We review a broad range of work, primarily in cognitive and social psychology, that provides insight into the processes of moral judgment. In particular, we consider research on pragmatic reasoning about regulations and on coherence in decision making, both areas in which psychological theories have been guided by work in legal philosophy. Armed with these essential prerequisites, we sketch a psychological framework for how ordinary people make judgments about moral issues. Based on a literature review, we show how the framework of deontological coherence unifies findings in moral psychology that have often been explained in terms of a grab-bag of heuristics and biases. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
A unifying framework for ghost-free Lorentz-invariant Lagrangian field theories
NASA Astrophysics Data System (ADS)
Li, Wenliang
2018-04-01
We propose a framework for Lorentz-invariant Lagrangian field theories where Ostrogradsky's scalar ghosts could be absent. A key ingredient is the generalized Kronecker delta. The general Lagrangians are reformulated in the language of differential forms. The absence of higher order equations of motion for the scalar modes stems from the basic fact that every exact form is closed. The well-established Lagrangian theories for spin-0, spin-1, p-form, spin-2 fields have natural formulations in this framework. We also propose novel building blocks for Lagrangian field theories. Some of them are novel nonlinear derivative terms for spin-2 fields. It is nontrivial that Ostrogradsky's scalar ghosts are absent in these fully nonlinear theories.
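For reference, the generalized Kronecker delta named above as the key ingredient has the standard textbook definition (not specific to this paper) as a determinant of ordinary Kronecker deltas:

```latex
\delta^{\mu_1 \cdots \mu_p}_{\nu_1 \cdots \nu_p}
  = \det \begin{pmatrix}
      \delta^{\mu_1}_{\nu_1} & \cdots & \delta^{\mu_1}_{\nu_p} \\
      \vdots & \ddots & \vdots \\
      \delta^{\mu_p}_{\nu_1} & \cdots & \delta^{\mu_p}_{\nu_p}
    \end{pmatrix}
```

It is totally antisymmetric in the upper indices (and in the lower indices), which is, roughly speaking, how contractions built from it antisymmetrize derivative terms and avoid higher-order equations of motion for the scalar modes.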
Zhai, Di-Hua; Xia, Yuanqing
2018-02-01
This paper addresses the adaptive control for task-space teleoperation systems with constrained predefined synchronization error, where a novel switched control framework is investigated. Based on multiple Lyapunov-Krasovskii functionals method, the stability of the resulting closed-loop system is established in the sense of state-independent input-to-output stability. Compared with previous work, the developed method can simultaneously handle the unknown kinematics/dynamics, asymmetric varying time delays, and prescribed performance control in a unified framework. It is shown that the developed controller can guarantee the prescribed transient-state and steady-state synchronization performances between the master and slave robots, which is demonstrated by the simulation study.
NASA Astrophysics Data System (ADS)
Post, Evert Jan
1999-05-01
This essay presents conclusive evidence of the impermissibility of Copenhagen's single system interpretation of the Schroedinger process. The latter needs to be viewed as a tool exclusively describing phase and orientation randomized ensembles and is not to be used for isolated single systems. Asymptotic closeness of single system and ensemble behavior and the rare nature of true single system manifestations have prevented a definitive identification of this Copenhagen deficiency over the past three-quarters of a century. Quantum uncertainty thus becomes a basic trademark of phase and orientation disordered ensembles. The ensuing void of usable single system tools opens a new inquiry for tools without statistical connotations. Three, in part already known, period integrals here identified as flux, charge and action counters emerge as diffeo-4 invariant tools fully compatible with the demands of the general theory of relativity. The discovery of the quantum Hall effect has been instrumental in forcing a distinction between ensemble disorder as in the normal Hall effect versus ensemble order in the plateau states. Since the order of the latter permits a view of the plateau states as a macro- or meso-scopic single system, the period integral description applies, yielding a straightforward unified description of integer and fractional quantum Hall effects.
Landscape-level effects on aboveground biomass of tropical forests: A conceptual framework.
Melito, Melina; Metzger, Jean Paul; de Oliveira, Alexandre A
2018-02-01
Despite the general recognition that fragmentation can reduce forest biomass through edge effects, a systematic review of the literature does not reveal a clear role of edges in modulating biomass loss. Additionally, the edge effects appear to be constrained by matrix type, suggesting that landscape composition has an influence on biomass stocks. The lack of empirical evidence of pervasive edge-related biomass losses across tropical forests highlights the necessity for a general framework linking landscape structure with aboveground biomass. Here, we propose a conceptual model in which landscape composition and configuration mediate the magnitude of edge effects and seed-flux among forest patches, which ultimately has an influence on biomass. Our model hypothesizes that a rapid reduction of biomass can occur below a threshold of forest cover loss. Just below this threshold, we predict that changes in landscape configuration can strongly influence the patch's isolation, thus enhancing biomass loss. Moreover, we expect a synergism between landscape composition and patch attributes, where matrix type mediates the effects of edges on species decline, particularly for shade-tolerant species. To test our conceptual framework, we propose a sampling protocol where the effects of edges, forest amount, forest isolation, fragment size, and matrix type on biomass stocks can be assessed both collectively and individually. The proposed model unifies the combined effects of landscape and patch structure on biomass into a single framework, providing a new set of main drivers of biomass loss in human-modified landscapes. We argue that carbon trading agendas (e.g., REDD+) and carbon-conservation initiatives must go beyond the effects of forest loss and edges on biomass, considering the whole set of effects on biomass related to changes in landscape composition and configuration. © 2017 John Wiley & Sons Ltd.
Understanding public perceptions of biotechnology through the "Integrative Worldview Framework".
De Witt, Annick; Osseweijer, Patricia; Pierce, Robin
2015-07-03
Biotechnological innovations prompt a range of societal responses that demand understanding. Research has shown such responses are shaped by individuals' cultural worldviews. We aim to demonstrate how the Integrative Worldview Framework (IWF) can be used for analyzing perceptions of biotechnology, by reviewing (1) research on public perceptions of biotechnology and (2) analyses of the stakeholder debate on the bio-based economy, using the IWF as an analytical lens. This framework operationalizes the concept of worldview and distinguishes between traditional, modern, and postmodern worldviews, among others. Applied to these literatures, this framework illuminates how these worldviews underlie major societal responses, thereby providing a unifying understanding of the literature on perceptions of biotechnology. We conclude the IWF has relevance for informing research on perceptions of socio-technical changes, generating insight into the paradigmatic gaps in social science, and facilitating reflexive and inclusive policy-making and debates on these timely issues. © The Author(s) 2015.
Keltner, Dacher; Kogan, Aleksandr; Piff, Paul K; Saturn, Sarina R
2014-01-01
The study of prosocial behavior--altruism, cooperation, trust, and the related moral emotions--has matured enough to produce general scholarly consensus that prosociality is widespread, intuitive, and rooted deeply within our biological makeup. Several evolutionary frameworks model the conditions under which prosocial behavior is evolutionarily viable, yet no unifying treatment exists of the psychological decision-making processes that result in prosociality. Here, we provide such a perspective in the form of the sociocultural appraisals, values, and emotions (SAVE) framework of prosociality. We review evidence for the components of our framework at four levels of analysis: intrapsychic, dyadic, group, and cultural. Within these levels, we consider how phenomena such as altruistic punishment, prosocial contagion, self-other similarity, and numerous others give rise to prosocial behavior. We then extend our reasoning to chart the biological underpinnings of prosociality and apply our framework to understand the role of social class in prosociality.
United We Stand: Emphasizing Commonalities Across Cognitive-Behavioral Therapies
Mennin, Douglas S.; Ellard, Kristen K.; Fresco, David M.; Gross, James J.
2016-01-01
Cognitive behavioral therapy (CBT) has a rich history of alleviating the suffering associated with mental disorders. Recently, there have been exciting new developments, including multi-component approaches, incorporated alternative therapies (e.g., meditation), targeted and cost-effective technologies, and integrated biological and behavioral frameworks. These field-wide changes have led some to emphasize the differences among variants of CBT. Here, we draw attention to commonalities across cognitive-behavioral therapies, including shared goals, change principles, and therapeutic processes. Specifically, we offer a framework for examining common CBT characteristics that emphasizes behavioral adaptation as a unifying goal and three core change principles, namely (1) context engagement to promote adaptive imagining and enacting of new experiences; (2) attention change to promote adaptive sustaining, shifting, and broadening of attention; and (3) cognitive change to promote adaptive perspective taking on events so as to alter verbal meanings. Further, we argue that specific intervention components including behavioral exposure/activation, attention training, acceptance/tolerance, decentering/defusion, and cognitive reframing may be emphasized to a greater or lesser degree by different treatment packages but are still fundamentally common therapeutic processes that are present across approaches and are best understood by their relationships to these core CBT change principles. We conclude by arguing for shared methodological and design frameworks for investigating unique and common characteristics to advance a unified and strong voice for CBT in a widening, increasingly multimodal and interdisciplinary, intervention science. PMID:23611074
Coupled dictionary learning for joint MR image restoration and segmentation
NASA Astrophysics Data System (ADS)
Yang, Xuesong; Fan, Yong
2018-03-01
To achieve better segmentation of MR images, image restoration is typically used as a preprocessing step, especially for low-quality MR images. Recent studies have demonstrated that dictionary learning methods could achieve promising performance for both image restoration and image segmentation. These methods typically learn paired dictionaries of image patches from different sources and use a common sparse representation to characterize paired image patches, such as low-quality image patches and their corresponding high quality counterparts for the image restoration, and image patches and their corresponding segmentation labels for the image segmentation. Since learning these dictionaries jointly in a unified framework may improve the image restoration and segmentation simultaneously, we propose a coupled dictionary learning method to concurrently learn dictionaries for joint image restoration and image segmentation based on sparse representations in a multi-atlas image segmentation framework. Particularly, three dictionaries, including a dictionary of low quality image patches, a dictionary of high quality image patches, and a dictionary of segmentation label patches, are learned in a unified framework so that the learned dictionaries of image restoration and segmentation can benefit each other. Our method has been evaluated for segmenting the hippocampus in MR T1 images collected with scanners of different magnetic field strengths. The experimental results have demonstrated that our method achieved better image restoration and segmentation performance than state of the art dictionary learning and sparse representation based image restoration and image segmentation methods.
The role of the parahippocampal cortex in cognition
Aminoff, Elissa M.; Kveraga, Kestutis; Bar, Moshe
2013-01-01
The parahippocampal cortex (PHC) has been associated with many cognitive processes, including visuospatial processing and episodic memory. To characterize the role of PHC in cognition a framework is required that unifies these disparate processes. An overarching account was proposed, whereby the PHC is part of a network of brain regions that processes contextual associations. Contextual associations are the principal element underlying many higher-level cognitive processes, and thus are suitable for unifying the PHC literature. Recent findings are reviewed that provide support for the contextual associations account of PHC function. In addition to reconciling a vast breadth of literature, the synthesis presented expands the implications of the proposed account and gives rise to new and general questions about context and cognition. PMID:23850264
Dai, Jiayu; Hou, Yong; Yuan, Jianmin
2010-06-18
Electron-ion interactions are central to numerous phenomena in the warm dense matter (WDM) regime and at higher temperature. The friction induced by electron-ion collisions at high temperature is introduced into ab initio molecular dynamics using the Langevin equation based on density functional theory. In this framework, as a test for Fe and H up to 1000 eV, the equation of state and the evolution of the electronic structure of the materials can be described over a very wide range of density and temperature, covering the full span from WDM up to high-energy-density physics. A unified first-principles description from condensed matter to ideal ionized gas plasma is constructed.
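The role of the friction term can be illustrated with a schematic one-dimensional Langevin integrator (a toy sketch with placeholder parameters; in the paper the friction coefficient comes from ab initio electron-ion collision rates, not the constant used here):

```python
import math
import random

def langevin_step(x, v, force, gamma, kT, m, dt, rng):
    """One Euler-Maruyama step of m dv = (F - gamma v) dt + sqrt(2 gamma kT) dW."""
    sigma = math.sqrt(2.0 * gamma * kT * dt) / m
    v = v + (force(x) - gamma * v) / m * dt + sigma * rng.gauss(0.0, 1.0)
    x = x + v * dt
    return x, v

# With the thermal noise switched off (kT = 0) and no external force, friction
# alone damps the motion: v decays geometrically as (1 - gamma*dt/m)**n.
rng = random.Random(0)
x, v = 0.0, 1.0
for _ in range(100):
    x, v = langevin_step(x, v, lambda q: 0.0, gamma=0.5, kT=0.0, m=1.0, dt=0.01, rng=rng)
print(abs(v - 0.995**100) < 1e-12)  # → True
```

With kT > 0 the same noise term restores the kinetic energy that friction removes, so the dynamics sample a thermal ensemble at the chosen temperature.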
NASA Astrophysics Data System (ADS)
Till, Christy B.; Pritchard, Matthew; Miller, Craig A.; Brugman, Karalee K.; Ryan-Davis, Juliet
2018-04-01
Multi-disciplinary analyses of Earth's most destructive volcanic systems show that continuous monitoring and an understanding of each volcano's quirks, rather than a single unified model, are key to generating accurate hazard assessments.
Model-Unified Planning and Execution for Distributed Autonomous System Control
NASA Technical Reports Server (NTRS)
Aschwanden, Pascal; Baskaran, Vijay; Bernardini, Sara; Fry, Chuck; Moreno, Maria; Muscettola, Nicola; Plaunt, Chris; Rijsman, David; Tompkins, Paul
2006-01-01
The Intelligent Distributed Execution Architecture (IDEA) is a real-time architecture that exploits artificial intelligence planning as the core reasoning engine for interacting autonomous agents. Rather than enforcing separate deliberation and execution layers, IDEA unifies them under a single planning technology. Deliberative and reactive planners reason about and act according to a single representation of the past, present and future domain state. The domain state obeys the rules dictated by a declarative model of the subsystem to be controlled, internal processes of the IDEA controller, and interactions with other agents. We present IDEA concepts - modeling, the IDEA core architecture, the unification of deliberation and reaction under planning - and illustrate its use in a simple example. Finally, we present several real-world applications of IDEA, and compare IDEA to other high-level control approaches.
Lu, Songjian; Jin, Bo; Cowart, L Ashley; Lu, Xinghua
2013-01-01
Genetic and pharmacological perturbation experiments, such as deleting a gene and monitoring gene expression responses, are powerful tools for studying cellular signal transduction pathways. However, it remains a challenge to automatically derive knowledge of a cellular signaling system at a conceptual level from systematic perturbation-response data. In this study, we explored a framework that unifies knowledge mining and data mining towards the goal. The framework consists of the following automated processes: 1) applying an ontology-driven knowledge mining approach to identify functional modules among the genes responding to a perturbation in order to reveal potential signals affected by the perturbation; 2) applying a graph-based data mining approach to search for perturbations that affect a common signal; and 3) revealing the architecture of a signaling system by organizing signaling units into a hierarchy based on their relationships. Applying this framework to a compendium of yeast perturbation-response data, we have successfully recovered many well-known signal transduction pathways; in addition, our analysis has led to many new hypotheses regarding the yeast signal transduction system; finally, our analysis automatically organized perturbed genes as a graph reflecting the architecture of the yeast signaling system. Importantly, this framework transformed molecular findings from a gene level to a conceptual level, which can be readily translated into computable knowledge in the form of rules regarding the yeast signaling system, such as "if genes involved in the MAPK signaling are perturbed, genes involved in pheromone responses will be differentially expressed."
A unified framework for physical print quality
NASA Astrophysics Data System (ADS)
Eid, Ahmed; Cooper, Brian; Rippetoe, Ed
2007-01-01
In this paper we present a unified framework for physical print quality. This framework includes a design for a testbed, testing methodologies and quality measures of physical print characteristics. An automatic belt-fed flatbed scanning system is calibrated to acquire L* data for a wide range of flat field imagery. Testing methodologies based on wavelet pre-processing and spectral/statistical analysis are designed. We apply the proposed framework to three common printing artifacts: banding, jitter, and streaking. Since these artifacts are directional, wavelet-based approaches are used to extract one artifact at a time and filter out other artifacts. Banding is characterized as a medium-to-low frequency, vertical periodic variation down the page. The same definition is applied to the jitter artifact, except that the jitter signal is characterized as a high-frequency signal above the banding frequency range. However, streaking is characterized as a horizontal aperiodic variation in the high-to-medium frequency range. Wavelets at different levels are applied to the input images in different directions to extract each artifact within specified frequency bands. Following wavelet reconstruction, images are converted into 1-D signals describing the artifact of concern. Accurate spectral analysis using a DFT with a Blackman-Harris windowing technique is used to extract the power (strength) of periodic signals (banding and jitter). Since streaking is an aperiodic signal, a statistical measure is used to quantify the streaking strength. Experiments on 100 print samples scanned at 600 dpi from 10 different printers show high correlation (75% to 88%) between the ranking of these samples by the proposed methodologies and experts' visual ranking.
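The spectral-analysis step for a periodic artifact can be sketched as follows (a simplified stand-in: a synthetic 1-D luminance profile, and numpy's plain Blackman window instead of the Blackman-Harris window named above):

```python
import numpy as np

fs = 600.0             # samples per inch (600 dpi scan)
n = 2048
y = np.arange(n) / fs  # position down the page, in inches
profile = 0.4 * np.sin(2 * np.pi * 20.0 * y)                   # banding at 20 cycles/inch
profile += 0.05 * np.random.default_rng(0).standard_normal(n)  # scanner noise

window = np.blackman(n)  # stand-in for the Blackman-Harris window
spectrum = np.abs(np.fft.rfft(profile * window))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"banding peak near {peak:.1f} cycles/inch")
```

The magnitude of the spectrum at the detected peak is the kind of "banding strength" figure a quality metric would report; windowing suppresses the spectral leakage that would otherwise smear nearby bins.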
A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.
Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain
2017-04-01
This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
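The regime-specific VAR estimation at the heart of this framework can be shown with a toy example (a deliberately simplified sketch: the segment boundaries are assumed known here, whereas the paper infers them via clustering and Kalman smoothing/EM). We simulate two VAR(1) regimes and recover each regime's coefficient matrix by least squares:

```python
import numpy as np

rng = np.random.default_rng(1)
A1 = np.array([[0.5, 0.2], [0.0, 0.4]])   # "connectivity" of regime 1
A2 = np.array([[0.1, -0.3], [0.4, 0.2]])  # "connectivity" of regime 2

def simulate(A, x0, steps, noise=0.05):
    """Simulate x[t+1] = A x[t] + e, with e ~ N(0, noise^2 I)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(A @ xs[-1] + noise * rng.standard_normal(2))
    return np.array(xs)

def fit_var1(xs):
    """Least-squares estimate of A in x[t+1] = A x[t] + e."""
    X, Y = xs[:-1], xs[1:]
    return np.linalg.lstsq(X, Y, rcond=None)[0].T

seg1 = simulate(A1, np.array([1.0, -1.0]), 2000)
seg2 = simulate(A2, seg1[-1], 2000)
A1_hat, A2_hat = fit_var1(seg1), fit_var1(seg2)
print(np.max(np.abs(A1_hat - A1)) < 0.2, np.max(np.abs(A2_hat - A2)) < 0.2)  # → True True
```

In the full SVAR framework the regime labels are themselves latent, their transitions follow a Markov chain, and the off-diagonal entries of each estimated matrix are read as directed (effective) connectivity.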
Systemic risk in a unifying framework for cascading processes on networks
NASA Astrophysics Data System (ADS)
Lorenz, J.; Battiston, S.; Schweitzer, F.
2009-10-01
We introduce a general framework for models of cascade and contagion processes on networks, to identify their commonalities and differences. In particular, models of social and financial cascades, as well as the fiber bundle model, the voter model, and models of epidemic spreading are recovered as special cases. To unify their description, we define the net fragility of a node, which is the difference between its fragility and the threshold that determines its failure. Nodes fail if their net fragility grows above zero, and their failure increases the fragility of neighbouring nodes, thus possibly triggering a cascade. In this framework, we identify three classes depending on the way the fragility of a node is increased by the failure of a neighbour. At the microscopic level, we illustrate with specific examples how the failure spreading pattern varies with the node triggering the cascade, depending on its position in the network and its degree. At the macroscopic level, systemic risk is measured as the final fraction of failed nodes, X*, and for each of the three classes we derive a recursive equation to compute its value. The phase diagram of X* as a function of the initial conditions thus allows for a prediction of the systemic risk, as well as a comparison of the three model classes. We identify which model class leads to a first-order phase transition in systemic risk, i.e. situations where small changes in the initial conditions determine a global failure. Finally, we generalize our framework to encompass stochastic contagion models. This indicates the potential for further generalizations.
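The net-fragility mechanism can be sketched in a few lines (toy values, and a single additive load-transfer rule; the paper distinguishes three classes of such rules):

```python
def cascade(neighbours, fragility, threshold, seed):
    """Propagate failures: a node fails once fragility - threshold >= 0 (net fragility)."""
    failed = set()
    queue = [seed]
    fragility = dict(fragility)  # don't mutate the caller's dict
    while queue:
        node = queue.pop()
        if node in failed:
            continue
        failed.add(node)
        for nb in neighbours[node]:
            if nb not in failed:
                fragility[nb] += 1.0  # load shed onto each live neighbour
                if fragility[nb] - threshold[nb] >= 0:
                    queue.append(nb)
    return failed

# A 4-node chain; node 3's high threshold stops the cascade.
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
fragility = {0: 0.0, 1: 0.5, 2: 0.5, 3: 0.0}
threshold = {0: 1.0, 1: 1.0, 2: 1.0, 3: 2.0}
print(sorted(cascade(neighbours, fragility, threshold, seed=0)))  # → [0, 1, 2]
```

The final fraction of failed nodes (3/4 here) is the systemic-risk measure X* of the abstract; varying the initial fragilities and thresholds traces out its phase diagram.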
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Wei; Warrick, Erika R.; Neumark, Daniel M.
Using attosecond transient absorption, the dipole response of an argon atom in the vacuum ultraviolet (VUV) region is studied when an external electromagnetic field is present. An isolated attosecond VUV pulse populates Rydberg states lying 15 eV above the argon ground state. A synchronized few-cycle near-infrared (NIR) pulse modifies the oscillating dipoles of argon impulsively, leading to alterations in the VUV absorption spectra. As the NIR pulse is delayed with respect to the VUV pulse, multiple features in the absorption profile emerge simultaneously, including line broadening, sideband structure, sub-cycle fast modulations, and 5-10 fs slow modulations. These features indicate the coexistence of two general processes of the light-matter interaction: the energy shift of individual atomic levels and coherent population transfer between atomic eigenstates, revealing coherent superpositions. Finally, an intuitive formula is derived to treat both effects in a unifying framework, allowing one to identify and quantify the two processes in a single absorption spectrogram.
NASA Astrophysics Data System (ADS)
Zhao, Lei; Wang, Zengcai; Wang, Xiaojin; Qi, Yazhou; Liu, Qing; Zhang, Guoxin
2016-09-01
Human fatigue is an important cause of traffic accidents. To improve the safety of transportation, we propose, in this paper, a framework for fatigue expression recognition using image-based facial dynamic multi-information and a bimodal deep neural network. First, the landmarks of the face region and the texture of the eye region, which complement each other in fatigue expression recognition, are extracted from facial image sequences captured by a single camera. Then, two stacked autoencoder neural networks are trained for landmarks and texture, respectively. Finally, the two trained neural networks are combined by learning a joint layer on top of them to construct a bimodal deep neural network. The model can be used to extract a unified representation that fuses the landmark and texture modalities together and to classify fatigue expressions accurately. The proposed system is tested on a human fatigue dataset obtained from an actual driving environment. The experimental results demonstrate that the proposed method performs stably and robustly, and that the average accuracy reaches 96.2%.
Automatic Modulation Classification Based on Deep Learning for Unmanned Aerial Vehicles.
Zhang, Duona; Ding, Wenrui; Zhang, Baochang; Xie, Chunyu; Li, Hongguang; Liu, Chunhui; Han, Jungong
2018-03-20
Deep learning has recently attracted much attention due to its excellent performance in processing audio, image, and video data. However, few studies are devoted to the field of automatic modulation classification (AMC). It is one of the most well-known research topics in communication signal recognition and remains challenging for traditional methods due to complex disturbance from other sources. This paper proposes a heterogeneous deep model fusion (HDMF) method to solve the problem in a unified framework. The contributions include the following: (1) a convolutional neural network (CNN) and long short-term memory (LSTM) are combined in two different ways without prior knowledge involved; (2) a large database, including eleven types of single-carrier modulation signals with various noises as well as a fading channel, is collected with various signal-to-noise ratios (SNRs) based on a real geographical environment; and (3) experimental results demonstrate that HDMF is very capable of coping with the AMC problem, and achieves much better performance when compared with the independent network.
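The CNN/LSTM combination at the heart of HDMF can be illustrated with a minimal two-branch network. This is a hypothetical PyTorch sketch, not the authors' architecture: a 1-D convolutional branch and an LSTM branch both read the raw two-channel (I/Q) signal, and their features are fused by concatenation before an 11-way classification head (matching the eleven modulation types in the abstract).

```python
import torch
import torch.nn as nn

class HDMFSketch(nn.Module):
    """Illustrative CNN + LSTM fusion over raw I/Q samples
    (hypothetical layer sizes, not the published model)."""
    def __init__(self, n_classes=11):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # global pooling over time
        )
        self.lstm = nn.LSTM(input_size=2, hidden_size=16, batch_first=True)
        self.head = nn.Linear(16 + 16, n_classes)

    def forward(self, iq):                      # iq: (batch, 2, length)
        c = self.cnn(iq).squeeze(-1)            # (batch, 16)
        _, (h, _) = self.lstm(iq.transpose(1, 2))
        return self.head(torch.cat([c, h[-1]], dim=1))

logits = HDMFSketch()(torch.randn(4, 2, 128))   # logits: (4, 11)
```

The paper's other fusion variant (combining the two branches in a second way) would only change how the branch features are merged; concatenation is shown here as the simplest option.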
Kawarazuka, Nozomi; Locke, Catherine; McDougall, Cynthia; Kantor, Paula; Morgan, Miranda
2017-03-01
The demand for gender analysis is now increasingly orthodox in natural resource programming, including that for small-scale fisheries. Whilst the analysis of social-ecological resilience has made valuable contributions to integrating social dimensions into research and policy-making on natural resource management, it has so far demonstrated limited success in effectively integrating considerations of gender equity. This paper reviews the challenges in, and opportunities for, bringing a gender analysis together with social-ecological resilience analysis in the context of small-scale fisheries research in developing countries. We conclude that rather than searching for a single unifying framework for gender and resilience analysis, it will be more effective to pursue a plural solution in which closer engagement is fostered between analysis of gender and social-ecological resilience whilst preserving the strengths of each approach. This approach can make an important contribution to developing a better evidence base for small-scale fisheries management and policy.
Multivariate Methods for Meta-Analysis of Genetic Association Studies.
Dimou, Niki L; Pantavou, Katerina G; Braliou, Georgia G; Bagos, Pantelis G
2018-01-01
Multivariate meta-analysis of genetic association studies and genome-wide association studies has received remarkable attention as it improves the precision of the analysis. Here, we review, summarize and present in a unified framework methods for multivariate meta-analysis of genetic association studies and genome-wide association studies. Starting with the statistical methods used for robust analysis and genetic model selection, we present in brief univariate methods for meta-analysis and then scrutinize multivariate methodologies. Multivariate models of meta-analysis for single gene-disease association studies, including models for haplotype association studies, multiple linked polymorphisms and multiple outcomes, are discussed. The popular Mendelian randomization approach and special cases of meta-analysis addressing issues such as the assumption of the mode of inheritance, deviation from Hardy-Weinberg equilibrium and gene-environment interactions are also presented. All available methods are enriched with practical applications, and methodologies that could be developed in the future are discussed. Links to all available software implementing multivariate meta-analysis methods are also provided.
Spiked GBS: A unified, open platform for single marker genotyping and whole-genome profiling
USDA-ARS?s Scientific Manuscript database
In plant breeding, there are two primary applications for DNA markers in selection: 1) selection of known genes using a single marker assay (marker-assisted selection; MAS); and 2) whole-genome profiling and prediction (genomic selection; GS). Typically, marker platforms have addressed only one of t...
Global Science and Social Systems: The Essentials of Montessori Education and Peace Frameworks
ERIC Educational Resources Information Center
Kahn, David
2016-01-01
Inspired by Baiba Krumins-Grazzini's interdependencies lecture at NAMTA's Portland conference, David Kahn shows the unifying structures of the program that are rooted in the natural and social sciences. Through a connective web, these sciences explore the integration of all knowledge and lead to a philosophical view of life on earth, including…
String Theory: Big Problem for Small Size
ERIC Educational Resources Information Center
Sahoo, S.
2009-01-01
String theory is the most promising candidate theory for a unified description of all the fundamental forces that exist in nature. It provides a mathematical framework that combines quantum theory with Einstein's general theory of relativity. The typical size of a string is of the order of 10^-33 cm, called the Planck length. But due…
A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations
2015-08-13
In the domain of software-defined networks (SDN), we developed a declarative platform for implementing SDN protocols using declarative networking, and used it for building and debugging several SDN applications. Example-based SDN synthesis: the recent emergence of software-defined networks offers an opportunity to design…
At the Edge of Chaos: A New Paradigm for Social Work?
ERIC Educational Resources Information Center
Hudson, Christopher G.
2000-01-01
Reviews key concepts and applications of chaos theory and the broader complex systems theory in the context of general systems theory and the search for a unified conceptual framework for social work. Concludes that chaos theory shows promise as a solution to many problems posed by the now dated general systems approach. (DB)
The Qubit as Key to Quantum Physics Part II: Physical Realizations and Applications
ERIC Educational Resources Information Center
Dür, Wolfgang; Heusler, Stefan
2016-01-01
Using the simplest possible quantum system--the qubit--the fundamental concepts of quantum physics can be introduced. This highlights the common features of many different physical systems, and provides a unifying framework when teaching quantum physics at the high school or introductory level. In a previous "TPT" article and in a…
Gender Divide and Acceptance of Collaborative Web 2.0 Applications for Learning in Higher Education
ERIC Educational Resources Information Center
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
2013-01-01
Situated in the gender digital divide framework, this survey study investigated the role of computer anxiety in influencing female college students' perceptions toward Web 2.0 applications for learning. Based on 432 college students' "Web 2.0 for learning" perception ratings collected by relevant categories of "Unified Theory of Acceptance and Use…
The Administrator Training Program. A Model of Educational Leadership.
ERIC Educational Resources Information Center
Funderburg, Jean; And Others
This paper describes the Administrator Training Program (ATP), a joint venture between San Jose Unified School District and Stanford University. A discussion of the ATP's theoretical framework is followed by an outline of the structure and content of the program and a review of the ATP outcomes. Then the generic elements of the ATP model are…
ERIC Educational Resources Information Center
World Health Organization, Geneva (Switzerland).
The manual contains three classifications (impairments, disabilities, and handicaps), each relating to a different plane of experience consequent upon disease. Section 1 attempts to clarify the nature of health-related experiences by addressing response to acute and chronic illness; the unifying framework for classification (principal events in the…
ERIC Educational Resources Information Center
Salinas, Esther Charlotte
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this project examined collaboration around student achievement at the school site leadership level in the Pasadena Unified School District (PUSD). This project is one of three concurrent studies focused on collaboration around student achievement in the PUSD that include…
Unity of elementary particles and forces in higher dimensions.
Gogoladze, Ilia; Mimura, Yukihiro; Nandi, S
2003-10-03
The idea of unifying all the gauge and Yukawa forces as well as the gauge, Higgs, and fermionic matter particles naturally leads us to a simple gauge symmetry in higher dimensions with supersymmetry. We present a model in which, for the first time, such a unification is achieved in the framework of quantum field theory.
ERIC Educational Resources Information Center
Llamas, Sonia Rodarte
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this study examined collaboration around student achievement at the central office leadership level in the Pasadena Unified School District (PUSD). This study is one of three concurrent studies focused on collaboration around student achievement in the PUSD that include…
ERIC Educational Resources Information Center
Carruthers, Anthony Steven
2013-01-01
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this project examined collaboration around student achievement in the Pasadena Unified School District (PUSD) from the teacher perspective. As part of a tri-level study, two other projects examined collaboration around student achievement in PUSD from the perspectives of…
ERIC Educational Resources Information Center
Seung, Eulsun; Bryan, Lynn A.; Haugan, Mark P.
2012-01-01
In this study, we investigated the pedagogical content knowledge (PCK) that physics graduate teaching assistants (TAs) developed in the context of teaching a new introductory physics curriculum, "Matter and Interactions" ("M&I"). "M&I" is an innovative introductory physics course that emphasizes a unified framework for understanding the world and…
High Maneuverability Airframe: Investigation of Fin and Canard Sizing for Optimum Maneuverability
2014-09-01
overset grids (unified-grid); 5) total variation diminishing discretization based on a new multidimensional interpolation framework; 6) Riemann solvers to… 3.1.1 Solver. The double-precision solver of a commercially available code, CFD++ v12.1.1, was used for the simulations.
A Unified Framework for Bounded and Unbounded Numerical Estimation
ERIC Educational Resources Information Center
Kim, Dan; Opfer, John E.
2017-01-01
Representations of numerical value have been assessed by using bounded (e.g., 0-1,000) and unbounded (e.g., 0-∞) number-line tasks, with considerable debate regarding whether one or both tasks elicit unique cognitive strategies (e.g., addition or subtraction) and require unique cognitive models. To test this, we examined how well a mixed log-linear…
In Search of Optimal Cognitive Diagnostic Model(s) for ESL Grammar Test Data
ERIC Educational Resources Information Center
Yi, Yeon-Sook
2017-01-01
This study compares five cognitive diagnostic models in search of optimal one(s) for English as a Second Language grammar test data. Using a unified modeling framework that can represent specific models with proper constraints, the article first fit the full model (the log-linear cognitive diagnostic model, LCDM) and investigated which model…
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.
2017-12-01
Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. 
These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
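The core idea behind a standard query interface over heterogeneous velocity models can be sketched as follows. This is a hypothetical illustration of the design, not the actual UCVM API: registered regional 3-D models answer queries inside their geographic coverage, and a 1-D background model answers everywhere else, so client code sees a single query call regardless of the underlying model's format or projection. All class and parameter names below are invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class MaterialProperties:
    vp: float       # P-wave velocity (m/s)
    vs: float       # S-wave velocity (m/s)
    density: float  # kg/m^3

class UnifiedVelocityModel:
    """Sketch of a UCVM-style standard query interface: regional models
    register a coverage test plus a query function; queries outside all
    regional models fall back to a 1-D background model."""
    def __init__(self, background):
        self.background = background      # depth (m) -> MaterialProperties
        self.models = []                  # (covers(lon, lat), query) pairs

    def register(self, covers, query):
        self.models.append((covers, query))

    def query(self, lon, lat, depth):
        for covers, q in self.models:
            if covers(lon, lat):
                return q(lon, lat, depth)
        return self.background(depth)

# 1-D background: crude linear increase of velocity with depth (toy values).
bg = lambda depth: MaterialProperties(5000 + 0.5 * depth, 3000 + 0.3 * depth, 2700)
ucvm = UnifiedVelocityModel(bg)
# A toy regional model covering part of southern California.
ucvm.register(lambda lon, lat: -119 < lon < -117 and 33 < lat < 35,
              lambda lon, lat, depth: MaterialProperties(6000, 3500, 2800))

inside = ucvm.query(-118.0, 34.0, 1000.0)    # answered by the regional model
outside = ucvm.query(-150.0, 60.0, 1000.0)   # falls back to the background
```

Meshing and visualization tools can then be written once against this single interface, which is what lets researchers swap velocity models under a fixed simulation pipeline.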
Classifying clinical decision making: a unifying approach.
Buckingham, C D; Adams, A
2000-10-01
This is the first of two linked papers exploring decision making in nursing which integrate research evidence from different clinical and academic disciplines. Currently there are many decision-making theories, each with their own distinctive concepts and terminology, and there is a tendency for separate disciplines to view their own decision-making processes as unique. Identifying good nursing decisions and where improvements can be made is therefore problematic, and this can undermine clinical and organizational effectiveness, as well as nurses' professional status. Within the unifying framework of psychological classification, the overall aim of the two papers is to clarify and compare terms, concepts and processes identified in a diversity of decision-making theories, and to demonstrate their underlying similarities. It is argued that the range of explanations used across disciplines can usefully be re-conceptualized as classification behaviour. This paper explores problems arising from multiple theories of decision making being applied to separate clinical disciplines. Attention is given to detrimental effects on nursing practice within the context of multidisciplinary health-care organizations and the changing role of nurses. The different theories are outlined and difficulties in applying them to nursing decisions highlighted. An alternative approach based on a general model of classification is then presented in detail to introduce its terminology and the unifying framework for interpreting all types of decisions. The classification model is used to provide the context for relating alternative philosophical approaches and to define decision-making activities common to all clinical domains. This may benefit nurses by improving multidisciplinary collaboration and weakening clinical elitism.
Towards a Grand Unified Theory of sports performance.
Glazier, Paul S
2017-12-01
Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has predominantly been monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance (and, by elaboration, of sports science) based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase the explanatory power of applied research work; provide a stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.
Scalable large format 3D displays
NASA Astrophysics Data System (ADS)
Chang, Nelson L.; Damera-Venkata, Niranjan
2010-02-01
We present a general framework for the modeling and optimization of scalable large format 3-D displays using multiple projectors. Based on this framework, we derive algorithms that can robustly optimize the visual quality of an arbitrary combination of projectors (e.g. tiled, superimposed, combinations of the two) without manual adjustment. For the first time, the framework creates a unified paradigm that is agnostic to a particular configuration of projectors yet robustly optimizes for the brightness, contrast, and resolution of that configuration. In addition, we demonstrate that our algorithms support high resolution stereoscopic video at real-time interactive frame rates achieved on commodity graphics hardware. Through complementary polarization, the framework creates high quality multi-projector 3-D displays at low hardware and operational cost for a variety of applications including digital cinema, visualization, and command-and-control walls.
Unifying inflation with ΛCDM epoch in modified f(R) gravity consistent with Solar System tests
NASA Astrophysics Data System (ADS)
Nojiri, Shin'ichi; Odintsov, Sergei D.
2007-12-01
We suggest two realistic f(R) and one F(G) modified gravities which are consistent with local tests and cosmological bounds. The typical property of such theories is the presence of effective cosmological constant epochs in such a way that early-time inflation and late-time cosmic acceleration are naturally unified within a single model. It is shown that classical instability does not appear here and that the Newton law is respected. We also discuss the possible appearance of an anti-gravity regime and the related modification of the theory.
Driving Under the Influence (of Language).
Barrett, Daniel Paul; Bronikowski, Scott Alan; Yu, Haonan; Siskind, Jeffrey Mark
2017-06-09
We present a unified framework which supports grounding natural-language semantics in robotic driving. This framework supports acquisition (learning grounded meanings of nouns and prepositions from human sentential annotation of robotic driving paths), generation (using such acquired meanings to generate sentential description of new robotic driving paths), and comprehension (using such acquired meanings to support automated driving to accomplish navigational goals specified in natural language). We evaluate the performance of these three tasks by having independent human judges rate the semantic fidelity of the sentences associated with paths. Overall, machine performance is 74.9%, while the performance of human annotators is 83.8%.
Analysis model for personal eHealth solutions and services.
Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo
2010-01-01
In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions, such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and on generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions, and to support the development of the unified "Coper" concept for citizen empowerment.
Action Recommendation for Cyber Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhury, Sutanay; Rodriguez, Luke R.; Curtis, Darren S.
2015-09-01
This paper presents a unifying graph-based model for representing the infrastructure, behavior and missions of an enterprise. We describe how the model can be used to achieve resiliency against a wide class of failures and attacks. We introduce an algorithm for recommending resilience-establishing actions based on dynamic updates to the models. Without loss of generality, we show the effectiveness of the algorithm for preserving latency-based quality of service (QoS). Our models and recommendation algorithms are implemented in a software framework that we seek to release as an open-source framework for simulating resilient cyber systems.
Building social cognitive models of language change.
Hruschka, Daniel J; Christiansen, Morten H; Blythe, Richard A; Croft, William; Heggarty, Paul; Mufwene, Salikoko S; Pierrehumbert, Janet B; Poplack, Shana
2009-11-01
Studies of language change have begun to contribute to answering several pressing questions in cognitive sciences, including the origins of human language capacity, the social construction of cognition and the mechanisms underlying culture change in general. Here, we describe recent advances within a new emerging framework for the study of language change, one that models such change as an evolutionary process among competing linguistic variants. We argue that a crucial and unifying element of this framework is the use of probabilistic, data-driven models both to infer change and to compare competing claims about social and cognitive influences on language change.
Liu, Meiqin; Zhang, Senlin
2008-10-01
A unified neural network model termed the standard neural network model (SNNM) is advanced. Based on a robust L2-gain (i.e. robust H∞ performance) analysis of the SNNM with external disturbances, a state-feedback control law is designed for the SNNM to stabilize the closed-loop system and eliminate the effect of external disturbances. The control design constraints are shown to be a set of linear matrix inequalities (LMIs), which can be easily solved by various convex optimization algorithms (e.g. interior-point algorithms) to determine the control law. Most discrete-time recurrent neural networks (RNNs) and discrete-time nonlinear systems modelled by neural networks or Takagi-Sugeno (T-S) fuzzy models can be transformed into SNNMs, so that robust H∞ performance analysis or robust H∞ controller synthesis can be carried out in a unified SNNM framework. Finally, some examples are presented to illustrate the wide applicability of SNNMs to nonlinear systems, and the proposed approach is compared with related methods reported in the literature.
NASA Technical Reports Server (NTRS)
Chung, Ching-Luan
1990-01-01
The term trajectory planning has been used to refer to the process of determining the time history of joint trajectory of each joint variable corresponding to a specified trajectory of the end effector. The trajectory planning problem was solved as a purely kinematic problem. The drawback is that there is no guarantee that the actuators can deliver the effort necessary to track the planned trajectory. To overcome this limitation, a motion planning approach which addresses the kinematics, dynamics, and feedback control of a manipulator in a unified framework was developed. Actuator constraints are taken into account explicitly and a priori in the synthesis of the feedback control law. Therefore the result of applying the motion planning approach described is not only the determination of the entire set of joint trajectories but also a complete specification of the feedback control strategy which would yield these joint trajectories without violating actuator constraints. The effectiveness of the unified motion planning approach is demonstrated on two problems which are of practical interest in manipulator robotics.
NASA Astrophysics Data System (ADS)
Frauendorf, S.
2018-04-01
The key elements of the Unified Model are reviewed. The microscopic derivation of the Bohr Hamiltonian by means of adiabatic time-dependent mean-field theory is presented. By checking against experimental data, the limitations of the Unified Model are delineated. The description of the strong coupling between the rotational and intrinsic degrees of freedom in the framework of the rotating mean field is presented from a conceptual point of view. The classification of rotational bands as configurations of rotating quasiparticles is introduced. The occurrence of uniform rotation about an axis that differs from the principal axes of the nuclear density distribution is discussed. The physics behind this tilted-axis rotation, unknown in molecular physics, is explained on a basic level. The new symmetries of the rotating mean field that arise from the various orientations of the angular momentum vector with respect to the triaxial nuclear density distribution, and their manifestation in the level sequence of rotational bands, are discussed. Resulting phenomena, such as transverse wobbling, rotational chirality, magnetic rotation and band termination, are discussed. Using the concept of spontaneous symmetry breaking, the microscopic underpinning of the rotational degrees of freedom is refined.
NASA Astrophysics Data System (ADS)
Fitch, W. Tecumseh
2014-09-01
Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. 
I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
Physical theories, eternal inflation, and the quantum universe
NASA Astrophysics Data System (ADS)
Nomura, Yasunori
2011-11-01
Infinities in eternal inflation have long plagued cosmology, making any predictions highly sensitive to how they are regulated. The problem exists already at the level of semi-classical general relativity, and has a priori nothing to do with quantum gravity. On the other hand, we know that certain problems in semi-classical gravity, for example the physics of black holes and their evaporation, have led to an understanding of surprising quantum natures of spacetime and gravity, such as the holographic principle and horizon complementarity. In this paper, we present a framework in which well-defined predictions are obtained in an eternally inflating multiverse, based on the principles of quantum mechanics. We propose that the entire multiverse is described purely from the viewpoint of a single "observer," who describes the world as a quantum state defined on his/her past light cones bounded by the (stretched) apparent horizons. We find that quantum mechanics plays an essential role in regulating infinities. The framework is "gauge invariant," i.e. predictions do not depend on how spacetime is parametrized, as it should be in a theory of quantum gravity. Our framework provides a fully unified treatment of quantum measurement processes and the multiverse. We conclude that the eternally inflating multiverse and the many worlds in quantum mechanics are the same. Other important implications include: global spacetime can be viewed as a derived concept; the multiverse is a transient phenomenon occurring while the world relaxes into a supersymmetric Minkowski state. We also present a model of "initial conditions" for the multiverse. By extrapolating our framework to the extreme, we arrive at a picture in which the entire multiverse is a fluctuation in the stationary, fractal "mega-multiverse," in which an infinite sequence of multiverse productions occurs.
The framework discussed here does not suffer from problems/paradoxes plaguing other measures proposed earlier, such as the youngness paradox and the Boltzmann brain problem.
NASA Technical Reports Server (NTRS)
Russell, D. L.
1983-01-01
Various aspects of the control theory of hyperbolic systems, including controllability, stabilization, control canonical form theory, etc., are reviewed. To allow a unified and not excessively technical treatment, attention is restricted to the case of a single space variable. A newly developed procedure of canonical augmentation is discussed.
The Dynamic Brain: From Spiking Neurons to Neural Masses and Cortical Fields
Deco, Gustavo; Jirsa, Viktor K.; Robinson, Peter A.; Breakspear, Michael; Friston, Karl
2008-01-01
The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space–time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain; the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI), electroencephalogram (EEG), and magnetoencephalogram (MEG). Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the physical sciences. 
PMID:18769680
2007-11-01
Florea, Anne-Laure Jousselme, Éloi Bossé; DRDC Valcartier TR 2003-319; Defence R&D Canada – Valcartier; November 2007. (Only table-of-contents fragments survive extraction: sections on imprecise information, on uncertain and imprecise information, a figure on the model of imperfect information proposed by Philippe Smets, and a figure on the process of information modelling.)
ERIC Educational Resources Information Center
Price, Cristofer; Unlu, Fatih
2014-01-01
The Comparative Short Interrupted Time Series (C-SITS) design is a frequently employed quasi-experimental method in which the pre- and post-intervention changes observed in the outcome levels of a treatment group are compared with those of a comparison group, with the difference between the former and the latter attributed to the treatment. The…
ERIC Educational Resources Information Center
Marsh, Herbert W.; Pekrun, Reinhard; Murayama, Kou; Arens, A. Katrin; Parker, Philip D.; Guo, Jiesi; Dicke, Theresa
2018-01-01
Our newly proposed integrated academic self-concept model integrates 3 major theories of academic self-concept formation and developmental perspectives into a unified conceptual and methodological framework. Relations among math self-concept (MSC), school grades, test scores, and school-level contextual effects over 6 years, from the end of…
ERIC Educational Resources Information Center
Rupp, Andre A.
2012-01-01
In the focus article of this issue, von Davier, Naemi, and Roberts essentially coupled: (1) a short methodological review of structural similarities of latent variable models with discrete and continuous latent variables; and (2) 2 short empirical case studies that show how these models can be applied to real, rather than simulated, large-scale…
ERIC Educational Resources Information Center
Perla, Rocco J.; Carifio, James
2011-01-01
Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…
ERIC Educational Resources Information Center
Deng, Shengli; Lin, Yanqing; Liu, Yong; Chen, Xiaoyu; Li, Hongxiu
2017-01-01
Introduction: Personality and trust have been found to be important precursors of information-sharing behaviour, but little is known about how these factors interact with each other in shaping information-sharing behaviour. By integrating both trust and user personality into a unified research framework, this study examines how trust mediates the…
ERIC Educational Resources Information Center
Berkeley Unified School District, CA. Asian American Bilingual Center.
This Pilipino teacher's guide is part of the Berkeley (California) Unified School District Asian American Bilingual Center's effort to foster the total growth of the child. To facilitate that growth, the Center has selected an interdisciplinary approach to curriculum development. Social studies themes and concepts provide the framework within which all…
A Unified Approach toward the Development of Swedish as L2: A Processability Account.
ERIC Educational Resources Information Center
Pienemann, Manfred; Hakansson, Gisela
1999-01-01
Aims to put the body of research on Swedish as a second language (SSL) into one coherent framework and to test the predictions deriving from processability theory for Swedish against this empirical database. Surveys the 14 most prominent research projects on SSL, covering wide areas of syntax and morphology in longitudinal and cross-sectional…
ERIC Educational Resources Information Center
Calabrese, William R.; Rudick, Monica M.; Simms, Leonard J.; Clark, Lee Anna
2012-01-01
Recently, integrative, hierarchical models of personality and personality disorder (PD)--such as the Big Three, Big Four, and Big Five trait models--have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality…
ERIC Educational Resources Information Center
Soto, Julio G.; Everhart, Jerry
2016-01-01
Biology faculty at San José State University developed, piloted, implemented, and assessed a freshman course sequence based on the macro-to-micro teaching approach that was team-taught and organized around unifying themes. Content learning assessment drove the conceptual framework of our course sequence. Content student learning increased…
ERIC Educational Resources Information Center
Clark, Elaine
2009-01-01
This article reports on the launching of the Revans Academy for Action Learning and Research at Manchester Business School on 26 November 2008. The goal of the Academy is to foster the development of action learning as a unifying framework within Manchester Business School. Its goal is to provide a hub for dialogue, collaboration, exploitation and…
ERIC Educational Resources Information Center
Limanauskiene, Virginija; Stuikys, Vytautas
2009-01-01
With the expansion of e-learning, the understanding and evaluation of already-created e-learning environments is becoming an extremely important issue. One way of dealing with the problem is the analysis of case studies, i.e., already-created environments, from the reuse perspective. The paper presents a general framework and model to assess UNITE, the…
ERIC Educational Resources Information Center
Jagodzinski, Wolfgang
2010-01-01
This paper investigates the influence of the economic, social, and cultural variables on life satisfaction in Asia and Europe. The second section sets a unifying theoretical framework for all three domains by defining life satisfaction as a function of aspirations and expectations which in turn are affected by micro- and macro-level variables. On…
ERIC Educational Resources Information Center
Cavanagh, Robert F.
2015-01-01
This study employed the capabilities-expectations model of engagement in classroom learning based on bio-ecological frameworks of intellectual development and flow theory. According to the capabilities-expectations model, engagement requires a balance between the capabilities of a student for learning in a particular situation and what is expected…
Optimization-Based Robust Nonlinear Control
2006-08-01
ABSTRACT New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mouchet, Amaury, E-mail: mouchet@phys.univ-tours.fr
The Noether theorem connecting symmetries and conservation laws can be applied directly in a Hamiltonian framework without using any intermediate Lagrangian formulation. This requires a careful discussion about the invariance of the boundary conditions under a canonical transformation and this paper proposes to address this issue. Then, the unified treatment of Hamiltonian systems offered by Noether’s approach is illustrated on several examples, including classical field theory and quantum dynamics.
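In the Hamiltonian setting the abstract describes, the Noether correspondence reduces to a statement about Poisson brackets. A standard textbook form (our summary, in conventional notation rather than the paper's) for a time-independent generator G(q, p) is:

```latex
\frac{dG}{dt} = \{G, H\}
\qquad\Longrightarrow\qquad
\{G, H\} = 0 \;\Longleftrightarrow\; G \text{ is a constant of the motion,}
```

where the same bracket condition expresses that the canonical transformation generated by G leaves H invariant; this is the point at which the paper's careful treatment of boundary conditions under canonical transformations becomes relevant.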
Computer-Aided Group Problem Solving for Unified Life Cycle Engineering (ULCE)
1989-02-01
defining the problem, generating alternative solutions, evaluating alternatives, selecting alternatives, and implementing the solution. Systems...specialist in group dynamics, assists the group in formulating the problem and selecting a model framework. The analyst provides the group with computer...allocating resources, evaluating and selecting options, making judgments explicit, and analyzing dynamic systems. c. University of Rhode Island Drs. Geoffery
Hernández, Moisés; Guerrero, Ginés D.; Cecilia, José M.; García, José M.; Inuggi, Alberto; Jbabdi, Saad; Behrens, Timothy E. J.; Sotiropoulos, Stamatios N.
2013-01-01
With the performance of central processing units (CPUs) having effectively reached a limit, parallel processing offers an alternative for applications with high computational demands. Modern graphics processing units (GPUs) are massively parallel processors that can execute simultaneously thousands of light-weight processes. In this study, we propose and implement a parallel GPU-based design of a popular method that is used for the analysis of brain magnetic resonance imaging (MRI). More specifically, we are concerned with a model-based approach for extracting tissue structural information from diffusion-weighted (DW) MRI data. DW-MRI offers, through tractography approaches, the only way to study brain structural connectivity, non-invasively and in-vivo. We parallelise the Bayesian inference framework for the ball & stick model, as it is implemented in the tractography toolbox of the popular FSL software package (University of Oxford). For our implementation, we utilise the Compute Unified Device Architecture (CUDA) programming model. We show that the parameter estimation, performed through Markov Chain Monte Carlo (MCMC), is accelerated by at least two orders of magnitude, when comparing a single GPU with the respective sequential single-core CPU version. We also illustrate similar speed-up factors (up to 120x) when comparing a multi-GPU with a multi-CPU implementation. PMID:23658616
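The per-voxel parameter estimation that the authors accelerate is Markov Chain Monte Carlo. A minimal random-walk Metropolis sampler of the kind being parallelized might look as follows (an illustrative sketch with a toy Gaussian posterior, not the FSL implementation):

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_samples=5000, step=0.5, seed=0):
    """Random-walk Metropolis sampler (illustrative sketch; the paper
    parallelises many such per-voxel chains on a GPU)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    samples = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# Toy posterior: an isotropic 2D Gaussian; real use would plug in the
# ball & stick likelihood evaluated against the DW-MRI signal.
samples = metropolis_hastings(lambda t: -0.5 * np.sum(t ** 2), [3.0, -3.0])
print(samples[2000:].mean(axis=0))  # posterior mean estimate, after burn-in
```

Because each voxel's chain is independent, thousands of such samplers can run concurrently as light-weight GPU threads, which is the source of the reported speed-ups.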
Breaking beta: deconstructing the parasite transmission function
McCallum, Hamish; Fenton, Andy; Hudson, Peter J.; Lee, Brian; Levick, Beth; Norman, Rachel
2017-01-01
Transmission is a fundamental step in the life cycle of every parasite but it is also one of the most challenging processes to model and quantify. In most host–parasite models, the transmission process is encapsulated by a single parameter β. Many different biological processes and interactions, acting on both hosts and infectious organisms, are subsumed in this single term. There are, however, at least two undesirable consequences of this high level of abstraction. First, nonlinearities and heterogeneities that can be critical to the dynamic behaviour of infections are poorly represented; second, estimating the transmission coefficient β from field data is often very difficult. In this paper, we present a conceptual model, which breaks the transmission process into its component parts. This deconstruction enables us to identify circumstances that generate nonlinearities in transmission, with potential implications for emergent transmission behaviour at individual and population scales. Such behaviour cannot be explained by the traditional linear transmission frameworks. The deconstruction also provides a clearer link to the empirical estimation of key components of transmission and enables the construction of flexible models that produce a unified understanding of the spread of both micro- and macro-parasite infectious disease agents. This article is part of the themed issue ‘Opening the black box: re-examining the ecology and evolution of parasite transmission’. PMID:28289252
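The deconstruction of β can be sketched in a few lines (our own toy decomposition with hypothetical parameter names): splitting the aggregate transmission term into a contact rate and a per-contact infection probability immediately exposes the classic density- versus frequency-dependent assumptions hidden inside a single β.

```python
def transmission_rate(S, I, N, c0, p, mode="density"):
    """Break the aggregate transmission term (beta * S * I) into parts:
    a host contact rate and a per-contact infection probability p.
    The names and the two contact models are our own illustration."""
    if mode == "density":        # contacts scale with host density
        contacts = c0 * N
    else:                        # "frequency": contacts independent of N
        contacts = c0
    # new infections per unit time: contacts * P(infection) * S * (I / N)
    return contacts * p * S * I / N

# The same component parts yield very different aggregate rates:
print(transmission_rate(S=50, I=10, N=100, c0=2.0, p=0.1, mode="density"))
print(transmission_rate(S=50, I=10, N=100, c0=2.0, p=0.1, mode="frequency"))
```

Further component processes (parasite survival outside the host, dose-response curves, contact heterogeneity) can be substituted into `contacts` or `p`, which is where the nonlinearities discussed in the paper arise.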
Phylogenetic profiles reveal structural/functional determinants of TRPC3 signal-sensing antennae
Ko, Kyung Dae; Bhardwaj, Gaurav; Hong, Yoojin; Chang, Gue Su; Kiselyov, Kirill
2009-01-01
Biochemical assessment of channel structure/function is incredibly challenging. Developing computational tools that provide these data would enable translational research, accelerating mechanistic experimentation for the bench scientist studying ion channels. Starting with the premise that protein sequence encodes information about structure, function and evolution (SF&E), we developed a unified framework for inferring SF&E from sequence information using a knowledge-based approach. The Gestalt Domain Detection Algorithm-Basic Local Alignment Tool (GDDA-BLAST) provides phylogenetic profiles that can model, ab initio, SF&E relationships of biological sequences at the whole protein, single domain and single-amino acid level.1,2 In our recent paper,4 we have applied GDDA-BLAST analysis to study canonical TRP (TRPC) channels1 and empirically validated predicted lipid-binding and trafficking activities contained within the TRPC3 TRP_2 domain of unknown function. Overall, our in silico, in vitro, and in vivo experiments support a model in which TRPC3 has signal-sensing antennae which are adorned with lipid-binding, trafficking and calmodulin regulatory domains. In this Addendum, we correlate our functional domain analysis with the cryo-EM structure of TRPC3.3 In addition, we synthesize recent studies with our new findings to provide a refined model on the mechanism(s) of TRPC3 activation/deactivation. PMID:19704910
Temporal cognition: Connecting subjective time to perception, attention, and memory.
Matthews, William J; Meck, Warren H
2016-08-01
Time is a universal psychological dimension, but time perception has often been studied and discussed in relative isolation. Increasingly, researchers are searching for unifying principles and integrated models that link time perception to other domains. In this review, we survey the links between temporal cognition and other psychological processes. Specifically, we describe how subjective duration is affected by nontemporal stimulus properties (perception), the allocation of processing resources (attention), and past experience with the stimulus (memory). We show that many of these connections instantiate a "processing principle," according to which perceived time is positively related to perceptual vividity and the ease of extracting information from the stimulus. This empirical generalization generates testable predictions and provides a starting-point for integrated theoretical frameworks. By outlining some of the links between temporal cognition and other domains, and by providing a unifying principle for understanding these effects, we hope to encourage time-perception researchers to situate their work within broader theoretical frameworks, and that researchers from other fields will be inspired to apply their insights, techniques, and theorizing to improve our understanding of the representation and judgment of time. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Visscher, Peter M; Goddard, Michael E
2015-01-01
Heritability is a population parameter of importance in evolution, plant and animal breeding, and human medical genetics. It can be estimated using pedigree designs and, more recently, using relationships estimated from markers. We derive the sampling variance of the estimate of heritability for a wide range of experimental designs, assuming that estimation is by maximum likelihood and that the resemblance between relatives is solely due to additive genetic variation. We show that well-known results for balanced designs are special cases of a more general unified framework. For pedigree designs, the sampling variance is inversely proportional to the variance of relationship in the pedigree and it is proportional to 1/N, whereas for population samples it is approximately proportional to 1/N(2), where N is the sample size. Variation in relatedness is a key parameter in the quantification of the sampling variance of heritability. Consequently, the sampling variance is high for populations with large recent effective population size (e.g., humans) because this causes low variation in relationship. However, even using human population samples, low sampling variance is possible with high N. Copyright © 2015 by the Genetics Society of America.
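The scaling results in the abstract can be turned into a back-of-envelope calculator (a sketch of the stated proportionality, with the sampling variance taken as roughly 2 / (N² · var(relationship)) for population samples; the constant and the illustrative relatedness variance are our assumptions, not the paper's exact derivation):

```python
import math

def greml_sampling_sd(n, var_relatedness):
    """Approximate standard error of a marker-based heritability estimate,
    using the ~ sqrt(2 / (N^2 * var(relationship))) scaling described in
    the abstract (a back-of-envelope sketch, not the exact formula)."""
    return math.sqrt(2.0 / (n ** 2 * var_relatedness))

# Conventionally unrelated humans have tiny variance in genomic relatedness
# (order 2e-5 is assumed here), so only large N gives a usable standard error.
for n in (3000, 10000, 30000):
    print(n, round(greml_sampling_sd(n, 2e-5), 3))
```

The 1/N dependence of the standard error (1/N² for the variance) is visible directly: tripling the sample size cuts the standard error to a third.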
Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.
Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan
2017-09-01
In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.
Unifying physical concepts of reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, T.L.
1983-08-01
Physics may be characterized as the science of matter and energy. It anchors the two ends of the frontiers of science: the frontier of the very small and the frontier of the very large. All of the phenomena that we observe and study at the frontiers of science - all external experiences - are manifestations of matter and energy. One may, therefore, use physics to exemplify both the diversity and unity of science. This theme will be developed in two separate examples: first by sketching, very briefly, the historical origins of the frontiers of the very small and very large and the converging unity of these two frontiers; and then by describing certain unifying concepts that play a central role in physics and provide a framework for relating developments in different sciences.
Towards a unified theory of health-disease: II. Holopathogenesis
Almeida-Filho, Naomar
2014-01-01
This article presents a systematic framework for modeling several classes of illness-sickness-disease named as Holopathogenesis. Holopathogenesis is defined as processes of over-determination of diseases and related conditions taken as a whole, comprising selected facets of the complex object Health. First, a conceptual background of Holopathogenesis is presented as a series of significant interfaces (biomolecular-immunological, physiopathological-clinical, epidemiological-ecosocial). Second, propositions derived from Holopathogenesis are introduced in order to allow drawing the disease-illness-sickness complex as a hierarchical network of networks. Third, a formalization of intra- and inter-level correspondences, over-determination processes, effects and links of Holopathogenesis models is proposed. Finally, the Holopathogenesis frame is evaluated as a comprehensive theoretical pathology taken as a preliminary step towards a unified theory of health-disease. PMID:24897040
A unified perspective on robot control - The energy Lyapunov function approach
NASA Technical Reports Server (NTRS)
Wen, John T.
1990-01-01
A unified framework for the stability analysis of robot tracking control is presented. By using an energy-motivated Lyapunov function candidate, closed-loop stability is shown for a large family of control laws sharing a common structure of proportional and derivative feedback and a model-based feedforward. The feedforward can be zero, partial or complete linearized dynamics, partial or complete nonlinear dynamics, or linearized or nonlinear dynamics with parameter adaptation. As a result, the dichotomous approaches to the robot control problem based on open-loop linearization and nonlinear Lyapunov analysis are both included in this treatment. Furthermore, quantitative estimates of the trade-offs between different schemes are derived in terms of tracking performance, steady-state error, domain of convergence, real-time computation load, and required a priori model information.
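The common control-law structure described here, model-based feedforward plus proportional-derivative feedback, can be sketched as follows (a minimal illustration; the gain values and single-joint setup are our own):

```python
import numpy as np

def pd_plus_feedforward(q, qd, q_des, qd_des, tau_ff, Kp, Kd):
    """Common structure from the abstract: a model-based feedforward term
    plus proportional-derivative feedback on the tracking error. tau_ff may
    be zero, (partially) linearized dynamics, or full nonlinear dynamics."""
    e = q_des - q          # position error
    ed = qd_des - qd       # velocity error
    return tau_ff + Kp @ e + Kd @ ed

# Single-joint example with zero feedforward (pure PD); gains are arbitrary.
Kp, Kd = np.diag([20.0]), np.diag([5.0])
tau = pd_plus_feedforward(np.array([0.1]), np.array([0.0]),
                          np.array([0.5]), np.array([0.0]),
                          np.zeros(1), Kp, Kd)
print(tau)
```

Swapping in different `tau_ff` terms (zero, linearized, full nonlinear, adaptive) reproduces the family of control laws whose closed-loop stability the paper analyzes with a single energy Lyapunov function.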
Hill, Christopher S
2018-05-29
Although there is much talk in various literatures of streams of consciousness, and most of us have an intuitive understanding of such talk, we are far from having a full grasp of what it is that unifies streams of consciousness, binding together the individual experiences that serve as their constituents. In recent years, discussion of this topic has been principally concerned with synchronic unity of consciousness: the form of unity that is exhibited by momentary states of consciousness, or in other words, by time slices or temporal segments of streams. There are two main questions about synchronic unity. First, what is its scope? Are the simultaneous experiences of a single subject necessarily unified? Generally but not necessarily unified? Sometimes unified? And second, what is the nature of synchronic unity? Is it a fundamental phenomenon, and if not, what are the more basic phenomena that constitute it? This essay reviews recent work on these questions, and provides reasons for preferring some answers to others. This article is categorized under: Philosophy > Consciousness; Philosophy > Foundations of Cognitive Science; Philosophy > Metaphysics. © 2018 Wiley Periodicals, Inc.
The CE/SE Method: a CFD Framework for the Challenges of the New Millennium
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung; Yu, Sheng-Tao
2001-01-01
The space-time conservation element and solution element (CE/SE) method, which was originated and is continuously being developed at NASA Glenn Research Center, is a high-resolution, genuinely multidimensional and unstructured-mesh compatible numerical method for solving conservation laws. Since its inception in 1991, the CE/SE method has been used to obtain highly accurate numerical solutions for 1D, 2D and 3D flow problems involving shocks, contact discontinuities, acoustic waves, vortices, shock/acoustic waves/vortices interactions, shock/boundary layers interactions and chemical reactions. Without the aid of preconditioning or other special techniques, it has been applied to both steady and unsteady flows with speeds ranging from Mach number = 0.00288 to 10. In addition, the method has unique features that allow for (i) the use of very simple non-reflecting boundary conditions, and (ii) a unified wall boundary treatment for viscous and inviscid flows. The CE/SE method was developed with the conviction that, with a solid foundation in physics, a robust, coherent and accurate numerical framework can be built without involving overly complex mathematics. As a result, the method was constructed using a set of design principles that facilitate simplicity, robustness and accuracy. The most important among them are: (i) enforcing both local and global flux conservation in space and time, with flux evaluation at an interface being an integral part of the solution procedure and requiring no interpolation or extrapolation; (ii) unifying space and time and treating them as a single entity; and (iii) requiring that a numerical scheme be built from a nondissipative core scheme such that the numerical dissipation can be effectively controlled and, as a result, will not overwhelm the physical dissipation. Part I of the workshop will be devoted to a discussion of these principles along with a description of how the 1D, 2D and 3D CE/SE schemes are constructed.
In Part II, various applications of the CE/SE method, particularly those involving chemical reactions and acoustics, will be presented. The workshop will be concluded with a sketch of the future research directions.
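Design principle (i) above, local flux conservation implying global conservation, can be illustrated with a minimal first-order finite-volume update for 1D linear advection. This is a generic conservative scheme, not the CE/SE discretization itself; the upwind interface flux and periodic boundary conditions are assumptions for illustration only. Because each cell is updated solely through fluxes at its interfaces, the interface contributions telescope and the total integral of the solution is conserved to round-off:

```python
import numpy as np

def advect_conservative(u, a, dx, dt, steps):
    """First-order conservative upwind update for u_t + (a*u)_x = 0, a > 0,
    on a periodic grid. Local flux conservation at every interface makes
    the global sum of u invariant (interface fluxes telescope)."""
    u = u.copy()
    for _ in range(steps):
        # interface flux F_{i-1/2} = a * u_{i-1} (upwind, assuming a > 0)
        flux = a * np.roll(u, 1)
        # u_i <- u_i + (dt/dx) * (F_{i-1/2} - F_{i+1/2})
        u += dt / dx * (flux - np.roll(flux, -1))
    return u

u0 = np.zeros(100)
u0[40:60] = 1.0                       # square pulse
u1 = advect_conservative(u0, a=1.0, dx=1.0, dt=0.5, steps=50)
print(np.isclose(u0.sum(), u1.sum()))  # True: total mass is conserved
```

The pulse is advected (and diffused by the first-order scheme), but its integral is unchanged, which is exactly the property the CE/SE design principles demand of the core discretization.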
Muon g-2 and dark matter suggest nonuniversal gaugino masses: SU(5)×A4 case study at the LHC
NASA Astrophysics Data System (ADS)
Belyaev, Alexander S.; King, Steve F.; Schaefers, Patrick B.
2018-06-01
We argue that in order to account for the muon anomalous magnetic moment g-2, dark matter and LHC data, nonuniversal gaugino masses Mi at the high scale are required in the framework of the minimal supersymmetric standard model. We also need a right-handed smuon μ̃R with a mass around 100 GeV, evading LHC searches due to the proximity of a neutralino χ̃1^0 several GeV lighter, which allows successful dark matter. We discuss such a scenario in the framework of an SU(5) grand unified theory (GUT) combined with A4 family symmetry, where the three 5̄ representations form a single triplet of A4 with a unified soft mass mF, while the three 10 representations are singlets of A4 with independent soft masses mT1, mT2, mT3. Although mT2 (and hence μ̃R) may be light, the muon g-2 and relic density also require light M1 ≃ 250 GeV, which is incompatible with universal gaugino masses due to LHC constraints on M2 and M3 arising from gaugino searches. After showing that universal gaugino masses M1/2 at the GUT scale are excluded by gluino searches, we provide a series of benchmarks which show that while M1 = M2 ≪ M3 is in tension with 8 and 13 TeV LHC data, M1
NASA Astrophysics Data System (ADS)
Peng, Ao-Ping; Li, Zhi-Hui; Wu, Jun-Lin; Jiang, Xin-Yu
2016-12-01
Based on the previous research on the Gas-Kinetic Unified Algorithm (GKUA) for flows ranging from highly rarefied free-molecule through transitional to continuum regimes, a new implicit scheme of the cell-centered finite volume method is presented for directly solving the unified Boltzmann model equation covering various flow regimes. In view of the difficulty of generating a single-block grid system of high quality for complex irregular bodies, a multi-block docking grid generation method is designed on the basis of data transmission between blocks, and a data structure is constructed for processing arbitrary connection relations between blocks with high efficiency and reliability. As a result, the gas-kinetic unified algorithm with the implicit scheme and multi-block docking grid has been established for the first time and used to solve reentry flow problems around multi-bodies covering all flow regimes, with Knudsen numbers ranging from 10 to 3.7E-6. The implicit and explicit schemes are applied to computing and analyzing the supersonic flows in the near-continuum and continuum regimes around a circular cylinder, with careful comparison with each other. It is shown that the present algorithm and modelling possess much higher computational efficiency and faster convergence. Flow problems involving two and three side-by-side cylinders are simulated from highly rarefied to near-continuum flow regimes, and the computed results are found to be in good agreement with related DSMC simulations and theoretical analysis solutions, which verifies the good accuracy and reliability of the present method. It is observed that the smaller the spacing of the multi-body, the greater the cylindrical throat obstruction, the more asymmetrical the flow field around each single body, and the larger the normal force coefficient.
In the near-continuum transitional flow regime of near-space flight environments, when the spacing of the multi-body increases to six times the diameter of the single body, the interference effects of the multi-bodies become negligible. Computational practice has confirmed that the present method is feasible for computing the aerodynamics and revealing the flow mechanisms around complex multi-body vehicles covering all flow regimes, from the gas-kinetic point of view of solving the unified Boltzmann model equation for the velocity distribution function.
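The multi-block docking idea, storing arbitrary connection relations between grid blocks so that boundary data can be exchanged across interfaces in one pass, can be sketched with a toy 1D ghost-cell layout. All names and the tuple-based link table are hypothetical illustrations, not the authors' actual data structure:

```python
import numpy as np

class MultiBlockGrid:
    """Toy 1D multi-block grid, one ghost cell at each block end.

    Arbitrary block-to-block docking relations are recorded as
    (donor_block, donor_index, receiver_block, ghost_index) tuples,
    so any connection topology is serviced by a single exchange loop.
    """
    def __init__(self, block_sizes):
        # each block: interior cells plus one ghost cell at each end
        self.blocks = [np.zeros(n + 2) for n in block_sizes]
        self.links = []

    def dock(self, donor, donor_idx, receiver, ghost_idx):
        self.links.append((donor, donor_idx, receiver, ghost_idx))

    def exchange(self):
        # copy donor interior data into receiver ghost cells
        for d, di, r, gi in self.links:
            self.blocks[r][gi] = self.blocks[d][di]

grid = MultiBlockGrid([4, 4])
grid.blocks[0][1:-1] = [1, 2, 3, 4]
grid.blocks[1][1:-1] = [5, 6, 7, 8]
# dock block 0's right edge into block 1's left ghost cell, and vice versa
grid.dock(donor=0, donor_idx=-2, receiver=1, ghost_idx=0)
grid.dock(donor=1, donor_idx=1, receiver=0, ghost_idx=-1)
grid.exchange()
print(grid.blocks[1][0], grid.blocks[0][-1])  # 4.0 5.0
```

Keeping the connectivity in an explicit link table, rather than hard-coding neighbor relations, is what allows irregular multi-body configurations (e.g. two or three side-by-side cylinders) to reuse the same exchange code.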
Self-evaluation of decision-making: A general Bayesian framework for metacognitive computation.
Fleming, Stephen M; Daw, Nathaniel D
2017-01-01
People are often aware of their mistakes, and report levels of confidence in their choices that correlate with objective performance. These metacognitive assessments of decision quality are important for the guidance of behavior, particularly when external feedback is absent or sporadic. However, a computational framework that accounts for both confidence and error detection is lacking. In addition, accounts of dissociations between performance and metacognition have often relied on ad hoc assumptions, precluding a unified account of intact and impaired self-evaluation. Here we present a general Bayesian framework in which self-evaluation is cast as a "second-order" inference on a coupled but distinct decision system, computationally equivalent to inferring the performance of another actor. Second-order computation may ensue whenever there is a separation between internal states supporting decisions and confidence estimates over space and/or time. We contrast second-order computation against simpler first-order models in which the same internal state supports both decisions and confidence estimates. Through simulations we show that second-order computation provides a unified account of different types of self-evaluation often considered in separate literatures, such as confidence and error detection, and generates novel predictions about the contribution of one's own actions to metacognitive judgments. In addition, the model provides insight into why subjects' metacognition may sometimes be better or worse than task performance. We suggest that second-order computation may underpin self-evaluative judgments across a range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
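The second-order idea, confidence computed by inferring the accuracy of a coupled but distinct decision variable, "computationally equivalent to inferring the performance of another actor," can be sketched with a small Monte Carlo. The Gaussian signal-detection setup and all parameter values below are illustrative assumptions, not the authors' fitted model:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

d, sig_a, sig_c = 1.0, 1.0, 1.0    # assumed signal strength and noise levels
n = 100_000
s = rng.choice([-1.0, 1.0], n)                    # true stimulus
x_act = s * d + sig_a * rng.standard_normal(n)    # decision system's sample
x_conf = s * d + sig_c * rng.standard_normal(n)   # distinct confidence sample
choice = np.sign(x_act)

# Second-order confidence: P(choice correct | x_conf, choice), treating the
# decision system like another actor whose internal sample is unobserved.
phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))      # standard normal CDF
p_hit = phi(d / sig_a)                            # P(choice = s | s)
# posterior that the stimulus matches the choice, given the confidence sample
p_s_eq_choice = 1 / (1 + np.exp(-2 * d * choice * x_conf / sig_c**2))
conf = (p_hit * p_s_eq_choice /
        (p_hit * p_s_eq_choice + (1 - p_hit) * (1 - p_s_eq_choice)))

correct = choice == s
# error detection emerges: confidence is lower, on average, on error trials
print(conf[correct].mean() > conf[~correct].mean())  # True
```

Because the confidence sample is distinct from the decision sample, the two can disagree on individual trials, yielding low confidence after an objectively wrong choice (error detection) as well as occasional dissociations between confidence and performance, the two phenomena the abstract argues a first-order model cannot jointly capture.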