Sample records for unified framework based

  1. Phase noise suppression for coherent optical block transmission systems: a unified framework.

    PubMed

    Yang, Chuanchuan; Yang, Feng; Wang, Ziyu

    2011-08-29

    A unified framework for phase noise suppression is proposed in this paper; it can be applied to any coherent optical block transmission system, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM) and coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE). Based on adaptive modeling of phase noise, unified observation equations are constructed for the different coherent optical block transmission systems, leading to unified phase noise estimation and suppression. Numerical results demonstrate that the proposal is effective in mitigating laser phase noise.
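
    A worked equation helps make the unification concrete. The following is a minimal sketch of a common-phase-error observation model of the kind the abstract describes; the notation is illustrative and not taken from the paper.

    ```latex
    % Illustrative block-transmission observation model with laser phase noise
    % (notation assumed, not the paper's): r(n) received samples, x(n)
    % transmitted block, h(n) channel, phi(n) laser phase noise, w(n) noise.
    \[ r(n) = \bigl[ x(n) \ast h(n) \bigr] e^{j\phi(n)} + w(n) \]
    % After the receiver FFT, slowly varying phase noise acts mainly as a
    % common phase error \bar{\phi} over the block plus an ICI residual I_k:
    \[ R_k \approx e^{j\bar{\phi}} H_k X_k + I_k + W_k \]
    ```

    Both CO-OFDM and CO-SCFDE fit this per-block frequency-domain form; they differ only in whether X_k carries data subcarriers directly or the DFT of time-domain symbols, which is why a single estimator for the common phase error can serve both systems.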

  2. A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.

    PubMed

    Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao

    2017-06-16

    This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability to address the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
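
    As a toy illustration of how a single growth mechanism can span different degree regimes, the sketch below mixes preferential and uniform attachment; it is not the paper's Markov chain model, and all parameters are assumptions.

    ```python
    import random
    from collections import Counter

    def grow_network(n=20000, p=0.8, seed=1):
        """Grow a network one node at a time; each new node attaches one edge,
        with probability p to a node chosen proportionally to degree and
        otherwise to a node chosen uniformly at random."""
        rng = random.Random(seed)
        degree = [1, 1]   # start with a single edge between nodes 0 and 1
        stubs = [0, 1]    # one entry per edge endpoint -> degree-weighted sampling
        for _ in range(n - 2):
            if rng.random() < p:
                target = rng.choice(stubs)           # preferential attachment
            else:
                target = rng.randrange(len(degree))  # uniform attachment
            new = len(degree)
            degree.append(1)
            degree[target] += 1
            stubs.extend([target, new])
        return Counter(degree)

    hist = grow_network()
    print(sorted(hist.items())[:10])   # low-degree end of the distribution
    ```

    With p near 1 the tail is power-law-like; with p near 0 it decays exponentially, mirroring the trichotomy idea at a much cruder level.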

  3. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm.

    PubMed

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate whether using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data source personnel for implementing and managing the integration.

  4. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm

    PubMed Central

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Objective Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate whether using the LexEVS terminology server is a viable implementation method. Materials and methods We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data source personnel for implementing and managing the integration. PMID:23571850

  5. Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Resseguie, David R

    There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large-scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large-scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large-scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.

  6. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    Background When chemical health hazards have been identified, probabilistic dose–response assessment (“hazard characterization”) quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. Objectives We developed a unified framework for probabilistic dose–response assessment. Methods We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose–response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, “effect metrics” can be specified to define “toxicologically equivalent” sizes for this underlying individual response; and d) dose–response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose–response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Results Probabilistically derived exposure limits are based on estimating a “target human dose” (HDMI), which requires risk management–informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%–10% effect sizes. Conclusions Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063
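
    The HDMI computation lends itself to a short Monte Carlo sketch. Every distribution and parameter value below is an illustrative assumption in the spirit of the framework, not data or code from the paper.

    ```python
    import numpy as np

    # Monte Carlo sketch of a "target human dose" HD_M^I. All inputs are
    # assumed for illustration only.
    rng = np.random.default_rng(0)
    n = 100_000

    bmd_animal   = rng.lognormal(np.log(10.0), 0.3, n)  # animal dose giving an M% effect
    interspecies = rng.lognormal(np.log(4.0),  0.4, n)  # animal-to-human adjustment
    sigma_h      = rng.lognormal(np.log(0.6),  0.3, n)  # log-SD of human sensitivity
    z_I = 2.326                                         # normal quantile for I = 1% incidence

    # Dose at which only a fraction I of the (lognormally variable) population
    # still experiences an effect of magnitude >= M:
    hd_MI = bmd_animal / interspecies / np.exp(z_I * sigma_h)

    lo, med, hi = np.percentile(hd_MI, [5, 50, 95])
    print(f"HD_M^I median {med:.2f} mg/kg-day, 90% CI [{lo:.2f}, {hi:.2f}] "
          f"({hi / lo:.0f}-fold range)")
    ```

    The printed fold range plays an analogous role to the 40- to 60-fold confidence intervals quoted in the results.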

  7. A Unified Classification Framework for FP, DP and CP Data at X-Band in Southern China

    NASA Astrophysics Data System (ADS)

    Xie, Lei; Zhang, Hong; Li, Hongzhong; Wang, Chao

    2015-04-01

    The main objective of this paper is to introduce a unified framework for crop classification in Southern China using data in fully polarimetric (FP), dual-pol (DP), and compact polarimetric (CP) modes. TerraSAR-X data acquired over the Leizhou Peninsula, South China, are used in our experiments. The study site involves four main crops (rice, banana, sugarcane, and eucalyptus). By exploring the similarities between data in these three modes, a knowledge-based characteristic space is created and the unified framework is presented. The overall classification accuracies are about 95% for data in the FP and coherent HH/VV modes and about 91% in CP mode, which suggests that the proposed classification scheme is effective and promising. Compared with the Wishart maximum likelihood (ML) classifier, the proposed method exhibits higher classification accuracy.

  8. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  9. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  10. A general modeling framework for describing spatially structured population dynamics

    USGS Publications Warehouse

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles
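
    A minimal sketch of the framework's discrete-time loop may help fix ideas; all names and numbers below are illustrative assumptions (the paper provides its own general code).

    ```python
    import numpy as np

    # Nodes are patches; a weighted, directed (column-stochastic) matrix moves
    # individuals along edges; node and edge attributes could be made
    # time-varying to represent migration, nomadism, etc.
    def step(N, r, K, M):
        """One time step: local logistic growth, then movement along edges.
        M[i, j] is the fraction of node j's population that moves to node i."""
        grown = N + r * N * (1.0 - N / K)
        return M @ grown

    N = np.array([100.0, 20.0, 0.0])        # initial population per node
    r = np.array([0.2, 0.1, 0.3])           # growth rate per node
    K = np.array([500.0, 200.0, 300.0])     # carrying capacity per node
    M = np.array([[0.9, 0.05, 0.0],         # columns sum to 1:
                  [0.1, 0.90, 0.2],         # everyone ends up somewhere
                  [0.0, 0.05, 0.8]])

    for t in range(10):
        N = step(N, r, K, M)
    print(N.round(1))
    ```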

  11. Toward a unifying framework for evolutionary processes.

    PubMed

    Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora

    2015-10-21

    The theories of population genetics and evolutionary computation have been evolving separately for nearly 30 years. Many results have been independently obtained in both fields, and many others are unique to their respective field. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework we present here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify candidates with the most potential for the translation of results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  13. A Unified Framework for Monetary Theory and Policy Analysis.

    ERIC Educational Resources Information Center

    Lagos, Ricardo; Wright, Randall

    2005-01-01

    Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…

  14. A Unified Probabilistic Framework for Dose-Response Assessment of Human Health Effects.

    PubMed

    Chiu, Weihsueh A; Slob, Wout

    2015-12-01

    When chemical health hazards have been identified, probabilistic dose-response assessment ("hazard characterization") quantifies uncertainty and/or variability in toxicity as a function of human exposure. Existing probabilistic approaches differ for different types of endpoints or modes-of-action, lacking a unifying framework. We developed a unified framework for probabilistic dose-response assessment. We established a framework based on four principles: a) individual and population dose responses are distinct; b) dose-response relationships for all (including quantal) endpoints can be recast as relating to an underlying continuous measure of response at the individual level; c) for effects relevant to humans, "effect metrics" can be specified to define "toxicologically equivalent" sizes for this underlying individual response; and d) dose-response assessment requires making adjustments and accounting for uncertainty and variability. We then derived a step-by-step probabilistic approach for dose-response assessment of animal toxicology data similar to how nonprobabilistic reference doses are derived, illustrating the approach with example non-cancer and cancer datasets. Probabilistically derived exposure limits are based on estimating a "target human dose" (HDMI), which requires risk management-informed choices for the magnitude (M) of individual effect being protected against, the remaining incidence (I) of individuals with effects ≥ M in the population, and the percent confidence. In the example datasets, probabilistically derived 90% confidence intervals for HDMI values span a 40- to 60-fold range, where I = 1% of the population experiences ≥ M = 1%-10% effect sizes. Although some implementation challenges remain, this unified probabilistic framework can provide substantially more complete and transparent characterization of chemical hazards and support better-informed risk management decisions.

  15. RT-18: Value of Flexibility. Phase 1

    DTIC Science & Technology

    2010-09-25

    ...an analytical framework based on sound mathematical constructs. A review of the current state-of-the-art showed that there is little unifying theory... a framework that is mathematically consistent, domain independent, and applicable under varying information levels. This report presents our advances in...

  16. Framework Design of Unified Cross-Authentication Based on the Fourth Platform Integrated Payment

    NASA Astrophysics Data System (ADS)

    Yong, Xu; Yujin, He

    This essay advances a unified authentication scheme based on the fourth integrated payment platform. The research aims at improving the compatibility of authentication in electronic business and providing a reference for the establishment of a credit system by seeking a way to carry out standard unified authentication on an integrated payment platform. The essay introduces the concept of the fourth integrated payment platform and puts forward its overall structure and components. The main focus of the essay is the design of the credit system of the fourth integrated payment platform and the PKI/CA structure design.

  17. Students and Teacher Academic Evaluation Perceptions: Methodology to Construct a Representation Based on Actionable Knowledge Discovery Framework

    ERIC Educational Resources Information Center

    Molina, Otilia Alejandro; Ratté, Sylvie

    2017-01-01

    This research introduces a method to construct a unified representation of teachers and students perspectives based on the actionable knowledge discovery (AKD) and delivery framework. The representation is constructed using two models: one obtained from student evaluations and the other obtained from teachers' reflections about their teaching…

  18. Value of Flexibility - Phase 1

    DTIC Science & Technology

    2010-09-25

    ...weaknesses of each approach. During this period, we also explored the development of an analytical framework based on sound mathematical constructs... A review of the current state-of-the-art showed that there is little unifying theory or guidance on best approaches to... research activities is in developing a coherent value-based definition of flexibility that is based on an analytical framework that is mathematically...

  19. [Arabian food pyramid: unified framework for nutritional health messages].

    PubMed

    Shokr, Adel M

    2008-01-01

    There are several ways to present nutritional health messages, particularly pyramid indices, but they have many deficiencies, such as the lack of an agreed, clear methodology for food grouping and the neglect of inter-relations and integration among nutritional groups. This causes confusion for health educators and target individuals. This paper presents an Arabian food pyramid that aims to unify the bases of nutritional health messages, bringing together the function, contents, source and nutritional group servings and indicating the inter-relation and integration of nutritional groups. This provides comprehensive, integrated, simple and flexible health messages.

  20. A unified framework for building high performance DVEs

    NASA Astrophysics Data System (ADS)

    Lei, Kaibin; Ma, Zhixia; Xiong, Hua

    2011-10-01

    A unified framework for integrating PC cluster based parallel rendering with distributed virtual environments (DVEs) is presented in this paper. While various scene graphs have been proposed in DVEs, it is difficult to enable collaboration of different scene graphs. This paper proposes a technique for non-distributed scene graphs with the capability of object and event distribution. With the increase of graphics data, DVEs require more powerful rendering ability. But general scene graphs are inefficient in parallel rendering. The paper also proposes a technique to connect a DVE and a PC cluster based parallel rendering environment. A distributed multi-player video game is developed to show the interaction of different scene graphs and the parallel rendering performance on a large tiled display wall.

  21. A Unified Theoretical Framework for Cognitive Sequencing.

    PubMed

    Savalia, Tejas; Shukla, Anuj; Bapi, Raju S

    2016-01-01

    The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops, the basal ganglia-frontal cortex and hippocampus-frontal cortex loops, mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks.

  22. A Unified Theoretical Framework for Cognitive Sequencing

    PubMed Central

    Savalia, Tejas; Shukla, Anuj; Bapi, Raju S.

    2016-01-01

    The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking and hierarchical organization are important aspects in sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habits. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require engagement of attentional processes. With repetition, these goal-directed plans become habits with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops, the basal ganglia-frontal cortex and hippocampus-frontal cortex loops, mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks. PMID:27917146
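
    The contrast between the two learning paradigms the abstract invokes can be sketched in a few lines. The toy task, parameter values, and function names below are illustrative assumptions, not the authors' model.

    ```python
    # Toy contrast of model-free (habit-like cached values) vs. model-based
    # (planning over a learned model) control. All parameters are assumptions.
    alpha, gamma = 0.2, 0.9
    actions = ["left", "right"]

    Q = {(s, a): 0.0 for s in range(2) for a in actions}   # model-free value cache

    def mf_update(s, a, r, s2):
        """Habit-like temporal-difference update from raw experience."""
        target = r + gamma * max(Q[(s2, b)] for b in actions)
        Q[(s, a)] += alpha * (target - Q[(s, a)])

    model = {}   # model-based system: learned (state, action) -> (reward, next state)

    def mb_plan(s, depth=3):
        """Goal-directed evaluation by unrolling the learned model."""
        if depth == 0:
            return 0.0, None
        best = (0.0, None)
        for a in actions:
            if (s, a) in model:
                r, s2 = model[(s, a)]
                value = r + gamma * mb_plan(s2, depth - 1)[0]
                best = max(best, (value, a), key=lambda t: t[0])
        return best

    # One shared experience feeds both systems:
    s, a, r, s2 = 0, "right", 1.0, 1
    mf_update(s, a, r, s2)
    model[(s, a)] = (r, s2)
    print(Q[(0, "right")], mb_plan(0))   # 0.2 (slow cache) vs. (1.0, 'right') (plan)
    ```

    The cached values improve only with repetition, while the planner exploits a new model entry immediately; the framework's claim is that attention mediates the handoff between the two regimes.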

  23. A Unified Framework for Association Analysis with Multiple Related Phenotypes

    PubMed Central

    Stephens, Matthew

    2013-01-01

    We consider the problem of assessing associations between multiple related outcome variables and a single explanatory variable of interest. This problem arises in many settings, including genetic association studies, where the explanatory variable is genotype at a genetic variant. We outline a framework for conducting this type of analysis, based on Bayesian model comparison and model averaging for multivariate regressions. This framework unifies several common approaches to this problem, and includes both standard univariate and standard multivariate association tests as special cases. The framework also unifies the problems of testing for associations and explaining associations – that is, identifying which outcome variables are associated with genotype. This provides an alternative to the usual, but conceptually unsatisfying, approach of resorting to univariate tests when explaining and interpreting significant multivariate findings. The method is computationally tractable genome-wide for modest numbers of phenotypes (e.g. 5–10), and can be applied to summary data, without access to raw genotype and phenotype data. We illustrate the methods on simulated examples and on a genome-wide association study of blood lipid traits, where we identify 18 potential novel genetic associations that were not identified by univariate analyses of the same data. PMID:23861737
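
    The model-comparison idea can be caricatured with BIC weights over "which phenotypes are associated" models. This is only a sketch: the paper uses Bayesian multivariate regression and model averaging, not the BIC shortcut or the independence-across-phenotypes simplification used below.

    ```python
    from itertools import combinations
    import numpy as np

    # Enumerate models that specify WHICH phenotypes are associated with
    # genotype, score each, and weight them. BIC stands in for Bayes factors;
    # phenotypes are treated as independent. Illustration only.
    rng = np.random.default_rng(0)
    n, p = 500, 3
    g = rng.binomial(2, 0.3, n).astype(float)          # genotype (0/1/2 copies)
    Y = rng.standard_normal((n, p))                    # three phenotypes
    Y[:, 0] += 0.3 * (g - g.mean())                    # only phenotype 0 associated

    def pheno_bic(y, with_genotype):
        """BIC of one phenotype under the null or the genotype model."""
        X = np.column_stack([np.ones(n), g] if with_genotype else [np.ones(n)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(((y - X @ beta) ** 2).sum())
        return n * np.log(rss / n) + X.shape[1] * np.log(n)

    subsets = [s for size in range(p + 1) for s in combinations(range(p), size)]
    bics = np.array([sum(pheno_bic(Y[:, j], j in s) for j in range(p))
                     for s in subsets])
    weights = np.exp(-(bics - bics.min()) / 2)
    weights /= weights.sum()
    for s, w in zip(subsets, weights):
        print(s, round(float(w), 3))                   # (0,) should dominate
    ```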

  24. A unified framework for heat and mass transport at the atomic scale

    NASA Astrophysics Data System (ADS)

    Ponga, Mauricio; Sun, Dingyi

    2018-04-01

    We present a unified framework to simulate heat and mass transport in systems of particles. The proposed framework is based on kinematic mean field theory and uses a phenomenological master equation to compute effective transport rates between particles without the need to evaluate operators. We exploit this advantage and apply the model to simulate transport phenomena at the nanoscale. We demonstrate that, when calibrated to experimentally-measured transport coefficients, the model can accurately predict transient and steady state temperature and concentration profiles even in scenarios where the length of the device is comparable to the mean free path of the carriers. Through several example applications, we demonstrate the validity of our model for all classes of materials, including ones that, until now, would have been outside the domain of computational feasibility.
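
    In schematic form, a phenomenological master equation of the kind described reduces transport to pairwise exchange rates; the notation below is assumed, not the paper's.

    ```latex
    % Illustrative phenomenological master equation: particle i exchanges heat
    % and mass with neighbors N(i) at calibrated rates kappa_ij and nu_ij.
    \[ \frac{\mathrm{d}T_i}{\mathrm{d}t} = \sum_{j \in \mathcal{N}(i)} \kappa_{ij}\,(T_j - T_i),
       \qquad
       \frac{\mathrm{d}c_i}{\mathrm{d}t} = \sum_{j \in \mathcal{N}(i)} \nu_{ij}\,(c_j - c_i) \]
    ```

    Calibrating the rates against measured conductivity and diffusivity recovers Fourier- and Fick-like behavior in the continuum limit, while the particle-level form remains meaningful when the device length approaches the carrier mean free path.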

  25. A unified framework for image retrieval using keyword and visual features.

    PubMed

    Jing, Feng; Li, Mingjing; Zhang, Hong-Jiang; Zhang, Bo

    2005-07-01

    In this paper, a unified image retrieval framework based on both keyword annotations and visual features is proposed. In this framework, a set of statistical models are built based on visual features of a small set of manually labeled images to represent semantic concepts and used to propagate keywords to other unlabeled images. These models are updated periodically when more images implicitly labeled by users become available through relevance feedback. In this sense, the keyword models serve the function of accumulation and memorization of knowledge learned from user-provided relevance feedback. Furthermore, two sets of effective and efficient similarity measures and relevance feedback schemes are proposed for query by keyword scenario and query by image example scenario, respectively. Keyword models are combined with visual features in these schemes. In particular, a new, entropy-based active learning strategy is introduced to improve the efficiency of relevance feedback for query by keyword. Furthermore, a new algorithm is proposed to estimate the keyword features of the search concept for query by image example. It is shown to be more appropriate than two existing relevance feedback algorithms. Experimental results demonstrate the effectiveness of the proposed framework.
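
    The entropy-based active-learning step mentioned in the abstract amounts to asking for feedback where the model is least sure. A minimal sketch, with an assumed relevance-probability input:

    ```python
    import numpy as np

    # Pick the unlabeled images whose predicted relevance is most uncertain;
    # the probability model feeding p_relevant is a placeholder assumption.
    def pick_most_uncertain(p_relevant, k=5):
        """Binary entropy peaks at p = 0.5, i.e., maximal uncertainty."""
        p = np.clip(p_relevant, 1e-9, 1 - 1e-9)
        entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
        return np.argsort(entropy)[::-1][:k]

    p = np.random.default_rng(0).random(100)
    print(pick_most_uncertain(p))   # indices to present for relevance feedback
    ```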

  26. In quest of a systematic framework for unifying and defining nanoscience

    PubMed Central

    2009-01-01

    This article proposes a systematic framework for unifying and defining nanoscience based on historic first principles and step logic that led to a “central paradigm” (i.e., unifying framework) for traditional elemental/small-molecule chemistry. As such, a Nanomaterials classification roadmap is proposed, which divides all nanomatter into Category I: discrete, well-defined and Category II: statistical, undefined nanoparticles. We consider only Category I, well-defined nanoparticles which are >90% monodisperse as a function of Critical Nanoscale Design Parameters (CNDPs) defined according to: (a) size, (b) shape, (c) surface chemistry, (d) flexibility, and (e) elemental composition. Classified as either hard (H) (i.e., inorganic-based) or soft (S) (i.e., organic-based) categories, these nanoparticles were found to manifest pervasive atom mimicry features that included: (1) a dominance of zero-dimensional (0D) core–shell nanoarchitectures, (2) the ability to self-assemble or chemically bond as discrete, quantized nanounits, and (3) exhibited well-defined nanoscale valencies and stoichiometries reminiscent of atom-based elements. These discrete nanoparticle categories are referred to as hard or soft particle nanoelements. Many examples describing chemical bonding/assembly of these nanoelements have been reported in the literature. We refer to these hard:hard (H-n:H-n), soft:soft (S-n:S-n), or hard:soft (H-n:S-n) nanoelement combinations as nanocompounds. Due to their quantized features, many nanoelement and nanocompound categories are reported to exhibit well-defined nanoperiodic property patterns. These periodic property patterns are dependent on their quantized nanofeatures (CNDPs) and dramatically influence intrinsic physicochemical properties (i.e., melting points, reactivity/self-assembly, sterics, and nanoencapsulation), as well as important functional/performance properties (i.e., magnetic, photonic, electronic, and toxicologic properties). We propose this perspective as a modest first step toward more clearly defining synthetic nanochemistry as well as providing a systematic framework for unifying nanoscience. With further progress, one should anticipate the evolution of future nanoperiodic table(s) suitable for predicting important risk/benefit boundaries in the field of nanoscience. PMID:21170133

  27. A Unified Framework for Analyzing and Designing for Stationary Arterial Networks

    DOT National Transportation Integrated Search

    2017-05-17

    This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...

  28. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
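
    As a concrete member of the algorithm family described, the sketch below runs multiplicative updates under the Poisson (Kullback-Leibler) quasi-likelihood; these are the standard Lee-Seung updates, shown as an illustration rather than the authors' generalized algorithm.

    ```python
    import numpy as np

    def nmf_poisson(X, k, iters=200, seed=0):
        """Multiplicative-update NMF under the Poisson (KL) quasi-likelihood,
        one special case within the generalized-linear-model family."""
        rng = np.random.default_rng(seed)
        m, n = X.shape
        W = rng.random((m, k)) + 0.1
        H = rng.random((k, n)) + 0.1
        eps = 1e-12
        for _ in range(iters):
            W *= ((X / (W @ H + eps)) @ H.T) / (H.sum(axis=1) + eps)
            H *= (W.T @ (X / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
        return W, H

    X = np.random.default_rng(1).poisson(5.0, size=(30, 20)).astype(float)
    W, H = nmf_poisson(X, k=4)
    print("relative reconstruction error:",
          round(float(np.linalg.norm(X - W @ H) / np.linalg.norm(X)), 3))
    ```

    Swapping the update rules (equivalently, the link and variance functions) yields the other members of the family, e.g., Gaussian or signal-dependent-noise variants.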

  29. Groundwater modelling in decision support: reflections on a unified conceptual framework

    NASA Astrophysics Data System (ADS)

    Doherty, John; Simmons, Craig T.

    2013-11-01

    Groundwater models are commonly used as basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support with a direct focus on matters regarding model simplicity and complexity.

  30. Generic-distributed framework for cloud services marketplace based on unified ontology.

    PubMed

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  31. Unified Bayesian Estimator of EEG Reference at Infinity: rREST (Regularized Reference Electrode Standardization Technique).

    PubMed

    Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A

    2018-01-01

    The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Simulations with artificial EEGs with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. They also reveal that realistic volume conductor models improve the performances of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the real 89 resting-state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance.
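
    The unified view of referencing as a regularized linear inverse problem can be sketched numerically. The lead field, noise level, and ridge prior below are illustrative assumptions, not the paper's data or implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_sensors, n_sources = 32, 200
    L = rng.standard_normal((n_sensors, n_sources))   # assumed lead field

    x_true = rng.standard_normal(n_sources)           # source activity
    v_inf = L @ x_true                                # potentials vs. reference at infinity
    v_rec = (v_inf - v_inf.mean()) + 0.1 * rng.standard_normal(n_sensors)  # AR data + noise

    # MAP/ridge estimate of the sources from the average-referenced recording,
    # using the average-referenced lead field; lam plays the role of the
    # noise-to-signal variance ratio the paper selects via GCV.
    Lc = L - L.mean(axis=0)
    lam = 0.1
    x_hat = np.linalg.solve(Lc.T @ Lc + lam * np.eye(n_sources), Lc.T @ v_rec)

    v_hat = L @ x_hat                                 # re-referenced to infinity
    err = np.linalg.norm(v_hat - v_inf) / np.linalg.norm(v_inf)
    print(f"relative error of recovered infinity-referenced potentials: {err:.2f}")
    ```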

  32. A Systematic Framework and Nanoperiodic Concept for Unifying Nanoscience: Hard/Soft Nanoelements, Superatoms, Meta-Atoms, New Emerging Properties, Periodic Property Patterns, and Predictive Mendeleev-like Nanoperiodic Tables.

    PubMed

    Tomalia, Donald A; Khanna, Shiv N

    2016-02-24

    Development of a central paradigm is undoubtedly the single most influential force responsible for advancing Dalton's 19th century atomic/molecular chemistry concepts to the current maturity enjoyed by traditional chemistry. A similar central dogma for guiding and unifying nanoscience has been missing. This review traces the origins, evolution, and current status of such a critical nanoperiodic concept/framework for defining and unifying nanoscience. Based on parallel efforts and a mutual consensus now shared by both chemists and physicists, a nanoperiodic/systematic framework concept has emerged. This concept is based on the well-documented existence of discrete, nanoscale collections of traditional inorganic/organic atoms referred to as hard and soft superatoms (i.e., nanoelement categories). These nanometric entities are widely recognized to exhibit nanoscale atom mimicry features reminiscent of traditional picoscale atoms. All unique superatom/nanoelement physicochemical features are derived from quantized structural control defined by six critical nanoscale design parameters (CNDPs), namely, size, shape, surface chemistry, flexibility/rigidity, architecture, and elemental composition. These CNDPs determine all intrinsic superatom properties, their combining behavior to form stoichiometric nanocompounds/assemblies as well as to exhibit nanoperiodic properties leading to new nanoperiodic rules and predictive Mendeleev-like nanoperiodic tables, and they portend possible extension of these principles to larger quantized building blocks including meta-atoms.

  33. A new view of Baryon symmetric cosmology based on grand unified theories

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1981-01-01

    Within the framework of grand unified theories, it is shown how spontaneous CP violation leads to a domain structure in the universe with the domains evolving into separate regions of matter and antimatter excesses. Subsequent to exponential horizon growth, this can result in a universe of matter galaxies and antimatter galaxies. Various astrophysical data appear to favor this form of big bang cosmology. Future direct tests for cosmologically significant antimatter are discussed.

  34. Development and application of unified algorithms for problems in computational science

    NASA Technical Reports Server (NTRS)

    Shankar, Vijaya; Chakravarthy, Sukumar

    1987-01-01

    A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in computational fluid dynamics (CFD), a unified algorithm is one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their applications to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.

  35. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and their intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and the HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  36. The unified model of vegetarian identity: A conceptual framework for understanding plant-based food choices.

    PubMed

    Rosenfeld, Daniel L; Burrow, Anthony L

    2017-05-01

    By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  37. [Research on tumor information grid framework].

    PubMed

    Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing

    2013-10-01

    In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, making these heterogeneous resources consistent and interoperable in both semantics and syntax. This article describes the tumor grid framework, in which each type of service is described in the Web Services Description Language (WSDL) and XML Schema Definition (XSD), and clients use the serialized documents to operate the distributed resources. The service objects can be built with the Unified Modeling Language (UML) as middleware to create application programming interfaces. All of the grid resources are registered in the index and released in the form of Web Services based on the Web Services Resource Framework (WSRF). Using the system, we can build a multi-center, large-sample, networked tumor disease resource sharing framework to improve the research capability of medical institutions and patients' quality of life.

  38. Control of Distributed Parameter Systems

    DTIC Science & Technology

    1990-08-01

    ...variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a... Distribution unlimited... A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of...

  39. Unified framework for information integration based on information geometry

    PubMed Central

    Oizumi, Masafumi; Amari, Shun-ichi

    2016-01-01

    Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
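
    In schematic form, the measure the abstract describes is a projection onto a manifold of disconnected models (symbols follow the abstract's description; the exact constraint sets are specified in the paper):

    ```latex
    % Integrated information as an information-geometric projection:
    % p is the actual joint distribution of the system's past and present
    % states, M_D the manifold of models with causal influences disconnected.
    \[ \Phi = \min_{q \in \mathcal{M}_D}
       D_{\mathrm{KL}}\bigl[\, p(\mathbf{x}_t, \mathbf{x}_{t+\tau}) \,\|\, q(\mathbf{x}_t, \mathbf{x}_{t+\tau}) \,\bigr] \]
    ```

    Choosing different disconnection constraints recovers mutual information, transfer entropy, stochastic interaction, or integrated information, which is what makes the framework unifying.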

  40. A theoretical formulation of wave-vortex interactions

    NASA Technical Reports Server (NTRS)

    Wu, J. Z.; Wu, J. M.

    1989-01-01

    A unified theoretical formulation for wave-vortex interaction, designated the '(omega, Pi) framework,' is presented. Based on the orthogonal decomposition of fluid dynamic interactions, the formulation can be used to study a variety of problems, including the interaction of a longitudinal (acoustic) wave and/or transverse (vortical) wave with a main vortex flow. Moreover, the formulation permits a unified treatment of wave-vortex interaction at various approximate levels, where the normal 'piston' process and tangential 'rubbing' process can be approximated differently.

  41. Some characteristics of supernetworks based on unified hybrid network theory framework

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Fang, Jin-Qing; Li, Yong

    Compared with single complex networks, supernetworks are closer to the real world in some ways and have recently become a research hot spot in network science. Some progress has been made in supernetwork research, but the theoretical methods and complex-network characteristics of supernetwork models still need further exploration. In this paper, we propose three kinds of three-layer supernetwork models based on the unified hybrid network theory framework (UHNTF), introducing preferential and random linking, respectively, between the upper and lower layers. We then compare the topological characteristics of single networks with those of the supernetwork models. In order to analyze the influence of interlayer edges on network characteristics, the cross-degree is defined as a new important parameter. Some interesting new phenomena are found, and the results imply that this supernetwork model has reference value and application potential.
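
    Since the cross-degree is introduced as a new parameter, a tiny sketch of the intended bookkeeping may help; the data layout below is an assumption.

    ```python
    # Cross-degree of a node in a layered supernetwork: count only the edges
    # that cross to another layer. Toy two-layer example.
    edges = [
        ("a1", "b1"), ("a1", "a2"),   # a* in layer A, b* in layer B
        ("b1", "b2"), ("a2", "b2"),
    ]
    layer = {"a1": "A", "a2": "A", "b1": "B", "b2": "B"}

    def cross_degree(node):
        """Number of edges incident to `node` whose endpoints lie in different layers."""
        return sum(
            1 for u, v in edges
            if node in (u, v) and layer[u] != layer[v]
        )

    for n in sorted(layer):
        print(n, cross_degree(n))
    ```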

  42. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extendability due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  43. Fully Associative, Nonisothermal, Potential-Based Unified Viscoplastic Model for Titanium-Based Matrices

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature airframe and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials are assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.

  4. Evaluating Health Information Systems Using Ontologies

    PubMed Central

    Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-01-01

    Background: There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives: The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems (whether similar or heterogeneous) by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods: On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results: The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions: The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735

  5. Evaluating Health Information Systems Using Ontologies.

    PubMed

    Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan

    2016-06-16

    There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems (whether similar or heterogeneous) by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.

  6. Clinical data integration model. Core interoperability ontology for research using primary care data.

    PubMed

    Ethier, J-F; Curcin, V; Barton, A; McGilchrist, M M; Bastiaens, H; Andreasson, A; Rossiter, J; Zhao, L; Arvanitis, T N; Taweel, A; Delaney, B C; Burgun, A

    2015-01-01

    This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". Primary care data is the single richest source of routine health care data. However, its use, both in research and clinical work, often requires data from multiple clinical sites, clinical trials databases, and registries. Data integration and interoperability are therefore of utmost importance. TRANSFoRm's general approach relies on a unified interoperability framework, described in a previous paper. We developed a core ontology for an interoperability framework based on data mediation. This article presents how such an ontology, the Clinical Data Integration Model (CDIM), can be designed to support, in conjunction with appropriate terminologies, biomedical data federation within TRANSFoRm, an EU FP7 project that aims to develop the digital infrastructure for a learning healthcare system in European primary care. TRANSFoRm utilizes a unified structural/terminological interoperability framework, based on the local-as-view mediation paradigm. Such an approach mandates the global information model to describe the domain of interest independently of the data sources to be explored. Following a requirement analysis process, no ontology focusing on primary care research was identified, so we designed a realist ontology based on Basic Formal Ontology to support our framework in collaboration with various terminologies used in primary care. The resulting ontology has 549 classes and 82 object properties and is used to support data integration for TRANSFoRm's use cases. Concepts identified by researchers were successfully expressed in queries using CDIM and pertinent terminologies. As an example, we illustrate how, in TRANSFoRm, the Query Formulation Workbench can capture eligibility criteria in a computable representation, which is based on CDIM. A unified mediation approach to semantic interoperability provides a flexible and extensible framework for all types of interaction between health record systems and research systems. CDIM, as the core ontology of such an approach, enables simplicity and consistency of design across the heterogeneous software landscape and can support the specific needs of EHR-driven phenotyping research using primary care data.

  7. Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum

    ERIC Educational Resources Information Center

    Rubenstein, Lisa DaVia; Ridgley, Lisa M.

    2017-01-01

    A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…

  8. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple, and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. To use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program monitors, removes, adds, starts, and stops any machine and/or its tasks when necessary. For every machine, exactly one zookeeper program is used to start the different functions or tasks, via stompShell programs, needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple, and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used for configuration and for communication between machines and programs. The framework is platform independent, and although it is built in Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). It could also be extended to monitor the IDC pipeline. The detailed design, implementation, conclusions, and future work of the proposed framework will be presented.
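
    As a rough illustration of the messaging pattern described above, the Python sketch below submits a JSON-encoded job message to an ActiveMQ broker over STOMP using the stomp.py client library; the queue naming pattern, job fields, and broker address are hypothetical placeholders, not details taken from the abstract.

      import json
      import stomp  # stomp.py client library

      # Connect to a (hypothetical) local ActiveMQ broker on the default STOMP port.
      conn = stomp.Connection(host_and_ports=[("localhost", 61613)])
      conn.connect("admin", "admin", wait=True)

      # A simple JSON job message; the field names are illustrative only.
      job = {
          "machine": "node-01",
          "task": "process_waveforms",
          "args": ["station=ABC", "day=2014-05-01"],
      }

      # A unified naming pattern for queues/topics, e.g. /queue/<machine>.<task>.
      conn.send(destination="/queue/node-01.process_waveforms", body=json.dumps(job))
      conn.disconnect()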

  9. Countering the Pedagogy of Extremism: Reflective Narratives and Critiques of Problem-Based Learning

    ERIC Educational Resources Information Center

    Woo, Chris W. H.; Laxman, Kumar

    2013-01-01

    This paper is a critique against "purist" pedagogies found in the literature of student-centred learning. The article reproves extremism in education and questions the absolutism and teleological truths expounded in exclusive problem-based learning. The paper articulates the framework of a unifying pedagogical practice through Eve…

  10. Food-web based unified model of macro- and microevolution.

    PubMed

    Chowdhury, Debashish; Stauffer, Dietrich

    2003-10-01

    We incorporate the generic hierarchical architecture of food webs into a "unified" model that describes both micro- and macroevolution within a single theoretical framework. This model describes microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinction. The distribution of lifetimes of species follows an approximate power law only over a limited regime.

  11. Unified Bayesian Estimator of EEG Reference at Infinity: rREST (Regularized Reference Electrode Standardization Technique)

    PubMed Central

    Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A.

    2018-01-01

    The choice of reference for the electroencephalogram (EEG) is a long-standing unsolved issue resulting in inconsistent usage and endless debate. Currently, the average reference (AR) and the reference electrode standardization technique (REST) are the two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity and (b) determination of the reference as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop the regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated with this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Artificial EEGs generated with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. They also reveal that realistic volume conductor models improve the performance of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that the selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the “oracle” choice based on the ground truth. When evaluated with the real 89 resting-state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance. PMID:29780302
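
    To convey the flavor of such a regularized linear inverse problem (a generic sketch, not the authors' exact rREST estimator), the code below computes a ridge/MAP solution and selects the regularization parameter by generalized cross-validation; the forward matrix and data are random placeholders.

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(64, 32))                # placeholder forward (lead-field-like) matrix
      x_true = rng.normal(size=32)
      y = A @ x_true + 0.1 * rng.normal(size=64)   # noisy observations

      def ridge_map(A, y, lam):
          """MAP estimate under a Gaussian prior: argmin ||y - Ax||^2 + lam*||x||^2."""
          return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

      def gcv(A, y, lam):
          """Generalized cross-validation score for the ridge smoother."""
          m = A.shape[0]
          H = A @ np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T)  # hat matrix
          resid = y - H @ y
          return m * (resid @ resid) / (m - np.trace(H)) ** 2

      lams = np.logspace(-4, 2, 25)
      best = min(lams, key=lambda lam: gcv(A, y, lam))
      print("GCV-selected lambda:", best,
            "| estimate error:", np.linalg.norm(ridge_map(A, y, best) - x_true))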

  12. Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model

    ERIC Educational Resources Information Center

    Helie, Sebastien; Sun, Ron

    2010-01-01

    This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…

  13. Optimization-Based Robust Nonlinear Control

    DTIC Science & Technology

    2006-08-01

    New control algorithms were developed for robust stabilization of nonlinear dynamical systems. Novel, linear matrix inequality-based synthesis...was to further advance optimization-based robust nonlinear control design, for general nonlinear systems (especially in discrete time), for linear...Teel, IEEE Transactions on Control Systems Technology, vol. 14, no. 3, p. 398-407, May 2006. 3. "A unified framework for input-to-state stability in

  14. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  15. Complex networks as a unified framework for descriptive analysis and predictive modeling in climate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R

    The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Furthermore, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
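
    As a toy illustration of the climate-network idea (not the authors' exact procedure), the sketch below links grid points whose time series correlate strongly and reads off basic structural properties; the time series are synthetic placeholders.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(42)
      n_nodes, n_steps = 50, 200
      series = rng.normal(size=(n_nodes, n_steps))   # placeholder climate time series

      # Build a network: connect nodes whose series correlate above a threshold.
      corr = np.corrcoef(series)
      threshold = 0.2
      G = nx.Graph()
      G.add_nodes_from(range(n_nodes))
      for i in range(n_nodes):
          for j in range(i + 1, n_nodes):
              if abs(corr[i, j]) > threshold:
                  G.add_edge(i, j, weight=corr[i, j])

      # Structural properties that can be interpreted in the climate domain.
      print("edges:", G.number_of_edges())
      print("mean clustering:", nx.average_clustering(G))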

  16. Domain Anomaly Detection in Machine Perception: A System Architecture and Taxonomy.

    PubMed

    Kittler, Josef; Christmas, William; de Campos, Teófilo; Windridge, David; Yan, Fei; Illingworth, John; Osman, Magda

    2014-05-01

    We address the problem of anomaly detection in machine perception. The concept of domain anomaly is introduced as distinct from the conventional notion of anomaly used in the literature. We propose a unified framework for anomaly detection which exposes the multifaceted nature of anomalies and suggests effective mechanisms for identifying and distinguishing each facet as instruments for domain anomaly detection. The framework draws on the Bayesian probabilistic reasoning apparatus, which clearly defines concepts such as outlier, noise, distribution drift, novelty detection (object, object primitive), rare events, and unexpected events. Based on these concepts we provide a taxonomy of domain anomaly events. One of the mechanisms helping to pinpoint the nature of an anomaly is based on detecting incongruence between contextual and noncontextual sensor(y) data interpretation. The proposed methodology has wide applicability. It underpins in a unified way the anomaly detection applications found in the literature. To illustrate some of its distinguishing features, the domain anomaly detection methodology is here applied to the problem of anomaly detection for a video annotation system.
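
    As a minimal sketch of just one facet named above, the outlier, the code below flags observations with low predictive density under a Gaussian model fitted to reference data; it is an illustrative stand-in, not the paper's full Bayesian reasoning apparatus.

      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(1)
      reference = rng.normal(size=(500, 2))                       # placeholder "normal" sensor data
      test = np.vstack([rng.normal(size=(5, 2)), [[6.0, 6.0]]])   # last point is anomalous

      # Fit a Gaussian to the reference data and score the test points.
      mu, cov = reference.mean(axis=0), np.cov(reference, rowvar=False)
      model = multivariate_normal(mean=mu, cov=cov)

      # Flag points whose density falls below a small quantile of reference densities.
      cutoff = np.quantile(model.pdf(reference), 0.01)
      print("outlier flags:", model.pdf(test) < cutoff)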

  17. Another Initiative? Where Does it Fit? A Unifying Framework and an Integrated Infrastructure for Schools to Address Barriers to Learning and Promote Healthy Development

    ERIC Educational Resources Information Center

    Center for Mental Health in Schools at UCLA, 2005

    2005-01-01

    This report was developed to highlight the current state of affairs and illustrate the value of a unifying framework and integrated infrastructure for the many initiatives, projects, programs, and services schools pursue in addressing barriers to learning and promoting healthy development. Specifically, it highlights how initiatives can be…

  18. A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations

    DTIC Science & Technology

    2015-08-13

    Software-defined Networks (SDN). We developed a declarative platform for implementing SDN protocols using declarative...and debugging several SDN applications. Example-based SDN synthesis. The recent emergence of software-defined networks offers an opportunity to design...domain of Software-defined Networks (SDN).

  19. Action Recommendation for Cyber Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choudhury, Sutanay; Rodriguez, Luke R.; Curtis, Darren S.

    2015-09-01

    This paper presents a unifying graph-based model for representing the infrastructure, behavior, and missions of an enterprise. We describe how the model can be used to achieve resiliency against a wide class of failures and attacks. We introduce an algorithm for recommending resilience-establishing actions based on dynamic updates to the models. Without loss of generality, we show the effectiveness of the algorithm for preserving latency-based quality of service (QoS). Our models and the recommendation algorithms are implemented in a software framework that we seek to release as an open-source framework for simulating resilient cyber systems.
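
    As a hedged toy example of the latency-based QoS criterion (the recommendation algorithm itself is only summarized in the abstract), the sketch below represents an enterprise as a weighted graph, simulates a link failure, and checks whether a mission path still meets its latency bound; all names and numbers are illustrative.

      import networkx as nx

      # Toy enterprise network: nodes are hosts/services, weights are link latencies (ms).
      G = nx.Graph()
      G.add_weighted_edges_from([
          ("client", "fw", 2), ("fw", "app", 3), ("app", "db", 2),
          ("fw", "backup", 10), ("backup", "db", 10),
      ])

      def mission_latency(g):
          return nx.shortest_path_length(g, "client", "db", weight="weight")

      print("nominal latency:", mission_latency(G))   # 7 ms via fw -> app -> db

      # Simulate a failure and check whether the latency QoS bound still holds.
      G.remove_edge("app", "db")
      latency = mission_latency(G)
      print("post-failure latency:", latency, "| QoS ok:", latency <= 15)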

  20. The Unified Behavior Framework for the Simulation of Autonomous Agents

    DTIC Science & Technology

    2015-03-01

    1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. A recently developed...The development of autonomy has...room for research by utilizing methods like simulation and modeling that consume less time and fewer monetary resources. A recently developed reactive

  1. The Importance of Culture for Developmental Science

    ERIC Educational Resources Information Center

    Keller, Heidi

    2012-01-01

    In this essay, it is argued that a general understanding of human development needs a unified framework based on evolutionary theorizing and cross-cultural and cultural anthropological approaches. An eco-social model of development has been proposed that defines cultural milieus as adaptations to specific socio-demographic contexts. Ontogenetic…

  2. A Semantic Grid Oriented to E-Tourism

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao Ming

    With the increasing complexity of tourism business models and tasks, there is a clear need for a next-generation e-Tourism infrastructure to support flexible automation, integration, computation, storage, and collaboration. Several enabling technologies, such as the semantic Web, Web services, agents, and grid computing, have been applied in different e-Tourism applications, but there has been no unified framework able to integrate all of them. This paper therefore presents a promising e-Tourism framework based on the emerging semantic grid, in which a number of key design issues are discussed, including architecture, ontology structure, semantic reconciliation, service and resource discovery, role-based authorization, and intelligent agents. The paper finally describes the implementation of the framework.

  3. A Novel Switching-Based Control Framework for Improved Task Performance in Teleoperation System With Asymmetric Time-Varying Delays.

    PubMed

    Zhai, Di-Hua; Xia, Yuanqing

    2018-02-01

    This paper addresses adaptive control for task-space teleoperation systems with a constrained predefined synchronization error, where a novel switched control framework is investigated. Based on the multiple Lyapunov-Krasovskii functionals method, the stability of the resulting closed-loop system is established in the sense of state-independent input-to-output stability. Compared with previous work, the developed method can simultaneously handle unknown kinematics/dynamics, asymmetric time-varying delays, and prescribed performance control in a unified framework. It is shown that the developed controller can guarantee the prescribed transient-state and steady-state synchronization performance between the master and slave robots, which is demonstrated by a simulation study.

  4. Discrete shearlet transform: faithful digitization concept and its applications

    NASA Astrophysics Data System (ADS)

    Lim, Wang-Q.

    2011-09-01

    Over the past years, various representation systems which sparsely approximate functions governed by anisotropic features, such as edges in images, have been proposed. Alongside the theoretical development of these systems, algorithmic realizations of the associated transforms were provided. However, one of the most common shortcomings of these frameworks is the lack of a unified treatment of the continuum and digital worlds, i.e., allowing a digital theory to be a natural digitization of the continuum theory. Shearlets were introduced as a means to sparsely encode anisotropic singularities of multivariate data while providing a unified treatment of the continuous and digital realms. In this paper, we introduce a discrete framework which allows a faithful digitization of the continuum-domain shearlet transform based on compactly supported shearlets. Finally, we show numerical experiments demonstrating the potential of the discrete shearlet transform in several image processing applications.

  5. Community health workers in Brazil's Unified Health System: a framework of their praxis and contributions to patient health behaviors.

    PubMed

    Pinto, Rogério M; da Silva, Sueli Bulhões; Soriano, Rafaela

    2012-03-01

    Community health workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis: how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed community-based participatory research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008-10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies, i.e., empathic communication and perseverance, to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Community Health Workers in Brazil's Unified Health System: A Framework of their Praxis and Contributions to Patient Health Behaviors

    PubMed Central

    Pinto, Rogério M.; da Silva, Sueli Bulhões; Soriano, Rafaela

    2012-01-01

    Community Health Workers (CHWs) play a pivotal role in primary care, serving as liaisons between community members and medical providers. However, the growing reliance of health care systems worldwide on CHWs has outpaced research explaining their praxis – how they combine indigenous and technical knowledge, overcome challenges and impact patient outcomes. This paper thus articulates the CHW Praxis and Patient Health Behavior Framework. Such a framework is needed to advance research on CHW impact on patient outcomes and to advance CHW training. The project that originated this framework followed Community-Based Participatory Research principles. A team of U.S.-Brazil research partners, including CHWs, worked together from conceptualization of the study to dissemination of its findings. The framework is built on an integrated conceptual foundation including learning/teaching and individual behavior theories. The empirical base of the framework comprises in-depth interviews with 30 CHWs in Brazil's Unified Health System, Mesquita, Rio de Janeiro. Data collection for the project which originated this report occurred in 2008–10. Semi-structured questions examined how CHWs used their knowledge/skills; addressed personal and environmental challenges; and how they promoted patient health behaviors. This study advances an explanation of how CHWs use self-identified strategies – i.e., empathic communication and perseverance – to help patients engage in health behaviors. Grounded in our proposed framework, survey measures can be developed and used in predictive models testing the effects of CHW praxis on health behaviors. Training for CHWs can explicitly integrate indigenous and technical knowledge in order for CHWs to overcome contextual challenges and enhance service delivery. PMID:22305469

  7. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing

    PubMed Central

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2014-01-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345

  8. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    PubMed

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.
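
    For concreteness, a minimal numpy sketch of the multiplicative updates for KL-divergence (Poisson-likelihood) NMF is given below; note that it follows the well-known Lee-Seung updates rather than the paper's more general Renyi-divergence algorithm, and the data matrix is a random placeholder.

      import numpy as np

      rng = np.random.default_rng(7)
      V = rng.random((40, 60))   # placeholder nonnegative data matrix
      k = 5                      # number of latent components
      W = rng.random((40, k))
      H = rng.random((k, 60))

      eps = 1e-12
      for _ in range(200):
          # Multiplicative updates that monotonically decrease the KL divergence
          # D(V || WH); the Poisson likelihood underlies this objective.
          H *= (W.T @ (V / (W @ H + eps))) / (W.sum(axis=0)[:, None] + eps)
          W *= ((V / (W @ H + eps)) @ H.T) / (H.sum(axis=1)[None, :] + eps)

      print("final KL divergence:",
            np.sum(V * np.log((V + eps) / (W @ H + eps)) - V + W @ H))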

  9. UUI: Reusable Spatial Data Services in Unified User Interface at NASA GES DISC

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Hegde, Mahabaleshwa; Bryant, Keith; Pham, Long B.

    2016-01-01

    Unified User Interface (UUI) is a next-generation operational data access tool developed at the Goddard Earth Sciences Data and Information Services Center (GES DISC) to provide a simple, unified, and intuitive one-stop-shop experience for the key data services available at GES DISC, including subsetting (Simple Subset Wizard, SSW), granule file search (Mirador), plotting (Giovanni), and other legacy spatial data services. UUI has been built on a flexible infrastructure of reusable, self-contained web-service building blocks that can easily be plugged into spatial applications, including third-party clients or services, to enable new functionality as new datasets and services become available. In this presentation, we will discuss our experience in designing UUI services based on open industry standards. We will also explain how the resulting framework can be used for rapid development, deployment, and integration of spatial data services, facilitating efficient access to and dissemination of spatial data sets.

  10. A unified framework for unraveling the functional interaction structure of a biomolecular network based on stimulus-response experimental data.

    PubMed

    Cho, Kwang-Hyun; Choo, Sang-Mok; Wellstead, Peter; Wolkenhauer, Olaf

    2005-08-15

    We propose a unified framework for the identification of functional interaction structures of biomolecular networks in a way that leads to a new experimental design procedure. In developing our approach, we have built upon previous work. Thus we begin by pointing out some of the restrictions associated with existing structure identification methods and point out how these restrictions may be eased. In particular, existing methods use specific forms of experimental algebraic equations with which to identify the functional interaction structure of a biomolecular network. In our work, we employ an extended form of these experimental algebraic equations which, while retaining their merits, also overcome some of their disadvantages. Experimental data are required in order to estimate the coefficients of the experimental algebraic equation set associated with the structure identification task. However, experimentalists are rarely provided with guidance on which parameters to perturb, and to what extent, to perturb them. When a model of network dynamics is required then there is also the vexed question of sample rate and sample time selection to be resolved. Supplying some answers to these questions is the main motivation of this paper. The approach is based on stationary and/or temporal data obtained from parameter perturbations, and unifies the previous approaches of Kholodenko et al. (PNAS 99 (2002) 12841-12846) and Sontag et al. (Bioinformatics 20 (2004) 1877-1886). By way of demonstration, we apply our unified approach to a network model which cannot be properly identified by existing methods. Finally, we propose an experiment design methodology, which is not limited by the amount of parameter perturbations, and illustrate its use with an in numero example.
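
    For flavor, the sketch below implements a modular-response-analysis-style reconstruction in the spirit of Kholodenko et al. (2002), one of the two approaches the paper unifies: local response coefficients are recovered from a global steady-state response matrix. The network and "measurements" here are synthetic assumptions.

      import numpy as np

      rng = np.random.default_rng(3)

      # Ground-truth local response (interaction) matrix r, with -1 on the diagonal.
      r_true = np.array([
          [-1.0,  0.0,  0.0,  0.8],
          [ 0.6, -1.0,  0.0,  0.0],
          [ 0.0,  0.7, -1.0,  0.0],
          [ 0.0,  0.0, -0.5, -1.0],
      ])
      n = r_true.shape[0]

      # Synthetic global response matrix R to single-parameter perturbations;
      # with diagonal perturbation sensitivities B, this setup gives R = -r^{-1} B.
      B = np.diag(rng.uniform(0.5, 1.5, n))
      R = -np.linalg.inv(r_true) @ B

      # Reconstruction: r_ij = -(R^{-1})_ij / (R^{-1})_ii.
      Rinv = np.linalg.inv(R)
      r_est = -Rinv / np.diag(Rinv)[:, None]
      print("structure recovered:", np.allclose(r_est, r_true))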

  11. Trajectory optimization for lunar soft landing with complex constraints

    NASA Astrophysics Data System (ADS)

    Chu, Huiping; Ma, Lin; Wang, Kexin; Shao, Zhijiang; Song, Zhengyu

    2017-11-01

    A unified trajectory optimization framework with initialization strategies is proposed in this paper for lunar soft landing for various missions with specific requirements. Two main missions of interest are Apollo-like Landing from low lunar orbit and Vertical Takeoff Vertical Landing (a promising mobility method) on the lunar surface. The trajectory optimization is characterized by difficulties arising from discontinuous thrust, multi-phase connections, jump of attitude angle, and obstacles avoidance. Here R-function is applied to deal with the discontinuities of thrust, checkpoint constraints are introduced to connect multiple landing phases, attitude angular rate is designed to get rid of radical changes, and safeguards are imposed to avoid collision with obstacles. The resulting dynamic problems are generally with complex constraints. The unified framework based on Gauss Pseudospectral Method (GPM) and Nonlinear Programming (NLP) solver are designed to solve the problems efficiently. Advanced initialization strategies are developed to enhance both the convergence and computation efficiency. Numerical results demonstrate the adaptability of the framework for various landing missions, and the performance of successful solution of difficult dynamic problems.

  12. A Unified Framework for Activity Recognition-Based Behavior Analysis and Action Prediction in Smart Homes

    PubMed Central

    Fatima, Iram; Fahim, Muhammad; Lee, Young-Koo; Lee, Sungyoung

    2013-01-01

    In recent years, activity recognition in smart homes has been an active research area due to its applicability in many applications, such as assistive living and healthcare. Besides activity recognition, the information collected from smart homes has great potential for other application domains like lifestyle analysis, security and surveillance, and interaction monitoring. Therefore, discovery of users' common behaviors and prediction of future actions from past behaviors become an important step towards allowing an environment to provide personalized services. In this paper, we develop a unified framework for activity recognition-based behavior analysis and action prediction. For this purpose, we first propose a kernel fusion method for accurate activity recognition and then identify the significant sequential behaviors of inhabitants from the recognized activities of their daily routines. Moreover, behavioral patterns are further utilized to predict future actions from past activities. To evaluate the proposed framework, we performed experiments on two real datasets. The results show a remarkable improvement of 13.82% in the average accuracy of recognized activities, along with the extraction of significant behavioral patterns and precise activity predictions with a 6.76% increase in F-measure. All of this collectively helps in understanding users' actions to gain knowledge about their habits and preferences. PMID:23435057
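
    The kernel fusion step can be pictured as combining several precomputed kernels into one Gram matrix before classification; the Python sketch below, with fixed illustrative weights and synthetic features, is an assumption-laden stand-in for the paper's actual method.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

      rng = np.random.default_rng(5)
      X = rng.normal(size=(100, 8))               # placeholder sensor features
      y = (X[:, 0] + X[:, 1] > 0).astype(int)    # placeholder activity labels

      # Fuse two base kernels with illustrative weights; real methods may learn these.
      def fused_kernel(A, B):
          return 0.6 * rbf_kernel(A, B, gamma=0.5) + 0.4 * linear_kernel(A, B)

      train, test = slice(0, 80), slice(80, 100)
      clf = SVC(kernel="precomputed").fit(fused_kernel(X[train], X[train]), y[train])
      pred = clf.predict(fused_kernel(X[test], X[train]))
      print("accuracy:", (pred == y[test]).mean())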

  13. Identifying group discriminative and age regressive sub-networks from DTI-based connectivity via a unified framework of non-negative matrix factorization and graph embedding

    PubMed Central

    Ghanbari, Yasser; Smith, Alex R.; Schultz, Robert T.; Verma, Ragini

    2014-01-01

    Diffusion tensor imaging (DTI) offers rich insights into the physical characteristics of white matter (WM) fiber tracts and their development in the brain, facilitating a network representation of the brain's traffic pathways. Such a network representation of brain connectivity has provided a novel means of investigating brain changes arising from pathology, development or aging. The high dimensionality of these connectivity networks necessitates the development of methods that identify the connectivity building blocks or sub-network components that characterize the underlying variation in the population. In addition, the projection of the subject networks into the basis set provides a low-dimensional representation that teases apart different sources of variation in the sample, facilitating variation-specific statistical analysis. We propose a unified framework of non-negative matrix factorization and graph embedding for learning sub-network patterns of connectivity by their projective non-negative decomposition into a reconstructive basis set, as well as additional basis sets representing variational sources in the population, like age and pathology. The proposed framework is applied to a study of diffusion-based connectivity in subjects with autism and shows localized sparse sub-networks which mostly capture the changes related to pathology and developmental variations. PMID:25037933

  14. Next Generation Community Based Unified Global Modeling System Development and Operational Implementation Strategies at NCEP

    NASA Astrophysics Data System (ADS)

    Tallapragada, V.

    2017-12-01

    NOAA's Next Generation Global Prediction System (NGGPS) has provided a unique opportunity to develop and implement a non-hydrostatic global model based on the Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) dynamic core at the National Centers for Environmental Prediction (NCEP), making a leap-step advancement in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized, with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on the Earth System Modeling Framework (ESMF). More sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual unification of global and regional models will enable operational global models operating at convection-resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on the creation of a Common Community Physics Package (CCPP) and the Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established, with specific roles and responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resource allocation, and prioritization. This talk presents the current and future plans for unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications, with special emphasis on the implementation of the NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.

  15. High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.

    PubMed

    Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul

    2014-08-06

    An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.

  16. High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems

    PubMed Central

    Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul

    2014-01-01

    An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250

  17. A unified and efficient framework for court-net sports video analysis using 3D camera modeling

    NASA Astrophysics Data System (ADS)

    Han, Jungong; de With, Peter H. N.

    2007-01-01

    The extensive amount of video data stored on available media (hard and optical disks) necessitates video content analysis, which is a cornerstone for different user-friendly applications, such as smart video retrieval and intelligent video summarization. This paper aims at finding a unified and efficient framework for court-net sports video analysis. We concentrate on techniques that are generally applicable to more than one sports type to come to a unified approach. To this end, our framework employs the concept of multi-level analysis, where a novel 3-D camera modeling is utilized to bridge the gap between the object-level and the scene-level analysis. The new 3-D camera modeling is based on collecting feature points from two planes that are perpendicular to each other, so that a true 3-D reference is obtained. Another important contribution is a new tracking algorithm for the objects (i.e. players), which can track up to four players simultaneously. The complete system contributes to summarization by various forms of information, of which the most important are the moving trajectory and real speed of each player, as well as 3-D height information of objects and the semantic event segments in a game. We illustrate the performance of the proposed system by evaluating it on a variety of court-net sports videos containing badminton, tennis and volleyball, and we show that the feature detection performance is above 92% and event detection about 90%.

  18. The Reliability of Setting Grade Boundaries Using Comparative Judgement

    ERIC Educational Resources Information Center

    Benton, Tom; Elliott, Gill

    2016-01-01

    In recent years the use of expert judgement to set and maintain examination standards has been increasingly criticised in favour of approaches based on statistical modelling. This paper reviews existing research on this controversy and attempts to unify the evidence within a framework where expertise is utilised in the form of comparative…

  19. Hilltop supernatural inflation and SUSY unified models

    NASA Astrophysics Data System (ADS)

    Kohri, Kazunori; Lim, C. S.; Lin, Chia-Min; Mimura, Yukihiro

    2014-01-01

    In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including a right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is ns = 0.96, which fits very well with recent data from the Planck satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.

  20. RANZCR Body Systems Framework of diagnostic imaging examination descriptors.

    PubMed

    Pitman, Alexander G; Penlington, Lisa; Doromal, Darren; Slater, Gregory; Vukolova, Natalia

    2014-08-01

    A unified and logical system of descriptors for diagnostic imaging examinations and procedures is a desirable resource for radiology in Australia and New Zealand and is needed to support core activities of RANZCR. Existing descriptor systems available in Australia and New Zealand (including the Medicare DIST and the ACC Schedule) have significant limitations and are inappropriate for broader clinical application. An anatomically based grid was constructed, with anatomical structures arranged in rows and diagnostic imaging modalities arranged in columns (including nuclear medicine and positron emission tomography). The grid was segregated into five body systems. The cells at the intersection of an anatomical structure row and an imaging modality column were populated with short, formulaic descriptors of the applicable diagnostic imaging examinations. Clinically illogical or physically impossible combinations were 'greyed out'. Where the same examination applied to different anatomical structures, the descriptor was kept identical for the purposes of streamlining. The resulting Body Systems Framework of diagnostic imaging examination descriptors lists all the reasonably common diagnostic imaging examinations currently performed in Australia and New Zealand using a unified grid structure allowing navigation by both referrers and radiologists. The Framework has been placed on the RANZCR website and is available for access free of charge by registered users. The Body Systems Framework of diagnostic imaging examination descriptors is a system of descriptors based on relationships between anatomical structures and imaging modalities. The Framework is now available as a resource and reference point for the radiology profession and to support core College activities. © 2014 The Royal Australian and New Zealand College of Radiologists.

  1. Vegetarian on purpose: Understanding the motivations of plant-based dieters.

    PubMed

    Rosenfeld, Daniel L; Burrow, Anthony L

    2017-09-01

    Much recent research has explored vegetarians' dietary motivations, recurrently highlighting the significant influence they exert on how people view themselves and others. For vegetarians and other plant-based dieters, dietary motivations have been theorized to be a central aspect of identity. Yet not all plant-based dieters are motivated to follow their diets; rather, some face aversions and constraints. In this paper, we propose that motivations, aversions, and constraints constitute three distinct reasons for consuming a plant-based diet. After conceptually distinguishing motivations from aversions and constraints, we critically evaluate the advantages and disadvantages of two conceptual frameworks that exist for studying these motivations systematically: the ethical-health framework and the Unified Model of Vegetarian Identity (UMVI) motivational orientations framework. Importantly, these frameworks serve different purposes, and their suitability often depends on the research question at hand. Particularly given an increasing prevalence of plant-based dieting, cultivating a more holistic understanding of these two frameworks is necessary for advancing this discipline. Directions for future research are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. War-gaming application for future space systems acquisition

    NASA Astrophysics Data System (ADS)

    Nguyen, Tien M.; Guillen, Andy T.

    2016-05-01

    Recently, the U.S. Department of Defense (DOD) released the Defense Innovation Initiative (DII) [1] to focus DOD on five key aspects: Aspect #1, recruit talented and innovative people; Aspect #2, reinvigorate war-gaming; Aspect #3, initiate long-range research and development programs; Aspect #4, make DOD practices more innovative; and Aspect #5, advance technology and new operational concepts. Per DII instruction, this paper concentrates on Aspect #2 and Aspect #4 by reinvigorating the war-gaming effort with a focus on an innovative approach for developing the optimum Program and Technical Baselines (PTBs) and their corresponding optimum acquisition strategies for acquiring future space systems. The paper describes a unified approach for applying the war-gaming concept to future DOD acquisition of space systems. The proposed approach includes a Unified Game-based Acquisition Framework (UGAF) and an Advanced Game-Based Mathematical Framework (AGMF) using Bayesian war-gaming engines to optimize PTB solutions and select the corresponding optimum acquisition strategies for acquiring a space system. The framework defines the action space for all players with a complete description of the elements associated with the games, including the Department of Defense Acquisition Authority (DAA), stakeholders, warfighters, and potential contractors; War-Gaming Engines (WGEs) played by the DAA; WGEs played by the Contractor (KTR); and the players' Payoff and Cost Functions (PCFs). The AGMF presented here addresses both complete and incomplete information cases. The proposed framework provides a recipe for the DAA and the USAF Space and Missile Systems Center (SMC) to acquire future space systems optimally.

  3. A Unified Methodology for Computing Accurate Quaternion Color Moments and Moment Invariants.

    PubMed

    Karakasis, Evangelos G; Papakostas, George A; Koulouriotis, Dimitrios E; Tourassis, Vassilios D

    2014-02-01

    In this paper, a general framework for computing accurate quaternion color moments and their corresponding invariants is proposed. The proposed unified scheme arose from studying the characteristics of different orthogonal polynomials. These polynomials are used as kernels in order to form moments, the invariants of which can easily be derived. The resulting scheme permits the usage of any polynomial-like kernel in a unified and consistent way. The resulting moments and moment invariants demonstrate robustness to noisy conditions and high discriminative power. Additionally, in the case of continuous moments, accurate computations take place to avoid approximation errors. Based on this general methodology, the quaternion Tchebichef, Krawtchouk, Dual Hahn, Legendre, orthogonal Fourier-Mellin, pseudo-Zernike and Zernike color moments, and their corresponding invariants are introduced. A selected paradigm presents the reconstruction capability of each moment family, whereas proper classification scenarios evaluate the performance of the color moment invariants.

  4. Towards a unified model of passive drug permeation I: origins of the unstirred water layer with applications to ionic permeation.

    PubMed

    Ghosh, Avijit; Scott, Dennis O; Maurer, Tristan S

    2014-02-14

    In this work, we provide a unified theoretical framework describing how drug molecules can permeate across membranes in neutral and ionized forms in unstirred in vitro systems. The analysis provides a self-consistent basis for the origin of the unstirred water layer (UWL) within the Nernst-Planck framework in the fully unstirred limit, and further provides an accounting mechanism based simply on the bulk aqueous solvent diffusion constant of the drug molecule. Our framework makes no new assumptions about the underlying physics of molecular permeation: we hold simply that Nernst-Planck is a reasonable approximation at low concentrations and that all physical systems must conserve mass. The applicability of the derived framework is examined with respect to the effects of both stirring and externally applied voltages on measured permeability. The analysis draws on literature data for nine compounds spanning a range of permeabilities and aqueous diffusion coefficients. Applicability to ionized permeation is examined using literature data for the permanently charged cation crystal violet, suggesting that the underlying mechanism of ionized drug permeation for this molecule is mobile counter-current flow. Copyright © 2013 Elsevier B.V. All rights reserved.
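
    A common way to see the UWL as an accounting mechanism is the textbook series-resistance relation, in which the apparent permeability combines the membrane and UWL contributions as 1/P_app = 1/P_m + h_uwl/D_aq. The numbers below are assumed, illustrative values, not the paper's data:

    ```python
    # Series-resistance view of the UWL (textbook relation; values assumed).
    D_aq = 7.0e-6    # bulk aqueous diffusion coefficient, cm^2/s
    h_uwl = 0.05     # unstirred water layer thickness, cm (unstirred limit)
    P_m = 1.0e-4     # intrinsic membrane permeability, cm/s

    P_uwl = D_aq / h_uwl                       # UWL permeability, cm/s
    P_app = 1.0 / (1.0 / P_m + 1.0 / P_uwl)    # apparent (measured) permeability
    print(f"P_uwl = {P_uwl:.2e} cm/s, P_app = {P_app:.2e} cm/s")
    ```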

  5. A Unified Framework for Street-View Panorama Stitching

    PubMed Central

    Li, Li; Yao, Jian; Xie, Renping; Xia, Menghan; Zhang, Wei

    2016-01-01

    In this paper, we propose a unified framework to generate a pleasant and high-quality street-view panorama by stitching multiple panoramic images captured by cameras mounted on a mobile platform. Our proposed framework comprises four major steps: image warping, color correction, optimal seam line detection and image blending. Since the input images are captured without a precisely common projection center, from scenes whose depths relative to the cameras vary to different extents, such images cannot be precisely aligned in geometry. Therefore, an efficient image warping method based on the dense optical flow field is first proposed to greatly suppress the influence of large geometric misalignments. Then, to lessen the influence of photometric inconsistencies caused by illumination variations and different exposure settings, we propose an efficient color correction algorithm that matches extreme points of histograms to greatly decrease color differences between warped images. After that, the optimal seam lines between adjacent input images are detected via the graph cut energy minimization framework. At last, the Laplacian pyramid blending algorithm is applied to further eliminate the stitching artifacts along the optimal seam lines. Experimental results on a large set of challenging street-view panoramic images captured from the real world illustrate that the proposed system is capable of creating high-quality panoramas. PMID:28025481
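
    The final blending step is standard enough to sketch. The fragment below is an illustrative Laplacian pyramid blend of two aligned grayscale images across a soft seam mask using OpenCV; it stands in for the last stage of the pipeline only, with synthetic flat images in place of warped, color-corrected inputs.

    ```python
    import cv2
    import numpy as np

    def laplacian_blend(img1, img2, mask, levels=4):
        """Blend two aligned grayscale float images across a soft mask."""
        gp1, gp2, gpm = [img1], [img2], [mask]
        for _ in range(levels):                      # Gaussian pyramids
            gp1.append(cv2.pyrDown(gp1[-1]))
            gp2.append(cv2.pyrDown(gp2[-1]))
            gpm.append(cv2.pyrDown(gpm[-1]))
        out = gpm[-1] * gp1[-1] + (1 - gpm[-1]) * gp2[-1]
        for i in range(levels - 1, -1, -1):          # blend Laplacian levels
            size = (gp1[i].shape[1], gp1[i].shape[0])
            lap1 = gp1[i] - cv2.pyrUp(gp1[i + 1], dstsize=size)
            lap2 = gp2[i] - cv2.pyrUp(gp2[i + 1], dstsize=size)
            out = cv2.pyrUp(out, dstsize=size) + gpm[i] * lap1 + (1 - gpm[i]) * lap2
        return out

    a = np.full((256, 256), 0.2, np.float32)         # stand-ins for warped images
    b = np.full((256, 256), 0.8, np.float32)
    m = np.zeros((256, 256), np.float32); m[:, :128] = 1.0   # seam down the middle
    pano = laplacian_blend(a, b, m)
    print(pano.shape, float(pano.min()), float(pano.max()))
    ```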

  6. An Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.

    PubMed

    Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C

    2016-01-01

    Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model and also produces a multiscale representation that allows all skeleton types to be progressively simplified, or regularized. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, next to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.

  7. A unified account of perceptual layering and surface appearance in terms of gamut relativity.

    PubMed

    Vladusich, Tony; McDonnell, Mark D

    2014-01-01

    When we look at the world--or a graphical depiction of the world--we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance--based on a broader theoretical framework called gamut relativity--that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications.

  8. A Unified Account of Perceptual Layering and Surface Appearance in Terms of Gamut Relativity

    PubMed Central

    Vladusich, Tony; McDonnell, Mark D.

    2014-01-01

    When we look at the world—or a graphical depiction of the world—we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance—based on a broader theoretical framework called gamut relativity—that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications. PMID:25402466

  9. A unified perspective on robot control - The energy Lyapunov function approach

    NASA Technical Reports Server (NTRS)

    Wen, John T.

    1990-01-01

    A unified framework for the stability analysis of robot tracking control is presented. By using an energy-motivated Lyapunov function candidate, closed-loop stability is shown for a large family of control laws sharing a common structure of proportional and derivative feedback and a model-based feedforward. The feedforward can be zero, partial or complete linearized dynamics, partial or complete nonlinear dynamics, or linearized or nonlinear dynamics with parameter adaptation. As a result, the dichotomous approaches to the robot control problem based on open-loop linearization and nonlinear Lyapunov analysis are both included in this treatment. Furthermore, quantitative estimates of the trade-offs between different schemes in terms of tracking performance, steady-state error, domain of convergence, real-time computation load and required a priori model information are derived.
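
    A minimal instance of the control-law family analyzed above is PD feedback plus a model-based gravity feedforward on a single rigid link. The sketch below uses assumed link parameters and gains and a simple explicit-Euler simulation; it illustrates the structure tau = Kp*e + Kd*e_dot + feedforward rather than reproducing the paper's Lyapunov analysis.

    ```python
    import numpy as np

    # One rigid link under gravity; all parameters and gains are assumed values.
    m, l, I, g = 1.0, 0.5, 1.0 / 3.0, 9.81   # mass, length, inertia, gravity
    Kp, Kd = 100.0, 20.0                     # PD gains
    qd = np.pi / 4                           # constant desired angle

    q, q_dot, dt = 0.0, 0.0, 1e-3
    for _ in range(5000):                    # 5 s of explicit-Euler simulation
        # PD feedback plus model-based gravity feedforward at the setpoint.
        tau = Kp * (qd - q) + Kd * (0.0 - q_dot) + m * g * (l / 2) * np.cos(qd)
        q_ddot = (tau - m * g * (l / 2) * np.cos(q)) / I
        q_dot += dt * q_ddot
        q += dt * q_dot
    print(f"final angle {q:.4f} rad vs target {qd:.4f} rad")
    ```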

  10. A Unified Nonlinear Adaptive Approach for Detection and Isolation of Engine Faults

    NASA Technical Reports Server (NTRS)

    Tang, Liang; DeCastro, Jonathan A.; Zhang, Xiaodong; Farfan-Ramos, Luis; Simon, Donald L.

    2010-01-01

    A challenging problem in aircraft engine health management (EHM) system development is to detect and isolate faults in system components (e.g., compressor, turbine), actuators, and sensors. Existing nonlinear EHM methods often deal with component faults, actuator faults, and sensor faults separately, which may potentially lead to incorrect diagnostic decisions and unnecessary maintenance. Therefore, it would be ideal to address sensor faults, actuator faults, and component faults under one unified framework. This paper presents a systematic and unified nonlinear adaptive framework for detecting and isolating sensor faults, actuator faults, and component faults for aircraft engines. The fault detection and isolation (FDI) architecture consists of a parallel bank of nonlinear adaptive estimators. Adaptive thresholds are appropriately designed such that, in the presence of a particular fault, all components of the residual generated by the adaptive estimator corresponding to the actual fault type remain below their thresholds. If the faults are sufficiently different, then at least one component of the residual generated by each remaining adaptive estimator should exceed its threshold. Therefore, based on the specific response of the residuals, sensor faults, actuator faults, and component faults can be isolated. The effectiveness of the approach was evaluated using the NASA C-MAPSS turbofan engine model, and simulation results are presented.

  11. Gender Divide and Acceptance of Collaborative Web 2.0 Applications for Learning in Higher Education

    ERIC Educational Resources Information Center

    Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo

    2013-01-01

    Situated in the gender digital divide framework, this survey study investigated the role of computer anxiety in influencing female college students' perceptions toward Web 2.0 applications for learning. Based on 432 college students' "Web 2.0 for learning" perception ratings collected by relevant categories of "Unified Theory of Acceptance and Use…

  12. A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research

    ERIC Educational Resources Information Center

    Rohlfing, Ingo; Schneider, Carsten Q.

    2018-01-01

    The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elective affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…

  13. High Maneuverability Airframe: Investigation of Fin and Canard Sizing for Optimum Maneuverability

    DTIC Science & Technology

    2014-09-01

    overset grids (unified-grid); 5) total variation diminishing discretization based on a new multidimensional interpolation framework; and 6) Riemann solvers. The methodology section describes the simulations, which used the double-precision solver of a commercially available code, CFD++ v12.1.1.

  14. A Subject-Independent Method for Automatically Grading Electromyographic Features During a Fatiguing Contraction

    PubMed Central

    Jesunathadas, Mark; Poston, Brach; Santello, Marco; Ye, Jieping; Panchanathan, Sethuraman

    2014-01-01

    Many studies have attempted to monitor fatigue from electromyogram (EMG) signals. However, fatigue affects EMG in a subject-specific manner. We present here a subject-independent framework, based on principal component analysis and factor analysis, for monitoring the changes in EMG features that accompany muscle fatigue. The proposed framework is based on several time- and frequency-domain features, unlike most existing work, which is based on two to three features. Results show that latent factors obtained from factor analysis on these features provide a robust and unified framework. This framework learns a model from the EMG signals of multiple subjects, who form a reference group, and monitors the changes in EMG features during a sustained submaximal contraction of a test subject on a scale from zero to one. The framework was tested on EMG signals collected from 12 muscles of eight healthy subjects. The distribution of factor scores of the test subject, when mapped onto the framework, was similar for both the subject-specific and subject-independent cases. PMID:22498666
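
    The factor-analysis step can be sketched on synthetic data. Below, four toy EMG features drift with a hidden fatigue variable (amplitude features rise while spectral features fall, a pattern commonly reported for fatiguing contractions), and a one-factor model recovers a zero-to-one fatigue score; the feature construction and scaling are assumptions for illustration, not the paper's protocol.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    # Synthetic features drifting with a hidden fatigue variable (illustrative).
    rng = np.random.default_rng(1)
    fatigue = np.linspace(0, 1, 300) + 0.05 * rng.standard_normal(300)
    features = np.column_stack([
        1.0 + 0.8 * fatigue,   # RMS amplitude rises with fatigue
        1.0 + 0.6 * fatigue,   # mean absolute value rises
        2.0 - 0.9 * fatigue,   # median frequency falls
        2.0 - 0.7 * fatigue,   # mean frequency falls
    ]) + 0.1 * rng.standard_normal((300, 4))

    X = StandardScaler().fit_transform(features)   # reference-group scaling
    fa = FactorAnalysis(n_components=1).fit(X)
    score = fa.transform(X).ravel()                # latent "fatigue" factor
    score = (score - score.min()) / (score.max() - score.min())  # map to [0, 1]
    # The sign of a factor is arbitrary, so compare magnitudes only.
    print(abs(np.corrcoef(score, fatigue)[0, 1]))
    ```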

  15. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based accelerator programming models such as OpenACC have arisen as an alternative solution for programming emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity of SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in directive-based accelerator programming. The paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, it demonstrates the efficacy of OpenARC as a research framework for directive-based programming study by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while serving as a high-level research framework.

  16. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model.

    PubMed

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework titled the Health Services Workplace Environmental Resilience Model (HSWERM) is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously-published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse, and then impacting on personal resilience and workplace outcomes, and its use has the potential to increase staff retention and quality of patient care.

  17. The thermodynamics of dense granular flow and jamming

    NASA Astrophysics Data System (ADS)

    Lu, Shih Yu

    The scope of the thesis is to propose, based on experimental evidence and theoretical validation, a quantifiable connection between systems that exhibit the jamming phenomenon. When jammed, some materials that flow are able to resist deformation so that they appear solid-like on the laboratory scale. But unlike ordinary fusion, which has a critically defined criterion in pressure and temperature, jamming occurs under a wide range of conditions. These conditions have been rigorously investigated, but at the moment no self-consistent framework applies to grains, foam and colloids that may have suddenly ceased to flow. To quantify the jamming behavior, a constitutive model of dense granular flows is deduced from shear-flow experiments. The empirical equations are then generalized, via a thermodynamic approach, into an equation of state for jamming. Notably, the unifying theory also predicts the experimental data on the behavior of molecular glassy liquids. This analogy paves a crucial road map for a unifying theoretical framework in condensed matter, ranging, for example, from sand to fire retardants to toothpaste.

  18. Multi-View Multi-Instance Learning Based on Joint Sparse Representation and Multi-View Dictionary Learning.

    PubMed

    Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve

    2017-12-01

    In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure, so that the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag, (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification, and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of the M²IL. Experiments and analyses in many practical applications prove the effectiveness of the M²IL.

  19. An LMI approach to design H(infinity) controllers for discrete-time nonlinear systems based on unified models.

    PubMed

    Liu, Meiqin; Zhang, Senlin

    2008-10-01

    A unified neural network model termed the standard neural network model (SNNM) is advanced. Based on the robust L(2) gain (i.e. robust H(infinity) performance) analysis of the SNNM with external disturbances, a state-feedback control law is designed for the SNNM to stabilize the closed-loop system and eliminate the effect of external disturbances. The control design constraints are shown to be a set of linear matrix inequalities (LMIs), which can easily be solved by various convex optimization algorithms (e.g. interior-point algorithms) to determine the control law. Most discrete-time recurrent neural networks (RNNs) and discrete-time nonlinear systems modelled by neural networks or Takagi-Sugeno (T-S) fuzzy models can be transformed into SNNMs, so that robust H(infinity) performance analysis or robust H(infinity) controller synthesis can be carried out in a unified SNNM framework. Finally, some examples are presented to illustrate the wide applicability of SNNMs to nonlinear systems, and the proposed approach is compared with related methods reported in the literature.
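
    The LMI machinery itself is easy to demonstrate. The sketch below, using cvxpy, checks discrete-time stability of a toy system by searching for P > 0 with A'PA - P < 0; this is only the simplest Lyapunov-type LMI, not the paper's SNNM synthesis conditions.

    ```python
    import numpy as np
    import cvxpy as cp

    # Discrete-time Lyapunov LMI: find P > 0 with A'PA - P < 0 (toy system).
    A = np.array([[0.5, 0.2],
                  [0.0, 0.8]])
    P = cp.Variable((2, 2), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(2),
                   A.T @ P @ A - P << -eps * np.eye(2)]
    prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
    prob.solve(solver=cp.SCS)
    print(prob.status, np.linalg.eigvalsh(P.value))   # feasible -> stable
    ```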

  20. Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Machnes, S.; Institute for Theoretical Physics, University of Ulm, D-89069 Ulm; Sander, U.

    2011-08-15

    For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods, which update all controls concurrently, and Krotov-type methods, which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.
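
    A toy GRAPE-style loop conveys the "concurrent update" idea on a single qubit. The sketch below optimizes N piecewise-constant amplitudes of an X-axis control toward an X gate using finite-difference gradient ascent; DYNAMO itself is MATLAB-based and far more sophisticated, so the model, step size, and gradient method here are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Toy GRAPE-style pulse optimization (illustrative; not the DYNAMO toolbox).
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    U_target = sx.copy()                       # target: an X gate
    N, dt = 10, 0.1                            # slices and slice duration (assumed)

    def propagator(u):
        U = np.eye(2, dtype=complex)
        for uk in u:                           # piecewise-constant evolution
            U = expm(-1j * uk * sx * dt) @ U
        return U

    def fidelity(u):
        return abs(np.trace(U_target.conj().T @ propagator(u))) ** 2 / 4

    u = 0.1 * np.ones(N)                       # initial flat pulse
    for _ in range(200):                       # concurrent (GRAPE-like) updates
        grad = np.zeros(N)
        for k in range(N):                     # finite-difference gradient
            du = np.zeros(N); du[k] = 1e-6
            grad[k] = (fidelity(u + du) - fidelity(u)) / 1e-6
        u += 2.0 * grad                        # fixed-step gradient ascent
    print(f"gate fidelity after optimization: {fidelity(u):.6f}")
    ```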

  1. Conceptualising paediatric health disparities: a metanarrative systematic review and unified conceptual framework.

    PubMed

    Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J

    2017-08-04

    There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into a unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.

    PubMed

    Bricq, S; Collet, Ch; Armspach, J P

    2008-12-01

    In the context of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images including partial volume effect, bias field correction, and information given by a probabilistic atlas. The proposed method takes neighborhood information into account using a Hidden Markov Chain (HMC) model. Due to the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is included to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors are incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of different tissue classes. This atlas is considered as a complementary sensor, and the proposed method extends to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images for which the ground truth is available. Comparison with other commonly used techniques demonstrates the accuracy and robustness of this new Markovian segmentation scheme.

  3. Hilltop supernatural inflation and SUSY unified models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohri, Kazunori; Lim, C.S.; Lin, Chia-Min

    2014-01-01

    In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including a right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is n_s = 0.96, which fits very well with recent data from the Planck satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.

  4. Resource management and scheduling policy based on grid for AIoT

    NASA Astrophysics Data System (ADS)

    Zou, Yiqin; Quan, Li

    2017-07-01

    This paper presents research on a resource management and scheduling policy based on grid technology for the Agricultural Internet of Things (AIoT). AIoT involves a variety of complex, heterogeneous agricultural resources that are difficult to represent in a unified way; from an abstract perspective, however, there are common models that can express their characteristics and features. Based on this, we propose a high-level model called the Agricultural Resource Hierarchy Model (ARHM), which can be used for modeling various resources, and introduce the agricultural resource modeling method based on it. Compared with the traditional application-oriented three-layer model, the ARHM can hide the differences between applications, giving all applications a unified interface layer so that they can be implemented without distinction. Furthermore, the paper proposes a Web Service Resource Framework (WSRF)-based resource management method and the encapsulation structure for it. Finally, it focuses on a multi-agent-based agricultural resource scheduler, which implements a collaborative service-provider pattern across multiple agricultural production domains.

  5. Probabilistic arithmetic automata and their applications.

    PubMed

    Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven

    2012-01-01

    We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
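
    The push-forward of a state distribution that underlies a PAA can be sketched directly. The toy example below computes the exact distribution of the number of occurrences of the pattern "AB" in an i.i.d. random text of length 20, propagating a distribution over (automaton state, count) pairs one character at a time; it illustrates the mechanism only and is unrelated to the MoSDi implementation.

    ```python
    # Exact occurrence-count distribution for pattern "AB" in i.i.d. text
    # (toy deterministic-arithmetic-automaton sketch; not MoSDi).
    pattern, n = "AB", 20
    char_prob = {"A": 0.5, "B": 0.5}

    # A state is (previous character, occurrence count); push the joint
    # distribution forward one text position at a time.
    dist = {("", 0): 1.0}
    for _ in range(n):
        new = {}
        for (last, cnt), pr in dist.items():
            for c, pc in char_prob.items():
                key = (c, cnt + (last + c == pattern))
                new[key] = new.get(key, 0.0) + pr * pc
        dist = new

    counts = {}
    for (_, cnt), pr in dist.items():
        counts[cnt] = counts.get(cnt, 0.0) + pr
    for k in sorted(counts):
        print(k, round(counts[k], 6))
    ```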

  6. Coupled dictionary learning for joint MR image restoration and segmentation

    NASA Astrophysics Data System (ADS)

    Yang, Xuesong; Fan, Yong

    2018-03-01

    To achieve better segmentation of MR images, image restoration is typically used as a preprocessing step, especially for low-quality MR images. Recent studies have demonstrated that dictionary learning methods could achieve promising performance for both image restoration and image segmentation. These methods typically learn paired dictionaries of image patches from different sources and use a common sparse representation to characterize paired image patches, such as low-quality image patches and their corresponding high-quality counterparts for the image restoration, and image patches and their corresponding segmentation labels for the image segmentation. Since learning these dictionaries jointly in a unified framework may improve the image restoration and segmentation simultaneously, we propose a coupled dictionary learning method to concurrently learn dictionaries for joint image restoration and image segmentation based on sparse representations in a multi-atlas image segmentation framework. Particularly, three dictionaries, including a dictionary of low-quality image patches, a dictionary of high-quality image patches, and a dictionary of segmentation label patches, are learned in a unified framework so that the learned dictionaries of image restoration and segmentation can benefit each other. Our method has been evaluated for segmenting the hippocampus in MR T1 images collected with scanners of different magnetic field strengths. The experimental results have demonstrated that our method achieved better image restoration and segmentation performance than state-of-the-art dictionary learning and sparse representation-based image restoration and segmentation methods.
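
    One common way to realize the coupling, sketched below on synthetic 1-D "patches", is to stack paired low- and high-quality patches so that a single sparse code describes both halves of each dictionary atom; the degradation model, sizes, and regularization here are assumptions for illustration, not the paper's MR pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning, sparse_encode

    # Synthetic paired "patches": high-quality signals and degraded copies.
    rng = np.random.default_rng(0)
    n, d = 500, 16
    high = rng.standard_normal((n, d))
    low = high + 0.3 * rng.standard_normal((n, d))

    # Stack pairs so one sparse code explains both halves of each atom.
    stacked = np.hstack([low, high])
    dl = DictionaryLearning(n_components=32, alpha=1.0, max_iter=20,
                            random_state=0).fit(stacked)
    D_low, D_high = dl.components_[:, :d], dl.components_[:, d:]

    # Test time: encode a degraded patch with the low-quality half of the
    # dictionary, then reconstruct its clean counterpart with the high half.
    codes = sparse_encode(low[:5], D_low, alpha=1.0)
    restored = codes @ D_high
    print(float(np.mean((restored - high[:5]) ** 2)))
    ```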

  7. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify the proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, our experiments used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation, and its performance is on average 6.5% higher than that of the topic model of TV users alone. For TV user prediction with new TV programs, the average prediction precision was 79.6%. We also show the superiority of the proposed model, in terms of both topic modeling and recommendation performance, over two related topic models, the polylingual topic model and the bilingual topic model.
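
    One of the two LDA components, the topic model of TV users, can be sketched with synthetic view counts; the linking topic-proportion parameter of the unified model is not reproduced here, and the data are random stand-ins for the TV-poll histories.

    ```python
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    # Random stand-in for a users-by-programs view-count matrix.
    rng = np.random.default_rng(0)
    views = rng.poisson(lam=rng.uniform(0.1, 2.0, size=(100, 30)))

    lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(views)
    user_topics = lda.transform(views)    # per-user topic proportions
    groups = user_topics.argmax(axis=1)   # candidate social TV communities
    print(np.bincount(groups, minlength=5))
    ```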

  8. A unified framework for approximation in inverse problems for distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Ito, K.

    1988-01-01

    A theoretical framework is presented that can be used to treat approximation techniques for very general classes of parameter estimation problems involving distributed systems that are either first or second order in time. Using the approach developed, one can obtain both convergence and stability (continuous dependence of parameter estimates with respect to the observations) under very weak regularity and compactness assumptions on the set of admissible parameters. This unified theory can be used for many problems found in the recent literature and in many cases offers significant improvements to existing results.

  9. In Search of a Unified Model of Language Contact

    ERIC Educational Resources Information Center

    Winford, Donald

    2013-01-01

    Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…

  10. High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems

    DOE PAGES

    Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...

    2014-06-30

    An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.

  11. Transformation of a Traditional, Freshman Biology, Three-Semester Sequence, to a Two-Semester, Integrated Thematically Organized, and Team-Taught Course

    ERIC Educational Resources Information Center

    Soto, Julio G.; Everhart, Jerry

    2016-01-01

    Biology faculty at San José State University developed, piloted, implemented, and assessed a freshman course sequence based on the macro- to micro-teaching approach that was team-taught and organized around unifying themes. Content learning assessment drove the conceptual framework of our course sequence. Content student learning increased…

  12. A Unified Model of Student Engagement in Classroom Learning and Classroom Learning Environment: One Measure and One Underlying Construct

    ERIC Educational Resources Information Center

    Cavanagh, Robert F.

    2015-01-01

    This study employed the capabilities-expectations model of engagement in classroom learning based on bio-ecological frameworks of intellectual development and flow theory. According to the capabilities-expectations model, engagement requires a balance between the capabilities of a student for learning in a particular situation and what is expected…

  13. Quantification of causal couplings via dynamical effects: A unifying perspective

    NASA Astrophysics Data System (ADS)

    Smirnov, Dmitry A.

    2014-12-01

    Quantitative characterization of causal couplings from time series is crucial in studies of complex systems of different origin. Various statistical tools exist for this, and new ones are still being developed with a tendency toward creating a single, universal, model-free quantifier of coupling strength. However, a clear and generally applicable way of interpreting such universal characteristics is lacking. This work suggests a general conceptual framework for causal coupling quantification, which is based on state space models and extends the concepts of virtual interventions and dynamical causal effects. Namely, two basic kinds of interventions (state space and parametric) and effects (orbital or transient and stationary or limit) are introduced, giving four families of coupling characteristics. The framework provides a unifying view of apparently different well-established measures and allows us to introduce new characteristics, always with a definite "intervention-effect" interpretation. It is shown that diverse characteristics cannot be reduced to any single coupling strength quantifier and their interpretation is inevitably model based. The proposed set of dynamical causal effect measures quantifies different aspects of "how the coupling manifests itself in the dynamics," reformulating the very question about the "causal coupling strength."

  14. A general framework to learn surrogate relevance criterion for atlas based image segmentation

    NASA Astrophysics Data System (ADS)

    Zhao, Tingting; Ruan, Dan

    2016-09-01

    Multi-atlas based image segmentation sees great opportunities in the big data era but also faces unprecedented challenges in identifying positive contributors from extensive heterogeneous data. To assess data relevance, image similarity criteria based on various image features widely serve as surrogates for the inaccessible geometric agreement criteria. This paper proposes a general framework to learn image-based surrogate relevance criteria that better mimic the behavior of segmentation-based oracle geometric relevance. The validity of its general rationale is verified in the specific context of fusion set selection for image segmentation. More specifically, we first present a unified formulation for surrogate relevance criteria and model the neighborhood relationship among atlases based on the oracle relevance knowledge. Surrogates are then trained to be small for geometrically relevant neighbors and large for atlases remote from, and irrelevant to, the given targets. The proposed surrogate learning framework is verified in corpus callosum segmentation. The learned surrogates demonstrate superiority in inferring the underlying oracle value and selecting a relevant fusion set, compared to benchmark surrogates.
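
    The core idea, regressing cheap image-similarity features onto an oracle geometric-agreement score, can be sketched as follows; the features, the linear model, and the synthetic "Dice" targets are all illustrative assumptions rather than the paper's formulation.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge

    # Synthetic atlas/target pairs: cheap similarity features vs. oracle Dice.
    rng = np.random.default_rng(0)
    feats = rng.uniform(0, 1, size=(400, 6))       # e.g. MI, NCC, SSD scores
    w_true = np.array([0.5, 0.3, -0.2, 0.1, 0.0, 0.4])
    dice = feats @ w_true + 0.05 * rng.standard_normal(400)

    surrogate = Ridge(alpha=1.0).fit(feats[:300], dice[:300])
    pred = surrogate.predict(feats[300:])
    ranking = np.argsort(-pred)                    # fusion-set candidates first
    print(np.corrcoef(pred, dice[300:])[0, 1], ranking[:5])
    ```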

  15. A Unified Framework for Creating Domain Dependent Polarity Lexicons from User Generated Reviews

    PubMed Central

    Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Khan, Imran Ali; Kundi, Fazal Masud

    2015-01-01

    The explosive growth of Web-based user-generated reviews has resulted in the emergence of Opinion Mining (OM) applications for analyzing users' opinions toward products, services, and policies. Polarity lexicons often play a pivotal role in OM, indicating the positivity and negativity of a term along with a numeric score. However, the commonly available domain-independent lexicons are not an optimal choice for all domains within OM applications, because the polarity of a term changes from one domain to another and such lexicons do not contain the correct polarity of a term for every domain. In this work, we focus on the problem of adapting a domain-dependent polarity lexicon from a set of labeled user reviews and a domain-independent lexicon, and propose a unified learning framework based on information-theoretic concepts that assigns terms correct polarity (positive, negative) scores. Benchmarking on three datasets (car, hotel, and drug reviews) shows that our approach improves the performance of polarity classification by achieving higher accuracy. Moreover, using the derived domain-dependent lexicon changed the polarity of terms, and the experimental results show that our approach is more effective than the baseline methods. PMID:26466101
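
    A minimal information-theoretic scoring rule in this spirit is pointwise mutual information between a term and the review labels. The toy sketch below (invented four-review corpus, add-one smoothing) illustrates the general idea, not the paper's exact scoring function.

    ```python
    import math
    from collections import Counter

    # Invented toy corpus of labeled reviews (illustration only).
    reviews = [("engine smooth quiet", "pos"), ("engine noisy rough", "neg"),
               ("smooth ride comfortable", "pos"), ("rough noisy cabin", "neg")]

    term_label, term_count, label_count, total = Counter(), Counter(), Counter(), 0
    for text, label in reviews:
        for term in text.split():
            term_label[(term, label)] += 1
            term_count[term] += 1
            label_count[label] += 1
            total += 1

    def polarity(term):
        """PMI(term, pos) - PMI(term, neg), with add-one smoothing."""
        def pmi(label):
            p_joint = (term_label[(term, label)] + 1) / (total + 2)
            return math.log2(p_joint / ((term_count[term] / total)
                                        * (label_count[label] / total)))
        return pmi("pos") - pmi("neg")

    for t in ["smooth", "noisy", "engine"]:
        print(t, round(polarity(t), 3))   # positive, negative, neutral
    ```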

  16. A Hybrid Probabilistic Model for Unified Collaborative and Content-Based Image Tagging.

    PubMed

    Zhou, Ning; Cheung, William K; Qiu, Guoping; Xue, Xiangyang

    2011-07-01

    The increasing availability of large quantities of user-contributed images with labels has provided opportunities to develop automatic tools to tag images to facilitate image search and retrieval. In this paper, we present a novel hybrid probabilistic model (HPM) which integrates low-level image features and high-level user-provided tags to automatically tag images. For images without any tags, HPM predicts new tags based solely on the low-level image features. For images with user-provided tags, HPM jointly exploits both the image features and the tags in a unified probabilistic framework to recommend additional tags to label the images. The HPM framework makes use of the tag-image association matrix (TIAM). However, since the number of images is usually very large and user-provided tags are diverse, TIAM is very sparse, thus making it difficult to reliably estimate tag-to-tag co-occurrence probabilities. We developed a collaborative filtering method based on nonnegative matrix factorization (NMF) for tackling this data sparsity issue. Also, an L1 norm kernel method is used to estimate the correlations between image features and semantic concepts. The effectiveness of the proposed approach has been evaluated using three databases containing 5,000 images with 371 tags, 31,695 images with 5,587 tags, and 269,648 images with 5,018 tags, respectively.
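
    The NMF-based smoothing of the sparse TIAM can be sketched in a few lines; the toy matrix, the rank, and the way co-occurrence is read off the reconstruction are illustrative assumptions, not the paper's exact procedure.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Random sparse stand-in for the tags-by-images association matrix.
    rng = np.random.default_rng(0)
    tiam = (rng.uniform(size=(200, 50)) < 0.03).astype(float)

    model = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(tiam)       # tag factors
    H = model.components_               # image factors
    tiam_smooth = W @ H                 # densified association estimate
    cooc = tiam_smooth @ tiam_smooth.T  # smoothed tag-to-tag co-occurrence
    print(cooc.shape, float(cooc.max()))
    ```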

  17. From data towards knowledge: revealing the architecture of signaling systems by unifying knowledge mining and data mining of systematic perturbation data.

    PubMed

    Lu, Songjian; Jin, Bo; Cowart, L Ashley; Lu, Xinghua

    2013-01-01

    Genetic and pharmacological perturbation experiments, such as deleting a gene and monitoring gene expression responses, are powerful tools for studying cellular signal transduction pathways. However, it remains a challenge to automatically derive knowledge of a cellular signaling system at a conceptual level from systematic perturbation-response data. In this study, we explored a framework that unifies knowledge mining and data mining towards the goal. The framework consists of the following automated processes: 1) applying an ontology-driven knowledge mining approach to identify functional modules among the genes responding to a perturbation in order to reveal potential signals affected by the perturbation; 2) applying a graph-based data mining approach to search for perturbations that affect a common signal; and 3) revealing the architecture of a signaling system by organizing signaling units into a hierarchy based on their relationships. Applying this framework to a compendium of yeast perturbation-response data, we have successfully recovered many well-known signal transduction pathways; in addition, our analysis has led to many new hypotheses regarding the yeast signal transduction system; finally, our analysis automatically organized perturbed genes as a graph reflecting the architecture of the yeast signaling system. Importantly, this framework transformed molecular findings from a gene level to a conceptual level, which can be readily translated into computable knowledge in the form of rules regarding the yeast signaling system, such as "if genes involved in the MAPK signaling are perturbed, genes involved in pheromone responses will be differentially expressed."

  18. A unified framework for physical print quality

    NASA Astrophysics Data System (ADS)

    Eid, Ahmed; Cooper, Brian; Rippetoe, Ed

    2007-01-01

    In this paper we present a unified framework for physical print quality. This framework includes a design for a testbed, testing methodologies and quality measures of physical print characteristics. An automatic belt-fed flatbed scanning system is calibrated to acquire L* data for a wide range of flat field imagery. Testing methodologies based on wavelet pre-processing and spectral/statistical analysis are designed. We apply the proposed framework to three common printing artifacts: banding, jitter, and streaking. Since these artifacts are directional, wavelet-based approaches are used to extract one artifact at a time and filter out other artifacts. Banding is characterized as a medium-to-low frequency, vertical periodic variation down the page. The same definition is applied to the jitter artifact, except that the jitter signal is characterized as a high-frequency signal above the banding frequency range. However, streaking is characterized as a horizontal aperiodic variation in the high-to-medium frequency range. Wavelets at different levels are applied to the input images in different directions to extract each artifact within specified frequency bands. Following wavelet reconstruction, images are converted into 1-D signals describing the artifact of concern. Accurate spectral analysis using a DFT with a Blackman-Harris window is used to extract the power (strength) of periodic signals (banding and jitter). Since streaking is an aperiodic signal, a statistical measure is used to quantify the streaking strength. Experiments on 100 print samples scanned at 600 dpi from 10 different printers show high correlation (75% to 88%) between the ranking of these samples by the proposed metrologies and experts' visual ranking.
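
    The spectral-analysis step can be sketched on a synthetic banding profile: reduce a flat-field scan to a 1-D lightness signal down the page, then measure periodic power with a Blackman-Harris-windowed DFT. The geometry and signal parameters below are assumed for illustration.

    ```python
    import numpy as np
    from scipy.signal import get_window

    fs = 600 / 25.4                    # samples per mm at 600 dpi (assumed)
    n = 4096
    y = np.arange(n) / fs              # position down the page, mm
    rng = np.random.default_rng(0)
    profile = 0.2 * np.sin(2 * np.pi * 1.5 * y)       # 1.5 cycles/mm banding
    profile += 0.05 * rng.standard_normal(n)          # scanner noise

    win = get_window("blackmanharris", n)             # low-leakage window
    spec = np.abs(np.fft.rfft(profile * win)) ** 2
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    peak = freqs[np.argmax(spec[1:]) + 1]             # skip the DC bin
    print(f"dominant banding frequency: {peak:.2f} cycles/mm")
    ```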

  19. A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.

    PubMed

    Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-04-01

    This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding-window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, and 3) refined regime segmentation by Kalman smoothing and parameter estimation via an expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain networks and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
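
    Stages 1 and 2 can be sketched on synthetic signals: fit a VAR(1) in sliding windows, then cluster the stacked coefficients for a preliminary regime segmentation. The Kalman-smoothing/EM refinement of stage 3 is omitted, and the two-state toy system below is an assumption for illustration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from statsmodels.tsa.api import VAR

    # Two synthetic connectivity regimes with reversed driving direction.
    rng = np.random.default_rng(0)
    A1 = np.array([[0.5, 0.4], [0.0, 0.5]])   # state 1: channel 2 -> channel 1
    A2 = np.array([[0.5, 0.0], [0.4, 0.5]])   # state 2: channel 1 -> channel 2
    x = np.zeros((600, 2))
    for t in range(1, 600):
        x[t] = (A1 if t < 300 else A2) @ x[t - 1] + 0.1 * rng.standard_normal(2)

    coefs = []
    for start in range(0, 500, 20):           # stage 1: sliding-window TV-VAR
        res = VAR(x[start:start + 100]).fit(1)
        coefs.append(res.coefs[0].ravel())
    labels = KMeans(n_clusters=2, n_init=10,  # stage 2: preliminary regimes
                    random_state=0).fit_predict(np.array(coefs))
    print(labels)                             # regime change visible mid-sequence
    ```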

  20. Deontological coherence: A framework for commonsense moral reasoning.

    PubMed

    Holyoak, Keith J; Powell, Derek

    2016-11-01

    We review a broad range of work, primarily in cognitive and social psychology, that provides insight into the processes of moral judgment. In particular, we consider research on pragmatic reasoning about regulations and on coherence in decision making, both areas in which psychological theories have been guided by work in legal philosophy. Armed with these essential prerequisites, we sketch a psychological framework for how ordinary people make judgments about moral issues. Based on a literature review, we show how the framework of deontological coherence unifies findings in moral psychology that have often been explained in terms of a grab-bag of heuristics and biases. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.
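
    The model-based inversion idea can be conveyed with a deliberately idealized forward model: a single small impedance discontinuity at distance d produces an echo at t = 2d/v, and a grid posterior over (d, reflection coefficient) follows from a Gaussian likelihood. The pulse shape and all parameters below are assumptions, not the paper's S-parameter model.

    ```python
    import numpy as np

    # Idealized lossless-line echo model (all parameters assumed).
    v, fs, n = 2.0e8, 5.0e9, 512        # wave speed m/s, sample rate Hz, samples
    t = np.arange(n) / fs

    def forward(d, gamma):
        """Echo of amplitude gamma from a small fault at distance d."""
        delay = 2 * d / v
        return gamma * np.exp(-((t - delay) * fs) ** 2 / 8.0)

    rng = np.random.default_rng(0)
    sigma = 0.005
    trace = forward(3.0, 0.05) + sigma * rng.standard_normal(n)   # "measured"

    ds = np.linspace(0.5, 8.0, 76)      # grid over fault distance, m
    gs = np.linspace(0.01, 0.10, 46)    # grid over reflection coefficient
    logpost = np.array([[-np.sum((trace - forward(d, g)) ** 2) / (2 * sigma**2)
                         for g in gs] for d in ds])   # flat prior -> likelihood
    i, j = np.unravel_index(np.argmax(logpost), logpost.shape)
    print(f"MAP estimate: d = {ds[i]:.2f} m, gamma = {gs[j]:.3f}")
    ```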

  2. Modelling Participatory Geographic Information System for Customary Land Conflict Resolution

    NASA Astrophysics Data System (ADS)

    Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.

    2017-11-01

    Since land contributes about 73% of most countries' Gross Domestic Product (GDP), attention to land rights has increased tremendously worldwide. Conflicts over land have therefore become one of the major problems associated with land administration, and conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants, due to various factors. This study sought to develop a framework for using Participatory Geographic Information Systems (PGIS) for customary land conflict resolution. The framework was modelled using the Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units, namely a Social Unit (SU), a Technical Unit (TU) and a Decision Making Unit (DMU); the name was adopted based on the framework's features and properties. The framework is therefore recommended for land conflict resolution in customary areas.

  3. A unified framework for gesture recognition and spatiotemporal gesture segmentation.

    PubMed

    Alon, Jonathan; Athitsos, Vassilis; Yuan, Quan; Sclaroff, Stan

    2009-09-01

    Within the context of hand gesture recognition, spatiotemporal gesture segmentation is the task of determining, in a video sequence, where the gesturing hand is located and when the gesture starts and ends. Existing gesture recognition methods typically assume either known spatial segmentation or known temporal segmentation, or both. This paper introduces a unified framework for simultaneously performing spatial segmentation, temporal segmentation, and recognition. In the proposed framework, information flows both bottom-up and top-down. A gesture can be recognized even when the hand location is highly ambiguous and when information about when the gesture begins and ends is unavailable. Thus, the method can be applied to continuous image streams where gestures are performed in front of moving, cluttered backgrounds. The proposed method consists of three novel contributions: a spatiotemporal matching algorithm that can accommodate multiple candidate hand detections in every frame, a classifier-based pruning framework that enables accurate and early rejection of poor matches to gesture models, and a subgesture reasoning algorithm that learns which gesture models can falsely match parts of other longer gestures. The performance of the approach is evaluated on two challenging applications: recognition of hand-signed digits gestured by users wearing short-sleeved shirts, in front of a cluttered background, and retrieval of occurrences of signs of interest in a video database containing continuous, unsegmented signing in American Sign Language (ASL).
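
    The trajectory-matching core can be conveyed with plain dynamic time warping on 2-D hand paths, a simplification of the paper's multi-candidate spatiotemporal matching (which handles many hand hypotheses per frame); the gesture models below are synthetic.

    ```python
    import numpy as np

    def dtw(a, b):
        """Dynamic-time-warping distance between two 2-D trajectories."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = np.linalg.norm(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    t = np.linspace(0, 1, 40)[:, None]
    models = {"circle": np.hstack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)]),
              "line": np.hstack([t, t])}
    query = models["circle"] + 0.05 * np.random.default_rng(0).standard_normal((40, 2))
    scores = {name: dtw(query, m) for name, m in models.items()}
    print(min(scores, key=scores.get), scores)   # best-matching gesture model
    ```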

  4. A unifying retinex model based on non-local differential operators

    NASA Astrophysics Data System (ADS)

    Zosso, Dominique; Tran, Giang; Osher, Stanley

    2013-02-01

    In this paper, we present a unifying framework for retinex that is able to reproduce many of the existing retinex implementations within a single model. The fundamental assumption, shared with many retinex models, is that the observed image is the product of the illumination and the true underlying reflectance of the object. Starting from Morel's 2010 PDE model for retinex, where illumination is supposed to vary smoothly and where the reflectance is thus recovered from a hard-thresholded Laplacian of the observed image in a Poisson equation, we define our retinex model in two similar but more general steps. First, we look for a filtered gradient that is the solution of an optimization problem consisting of two terms: the first is a sparsity prior on the reflectance, such as the TV or H1 norm, while the second is a quadratic fidelity prior on the reflectance gradient with respect to the observed image gradients. In a second step, since this filtered gradient almost certainly is not a consistent image gradient, we look for a reflectance whose actual gradient comes close to it. Beyond unifying existing models, we are able to derive entirely novel retinex formulations by using more interesting non-local versions of the sparsity and fidelity priors. Hence we define within a single framework new retinex instances particularly suited for texture-preserving shadow removal, cartoon-texture decomposition, color and hyperspectral image enhancement.
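
    The two-step recipe in its simplest form (a hard threshold standing in for the sparsity prior, followed by a Poisson solve) can be sketched directly. The sketch below assumes periodic boundaries and a synthetic test image, so it corresponds to the classical PDE instance rather than the non-local variants introduced in the paper.

    ```python
    import numpy as np

    def retinex(log_img, thresh=0.1):
        # Forward-difference gradients of the log image (periodic boundaries).
        gx = np.roll(log_img, -1, axis=1) - log_img
        gy = np.roll(log_img, -1, axis=0) - log_img
        # Step 1: small gradients are attributed to smooth illumination.
        gx[np.abs(gx) < thresh] = 0.0
        gy[np.abs(gy) < thresh] = 0.0
        # Step 2: divergence of the filtered field, then a Poisson solve via FFT.
        div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
        h, w = log_img.shape
        fx = np.fft.fftfreq(w)[None, :]
        fy = np.fft.fftfreq(h)[:, None]
        denom = (2 * np.cos(2 * np.pi * fx) - 2) + (2 * np.cos(2 * np.pi * fy) - 2)
        denom[0, 0] = 1.0                     # the mean (DC) is arbitrary
        refl = np.real(np.fft.ifft2(np.fft.fft2(div) / denom))
        return refl - refl.mean()

    x = np.arange(64)
    img = np.ones((64, 64)); img[16:48, 16:48] = 2.0          # reflectance step
    illum = 1.0 + 0.4 * np.sin(2 * np.pi * x / 64)[None, :]   # smooth shading
    out = retinex(np.log(img * illum))
    print(out[32, 32] - out[8, 8])    # ~ log(2): step kept, shading removed
    ```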

  5. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model

    PubMed Central

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S.; Breen, Lauren J.; Witt, Regina R.; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed through a literature synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model: three concepts related to nursing staff support (professional, practice, personal) and three to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking to the nurse and then impacting on personal resilience and workplace outcomes; its use has the potential to increase staff retention and the quality of patient care. PMID:27242567

  6. The neurosciences and the search for a unified psychology: the science and esthetics of a single framework

    PubMed Central

    Stam, Henderikus J.

    2015-01-01

    The search for a so-called unified or integrated theory has long served as a goal for some psychologists, even if the search is often implicit. But if the established sciences do not have an explicitly unified set of theories, then why should psychology? After examining this question again I argue that psychology is in fact reasonably unified around its methods and its commitment to functional explanations, an indeterminate functionalism. The question of the place of the neurosciences in this framework is complex. On the one hand, the neuroscientific project will not likely renew and synthesize the disparate arms of psychology. On the other hand, their reformulation of what it means to be human will exert an influence in multiple ways. One way to capture that influence is to conceptualize the brain in terms of a technology that we interact with in a manner that we do not yet fully understand. In this way we maintain both a distance from neuro-reductionism and refrain from committing to an unfettered subjectivity. PMID:26500571

  7. Efficient construction of unified continuous and discontinuous Galerkin formulations for the 3D Euler equations

    NASA Astrophysics Data System (ADS)

    Abdi, Daniel S.; Giraldo, Francis X.

    2016-09-01

    A unified approach for the numerical solution of the 3D hyperbolic Euler equations using high-order methods, namely continuous Galerkin (CG) and discontinuous Galerkin (DG) methods, is presented. First, we examine how classical CG that uses a global storage scheme can be constructed within the DG framework using constraint imposition techniques commonly used in the finite element literature. Then, we implement and test a simplified version in the Non-hydrostatic Unified Model of the Atmosphere (NUMA) for the case of explicit time integration and a diagonal mass matrix. Constructing CG within the DG framework allows CG to benefit from the desirable properties of DG, such as easier hp-refinement and better stability. Moreover, this representation allows for regional mixing of CG and DG depending on the flow regime in an area. The different flavors of CG and DG in the unified implementation are then tested for accuracy and performance using a suite of benchmark problems representative of cloud-resolving-scale, meso-scale and global-scale atmospheric dynamics. The value of our unified approach is that we are able to show how to carry both CG and DG methods within the same code and also offer a simple recipe for converting an existing CG code to DG and vice versa.

  8. U.S. History Framework for the 2010 National Assessment of Educational Progress

    ERIC Educational Resources Information Center

    National Assessment Governing Board, 2009

    2009-01-01

    This framework identifies the main ideas, major events, key individuals, and unifying themes of American history as a basis for preparing the 2010 assessment. The framework recognizes that U.S. history includes powerful ideas, common and diverse traditions, economic developments, technological and scientific innovations, philosophical debates,…

  9. Applying Laban's Movement Framework in Elementary Physical Education

    ERIC Educational Resources Information Center

    Langton, Terence W.

    2007-01-01

    This article recommends raising the bar in elementary physical education by using Laban's movement framework to develop curriculum content in the areas of games, gymnastics, and dance (with physical fitness concepts blended in) in order to help students achieve the NASPE content standards. The movement framework can permeate and unify an…

  10. Analysis model for personal eHealth solutions and services.

    PubMed

    Mykkänen, Juha; Tuomainen, Mika; Luukkonen, Irmeli; Itälä, Timo

    2010-01-01

    In this paper, we present a framework for analysing and assessing various features of personal wellbeing information management services and solutions, such as personal health records and citizen-oriented eHealth services. The model is based on general functional and interoperability standards for personal health management applications and on generic frameworks for different aspects of analysis. It has been developed and used in the MyWellbeing project in Finland to provide a baseline for the research, development and comparison of many different personal wellbeing and health management solutions, and to support the development of a unified "Coper" concept for citizen empowerment.

  11. Unified first principles description from warm dense matter to ideal ionized gas plasma: electron-ion collisions induced friction.

    PubMed

    Dai, Jiayu; Hou, Yong; Yuan, Jianmin

    2010-06-18

    Electron-ion interactions are central to numerous phenomena in the warm dense matter (WDM) regime and at higher temperature. The electron-ion collisions induced friction at high temperature is introduced in the procedure of ab initio molecular dynamics using the Langevin equation based on density functional theory. In this framework, as a test for Fe and H up to 1000 eV, the equation of state and the transition of electronic structures of the materials with very wide density and temperature can be described, which covers a full range of WDM up to high energy density physics. A unified first principles description from condensed matter to ideal ionized gas plasma is constructed.
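
    As a schematic illustration of the idea (not the authors' DFT-based implementation), a Langevin step couples the ordinary forces to a friction term and a random kick whose amplitude is fixed by the fluctuation-dissipation theorem; all parameters below are placeholders:

        import numpy as np

        def langevin_step(x, v, force, m, gamma, kT, dt, rng):
            """One Euler step of Langevin dynamics: ordinary forces plus a
            friction term -gamma*v and a stochastic kick whose variance is
            set by the fluctuation-dissipation theorem (sketch only)."""
            sigma = np.sqrt(2.0 * gamma * kT / (m * dt))  # noise amplitude
            a = force(x) / m - gamma * v + sigma * rng.standard_normal(x.shape)
            v = v + a * dt
            x = x + v * dt
            return x, v

        # Toy usage: a harmonic potential thermalizing toward kT = 1.
        rng = np.random.default_rng(0)
        force = lambda x: -x
        x, v = np.ones(3), np.zeros(3)
        for _ in range(1000):
            x, v = langevin_step(x, v, force, m=1.0, gamma=0.1, kT=1.0,
                                 dt=0.01, rng=rng)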

  12. A Unified Steganalysis Framework

    DTIC Science & Technology

    2013-04-01

    contains more than 1800 images of different scenes. In the experiments, we used four JPEG-based steganography techniques: Outguess [13], F5 [16], model... We also compressed these images again, since some of the steganography methods double-compress the images. Stego-images are generated by embedding... randomly chosen messages (in bits) into 1600 grayscale images using each of the four steganography techniques. A random message length was determined

  13. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  14. Programming model for distributed intelligent systems

    NASA Technical Reports Server (NTRS)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  15. The Wage Premium on Tertiary Education: New Estimates for 21 OECD Countries. OECD Economics Department Working Papers, No. 589

    ERIC Educational Resources Information Center

    Strauss, Hubert; de la Maisonneuve, Christine

    2007-01-01

    This paper presents cross-section estimates of gross hourly wage premia on tertiary education. They are based on a unified framework for 21 OECD countries from the 1990s to the early 2000s and use international household surveys to maximise international comparability. The results of the "augmented" Mincerian wage equations point to an…
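
    For reference, the baseline Mincerian wage equation that such "augmented" specifications extend is conventionally written as (notation illustrative):

        \ln w_i = \alpha + \beta\, S_i + \gamma_1\, X_i + \gamma_2\, X_i^2 + \varepsilon_i

    where w_i is the gross hourly wage, S_i years of schooling (or an indicator for tertiary attainment), and X_i potential labour-market experience; the wage premium is read off the schooling coefficient.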

  16. A Systems Approach to Comprehensive School Reform: Using the Realms of Meaning and the Baldridge Approach as a Systems Framework

    ERIC Educational Resources Information Center

    Miller-Williams, Sheri L.; Kritsonis, William Allan

    2009-01-01

    A system is a group of interacting, interrelated, and interdependent components that form a complex and unified whole. Systems thinking is a way of understanding reality that emphasizes the relationships among systems parts, rather than the parts themselves. Based on a field of study known as "system dynamics", systems thinking has a practical…

  17. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
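
    For context, the linear (Tripp-style) shape and color standardization that UNITY generalizes relates the distance modulus to the light-curve parameters roughly as (conventional notation, simplified here):

        \mu_i = m_{B,i} - M_B + \alpha\, x_{1,i} - \beta\, c_i

    with m_B the peak apparent magnitude, x_1 the light-curve shape, c the color, and M_B, \alpha, \beta global fit parameters; UNITY promotes the \alpha and \beta terms to nonlinear relations fit within the hierarchical Bayesian model.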

  18. Standard representation and unified stability analysis for dynamic artificial neural network models.

    PubMed

    Kim, Kwang-Ki K; Patrón, Ernesto Ríos; Braatz, Richard D

    2018-02-01

    An overview is provided of dynamic artificial neural network models (DANNs) for nonlinear dynamical system identification and control problems, and convex stability conditions are proposed that are less conservative than past results. The three most popular classes of dynamic artificial neural network models are described, with their mathematical representations and architectures followed by transformations based on their block diagrams that are convenient for stability and performance analyses. Classes of nonlinear dynamical systems that are universally approximated by such models are characterized, which include rigorous upper bounds on the approximation errors. A unified framework and linear matrix inequality-based stability conditions are described for different classes of dynamic artificial neural network models that take additional information into account such as local slope restrictions and whether the nonlinearities within the DANNs are odd. A theoretical example shows reduced conservatism obtained by the conditions. Copyright © 2017. Published by Elsevier Ltd.
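
    As a toy illustration of the LMI machinery involved (far simpler than the slope-restricted DANN conditions in the paper), the following checks discrete-time Lyapunov stability of a linear system with the cvxpy package; the matrix and tolerances are arbitrary:

        import numpy as np
        import cvxpy as cp

        # Toy Lyapunov LMI for x_{k+1} = A x_k: find P > 0 with A'PA - P < 0.
        A = np.array([[0.5, 0.2],
                      [0.0, 0.8]])
        n = A.shape[0]
        P = cp.Variable((n, n), symmetric=True)
        eps = 1e-6
        constraints = [P >> eps * np.eye(n),
                       A.T @ P @ A - P << -eps * np.eye(n)]
        prob = cp.Problem(cp.Minimize(0), constraints)
        prob.solve()
        print(prob.status)  # 'optimal' means the LMI is feasible: A is stable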

  19. ELSI: A unified software interface for Kohn–Sham electronic structure solvers

    DOE PAGES

    Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto; ...

    2017-09-15

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.

  20. ELSI: A unified software interface for Kohn-Sham electronic structure solvers

    NASA Astrophysics Data System (ADS)

    Yu, Victor Wen-zhe; Corsetti, Fabiano; García, Alberto; Huhn, William P.; Jacquelin, Mathias; Jia, Weile; Lange, Björn; Lin, Lin; Lu, Jianfeng; Mi, Wenhui; Seifitokaldani, Ali; Vázquez-Mayagoitia, Álvaro; Yang, Chao; Yang, Haizhao; Blum, Volker

    2018-01-01

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. Comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
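
    ELSI itself exposes a Fortran/C interface, but the shape of such a unified entry point can be sketched in Python, with SciPy routines standing in for back-ends like ELPA or PEXSI (names and dispatch logic are illustrative only):

        import numpy as np
        from scipy.linalg import eigh
        from scipy.sparse.linalg import eigsh

        def solve_kohn_sham(H, S=None, n_states=None, solver="dense"):
            """Unified entry point for the generalized eigenproblem
            H c = e S c.  Dispatches to a dense or sparse back-end; a real
            interface like ELSI would additionally handle matrix-format
            conversion and per-solver defaults."""
            if solver == "dense":
                return eigh(H, S)  # all eigenpairs via LAPACK
            elif solver == "sparse":
                return eigsh(H, k=n_states, M=S, which="SA")  # lowest states
            raise ValueError(f"unknown solver: {solver!r}")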

  1. ELSI: A unified software interface for Kohn–Sham electronic structure solvers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.

  2. Sheldon Glashow, the Electroweak Theory, and the Grand Unified Theory

    Science.gov Websites

    Glashow shared the 1979 Nobel Prize for Physics with Steven Weinberg and Abdus Salam for unifying the electromagnetic and weak interactions. The grand unified theory extends particle physics and "provides a framework for understanding how the early universe evolved and how our universe came into being," says Lawrence R. Sulak, chairman of the Boston University physics department.

  3. "UNICERT," or: Towards the Development of a Unified Language Certificate for German Universities.

    ERIC Educational Resources Information Center

    Voss, Bernd

    The standardization of second language proficiency levels for university students in Germany is discussed. Problems with the current system, in which each university has developed its own program of study and proficiency certification, are examined and a framework for development of a unified language certificate for all universities is outlined.…

  4. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    PubMed

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
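
    The Sharma-Mittal family mentioned is a two-parameter generalization that contains the familiar entropies as limits; in standard notation,

        H_{q,r}(P) = \frac{1}{1-r} \left[ \left( \sum_i p_i^{\,q} \right)^{\frac{1-r}{1-q}} - 1 \right]

    which recovers Rényi entropy as r → 1, Tsallis entropy at r = q, and Shannon entropy as both q, r → 1.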

  5. Understanding public perceptions of biotechnology through the "Integrative Worldview Framework".

    PubMed

    De Witt, Annick; Osseweijer, Patricia; Pierce, Robin

    2015-07-03

    Biotechnological innovations prompt a range of societal responses that demand understanding. Research has shown that such responses are shaped by individuals' cultural worldviews. We aim to demonstrate how the Integrative Worldview Framework (IWF) can be used for analyzing perceptions of biotechnology, by reviewing (1) research on public perceptions of biotechnology and (2) analyses of the stakeholder debate on the bio-based economy, using the IWF as an analytical lens. This framework operationalizes the concept of worldview and distinguishes between traditional, modern, and postmodern worldviews, among others. Applied to these literatures, the framework illuminates how these worldviews underlie major societal responses, thereby providing a unifying understanding of the literature on perceptions of biotechnology. We conclude that the IWF has relevance for informing research on perceptions of socio-technical change, generating insight into the paradigmatic gaps in social science, and facilitating reflexive and inclusive policy-making and debates on these timely issues. © The Author(s) 2015.

  6. A unified effective-field renormalization-group framework approach for the quenched diluted Ising models

    NASA Astrophysics Data System (ADS)

    de Albuquerque, Douglas F.; Fittipaldi, I. P.

    1994-05-01

    A unified effective-field renormalization-group framework (EFRG) for both quenched bond- and site-diluted Ising models is herein developed by extending recent works. The method, as in the previous works, follows the same strategy as the mean-field renormalization-group scheme (MFRG), and is achieved by introducing an alternative way of constructing classical effective-field equations of state, based on rigorous Ising spin identities. The concentration dependence of the critical temperature, Tc(p), and the critical concentrations of magnetic atoms, pc, at which the transition temperature goes to zero, are evaluated for several two- and three-dimensional lattice structures. The obtained values of Tc and pc and the resulting phase diagrams for both bond and site cases are much more accurate than those estimated by the standard MFRG approach. Although preserving the same level of simplicity as the MFRG, it is shown that the present EFRG method, even in its simplest size-cluster version, provides results that correctly distinguish those lattices that have the same coordination number but differ in dimensionality or geometry.

  7. A unified theoretical framework for mapping models for the multi-state Hamiltonian.

    PubMed

    Liu, Jian

    2016-11-28

    We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
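
    As one concrete instance, the Meyer-Miller mapping replaces an F-state Hamiltonian H_{nm} by a classical Hamiltonian in Cartesian variables (x_n, p_n), one pair per state; in a common convention (the zero-point parameter γ varies between formulations),

        H_{\mathrm{MM}}(\mathbf{x}, \mathbf{p})
          = \sum_{n,m=1}^{F} H_{nm}
            \left[ \tfrac{1}{2} \left( x_n x_m + p_n p_m \right) - \gamma\, \delta_{nm} \right],
        \qquad \gamma = \tfrac{1}{2}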

  8. P21 Framework Definitions

    ERIC Educational Resources Information Center

    Partnership for 21st Century Skills, 2009

    2009-01-01

    To help practitioners integrate skills into the teaching of core academic subjects, the Partnership for 21st Century Skills has developed a unified, collective vision for learning known as the Framework for 21st Century Learning. This Framework describes the skills, knowledge and expertise students must master to succeed in work and life; it is a…

  9. Toward a Unified Validation Framework in Mixed Methods Research

    ERIC Educational Resources Information Center

    Dellinger, Amy B.; Leech, Nancy L.

    2007-01-01

    The primary purpose of this article is to further discussions of validity in mixed methods research by introducing a validation framework to guide thinking about validity in this area. To justify the use of this framework, the authors discuss traditional terminology and validity criteria for quantitative and qualitative research, as well as…

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Yinan; Shi Handuo; Xiong Zhaoxi

    We present a unified universal quantum cloning machine, which combines several different existing universal cloning machines, including the asymmetric case. In this unified framework, the identical pure states are projected equally into each copy initially constituted by the input and one half of the maximally entangled states. We show explicitly that the output states of those universal cloning machines are the same. One important feature of this unified cloning machine is that the cloning process is always a symmetric projection, which dramatically reduces the difficulty of implementation. Also, it is found that this unified cloning machine can be directly modified to the general asymmetric case. Besides the global fidelity and the single-copy fidelity, we also present all possible arbitrary-copy fidelities.

  11. KMgene: a unified R package for gene-based association analysis for complex traits.

    PubMed

    Yan, Qi; Fang, Zhou; Chen, Wei; Stegle, Oliver

    2018-02-09

    In this report, we introduce an R package KMgene for performing gene-based association tests for familial, multivariate or longitudinal traits using kernel machine (KM) regression under a generalized linear mixed model (GLMM) framework. Extensive simulations were performed to evaluate the validity of the approaches implemented in KMgene. http://cran.r-project.org/web/packages/KMgene. qi.yan@chp.edu or wei.chen@chp.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  12. Emotion and the prefrontal cortex: An integrative review.

    PubMed

    Dixon, Matthew L; Thiruchselvam, Ravi; Todd, Rebecca; Christoff, Kalina

    2017-10-01

    The prefrontal cortex (PFC) plays a critical role in the generation and regulation of emotion. However, we lack an integrative framework for understanding how different emotion-related functions are organized across the entire expanse of the PFC, as prior reviews have generally focused on specific emotional processes (e.g., decision making) or specific anatomical regions (e.g., orbitofrontal cortex). Additionally, psychological theories and neuroscientific investigations have proceeded largely independently because of the lack of a common framework. Here, we provide a comprehensive review of functional neuroimaging, electrophysiological, lesion, and structural connectivity studies on the emotion-related functions of 8 subregions spanning the entire PFC. We introduce the appraisal-by-content model, which provides a new framework for integrating the diverse range of empirical findings. Within this framework, appraisal serves as a unifying principle for understanding the PFC's role in emotion, while relative content-specialization serves as a differentiating principle for understanding the role of each subregion. A synthesis of data from affective, social, and cognitive neuroscience studies suggests that different PFC subregions are preferentially involved in assigning value to specific types of inputs: exteroceptive sensations, episodic memories and imagined future events, viscero-sensory signals, viscero-motor signals, actions, others' mental states (e.g., intentions), self-related information, and ongoing emotions. We discuss the implications of this integrative framework for understanding emotion regulation, value-based decision making, emotional salience, and refining theoretical models of emotion. This framework provides a unified understanding of how emotional processes are organized across PFC subregions and generates new hypotheses about the mechanisms underlying adaptive and maladaptive emotional functioning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework for single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, and provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging yields improved noise robustness and a decrease in acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
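
    In frame-theoretic terms, each single-pixel measurement is an inner product of the scene with one illumination pattern, and linear reconstruction applies the dual frame (notation illustrative):

        y_j = \langle x, \varphi_j \rangle, \quad j = 1, \dots, m,
        \qquad
        \hat{x} = \sum_{j=1}^{m} y_j \, \tilde{\varphi}_j

    where \{\varphi_j\} is a frame for the image space and \{\tilde{\varphi}_j\} its canonical dual; the frame bounds then control noise amplification in the reconstruction.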

  14. Scalable large format 3D displays

    NASA Astrophysics Data System (ADS)

    Chang, Nelson L.; Damera-Venkata, Niranjan

    2010-02-01

    We present a general framework for the modeling and optimization of scalable large format 3-D displays using multiple projectors. Based on this framework, we derive algorithms that can robustly optimize the visual quality of an arbitrary combination of projectors (e.g. tiled, superimposed, combinations of the two) without manual adjustment. The framework creates for the first time a new unified paradigm that is agnostic to a particular configuration of projectors yet robustly optimizes for the brightness, contrast, and resolution of that configuration. In addition, we demonstrate that our algorithms support high resolution stereoscopic video at real-time interactive frame rates achieved on commodity graphics hardware. Through complementary polarization, the framework creates high quality multi-projector 3-D displays at low hardware and operational cost for a variety of applications including digital cinema, visualization, and command-and-control walls.

  15. Gain-Sparsity and Symmetry-Forced Rigidity in the Plane.

    PubMed

    Jordán, Tibor; Kaszanitzky, Viktória E; Tanigawa, Shin-Ichi

    We consider planar bar-and-joint frameworks with discrete point group symmetry in which the joint positions are as generic as possible subject to the symmetry constraint. We provide combinatorial characterizations for symmetry-forced rigidity of such structures with rotation symmetry or dihedral symmetry of order 2k with odd k, unifying and extending previous work on this subject. We also explore the matroidal background of our results and show that the matroids induced by the row independence of the orbit matrices of the symmetric frameworks are isomorphic to gain-sparsity matroids defined on the quotient graph of the framework, whose edges are labeled by elements of the corresponding symmetry group. The proofs are based on new Henneberg-type inductive constructions of the gain graphs that correspond to the bases of the matroids in question, which can also be seen as symmetry-preserving graph operations in the original graph.

  16. A Framework for Modeling and Simulation of the Artificial

    DTIC Science & Technology

    2012-01-01

    y or n) >> y Name: petra Simple Aspects: face_shape/thin, nose/small, skintone/light, hair_color/black, hair_type/curly Integrated Aspects... Multiconference. Orlando, FL (2012) 23. Mittal, S., Risco-Martin, J.: Netcentric System of Systems Engineering with DEVS Unified Process. CRC Press (2012) 24. Mittal, S., Risco-Martin, J., Zeigler, B.: DEVS-based simulation web services for net-centric T&E. In: Proceedings of the 2007 summer computer

  17. High-Maneuverability Airframe: Initial Investigation of Configuration’s Aft End for Increased Stability, Range, and Maneuverability

    DTIC Science & Technology

    2013-09-01

    including the interaction effects between the fins and canards. 2. Solution Technique 2.1 Computational Aerodynamics The double-precision solver of a... and overset grids (unified-grid). • Total variation diminishing discretization based on a new multidimensional interpolation framework. • Riemann... solvers to provide proper signal propagation physics, including versions for preconditioned forms of the governing equations. • Consistent and

  18. Collusion-resistant multimedia fingerprinting: a unified framework

    NASA Astrophysics Data System (ADS)

    Wu, Min; Trappe, Wade; Wang, Z. Jane; Liu, K. J. Ray

    2004-06-01

    Digital fingerprints are unique labels inserted in different copies of the same content before distribution. Each digital fingerprint is assigned to an intended recipient, and can be used to trace the culprits who use their content for unintended purposes. Attacks mounted by multiple users, known as collusion attacks, provide a cost-effective method for attenuating the identifying fingerprint of each colluder; collusion thus poses a real challenge to protecting digital media data and enforcing usage policies. This paper examines a few major design methodologies for collusion-resistant fingerprinting of multimedia, and presents a unified framework that helps highlight the common issues and the uniqueness of different fingerprinting techniques.
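
    A toy simulation makes the collusion effect concrete: averaging K fingerprinted copies attenuates each colluder's spread-spectrum fingerprint by a factor of K, which a correlation detector must overcome. All parameters below are illustrative, not taken from the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        N, n_users, K = 10_000, 100, 5           # signal length, users, colluders
        host = rng.standard_normal(N)            # host signal (e.g., image features)
        fps = rng.standard_normal((n_users, N))  # one Gaussian fingerprint per user

        copies = host + 0.1 * fps                # each user's fingerprinted copy
        colluders = rng.choice(n_users, K, replace=False)
        attacked = copies[colluders].mean(axis=0)   # averaging collusion

        # Correlation detector: each colluder's trace is attenuated by 1/K
        # but still stands out against the noise floor for moderate K.
        scores = fps @ (attacked - host) / N
        accused = np.argsort(scores)[-K:]
        print(sorted(colluders), sorted(accused))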

  19. Exploring the complementarity of THz pulse imaging and DCE-MRIs: Toward a unified multi-channel classification and a deep learning framework.

    PubMed

    Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S

    2016-12-01

    We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data-fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly, taking into consideration advances in multi-resolution analysis and model-based fractional-order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided, and the importance of preserving textural information is highlighted. Feature extraction and classification methods are presented, taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. A unifying kinetic framework for modeling oxidoreductase-catalyzed reactions.

    PubMed

    Chang, Ivan; Baldi, Pierre

    2013-05-15

    Oxidoreductases are a fundamental class of enzymes responsible for the catalysis of oxidation-reduction reactions, crucial in most bioenergetic metabolic pathways. From their common root in the ancient prebiotic environment, oxidoreductases have evolved into diverse and elaborate protein structures with specific kinetic properties and mechanisms adapted to their individual functional roles and environmental conditions. While accurate kinetic modeling of oxidoreductases is thus important, current models suffer from limitations to the steady-state domain, lack empirical validation or are too specialized to a single system or set of conditions. To address these limitations, we introduce a novel unifying modeling framework for kinetic descriptions of oxidoreductases. The framework is based on a set of seven elementary reactions that (i) form the basis for 69 pairs of enzyme state transitions for encoding various specific microscopic intra-enzyme reaction networks (micro-models), and (ii) lead to various specific macroscopic steady-state kinetic equations (macro-models) via thermodynamic assumptions. Thus, a synergistic bridge between the micro and macro kinetics can be achieved, enabling us to extract unitary rate constants, simulate reaction variance and validate the micro-models using steady-state empirical data. To help facilitate the application of this framework, we make available RedoxMech: a Mathematica™ software package that automates the generation and customization of micro-models. The Mathematica™ source code for RedoxMech, the documentation and the experimental datasets are all available from: http://www.igb.uci.edu/tools/sb/metabolic-modeling. pfbaldi@ics.uci.edu Supplementary data are available at Bioinformatics online.

  1. A unified computational model of the development of object unity, object permanence, and occluded object trajectory perception.

    PubMed

    Franz, A; Triesch, J

    2010-12-01

    The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research a unified model of the development of these abilities is still missing. Here we make an attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops, the role of the width of the occluders, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories. Copyright © 2010 Elsevier Inc. All rights reserved.

  2. Reduction from cost-sensitive ordinal ranking to weighted binary classification.

    PubMed

    Lin, Hsuan-Tien; Li, Ling

    2012-05-01

    We present a reduction framework from ordinal ranking to binary classification. The framework consists of three steps: extracting extended examples from the original examples, learning a binary classifier on the extended examples with any binary classification algorithm, and constructing a ranker from the binary classifier. Based on the framework, we show that a weighted 0/1 loss of the binary classifier upper-bounds the mislabeling cost of the ranker, both error-wise and regret-wise. Our framework allows not only the design of good ordinal ranking algorithms based on well-tuned binary classification approaches, but also the derivation of new generalization bounds for ordinal ranking from known bounds for binary classification. In addition, our framework unifies many existing ordinal ranking algorithms, such as perceptron ranking and support vector ordinal regression. When compared empirically on benchmark data sets, some of our newly designed algorithms enjoy advantages in terms of both training speed and generalization performance over existing algorithms. In addition, the newly designed algorithms lead to better cost-sensitive ordinal ranking performance, as well as improved listwise ranking performance.
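
    A minimal sketch of the three steps follows, with uniform costs for brevity and scikit-learn's logistic regression standing in for "any binary classification algorithm" (names and encodings are illustrative):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def fit_ordinal(X, y, K):
            """Reduce K-rank ordinal ranking to binary classification:
            one extended example (x, k) per threshold k, labelled [y > k]."""
            n = X.shape[0]
            Xe, ye = [], []
            for k in range(K - 1):
                thr = np.zeros(K - 1)
                thr[k] = 1.0                    # encode the threshold id
                Xe.append(np.hstack([X, np.tile(thr, (n, 1))]))
                ye.append((y > k).astype(int))
            return LogisticRegression(max_iter=1000).fit(np.vstack(Xe),
                                                         np.hstack(ye))

        def predict_rank(clf, X, K):
            """Rank = number of thresholds the binary classifier says
            are exceeded."""
            n = X.shape[0]
            votes = np.zeros(n, dtype=int)
            for k in range(K - 1):
                thr = np.zeros(K - 1)
                thr[k] = 1.0
                votes += clf.predict(np.hstack([X, np.tile(thr, (n, 1))]))
            return votes                        # ranks in 0..K-1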

  3. Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.

    PubMed

    Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan

    2017-09-01

    In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.

  4. The Pursuit of a "Better" Explanation as an Organizing Framework for Science Teaching and Learning

    ERIC Educational Resources Information Center

    Papadouris, Nicos; Vokos, Stamatis; Constantinou, Constantinos P.

    2018-01-01

    This article seeks to make the case for the pursuit of a "better" explanation being a productive organizing framework for science teaching and learning. Underlying this position is the idea that this framework allows promoting, in a unified manner, facility with the scientific practice of constructing explanations, appreciation of its…

  5. Music video shot segmentation using independent component analysis and keyframe extraction based on image complexity

    NASA Astrophysics Data System (ADS)

    Li, Wei; Chen, Ting; Zhang, Wenjun; Shi, Yunyu; Li, Jun

    2012-04-01

    In recent years, music video data has been increasing at an astonishing speed. Shot segmentation and keyframe extraction constitute a fundamental unit in organizing, indexing and retrieving video content. In this paper a unified framework is proposed to detect shot boundaries and extract the keyframe of a shot. A music video is first segmented into shots using an illumination-invariant chromaticity histogram in an independent component (IC) analysis feature space. Then we present a new metric, image complexity, computed from the ICs, to extract the keyframe of a shot. Experimental results show that the framework is effective and has good performance.
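
    The shot-boundary step can be sketched with a plain histogram-intersection test; the paper computes its histograms in an ICA-derived, illumination-invariant chromaticity space, for which a single normalized channel stands in here (threshold and bin count are placeholders):

        import numpy as np

        def shot_boundaries(frames, n_bins=32, thresh=0.4):
            """frames: iterable of 2-D arrays with values in [0, 1] (one
            channel of the chosen feature space).  Flags a boundary where
            the histogram intersection with the previous frame drops
            below `thresh`."""
            bounds, prev = [], None
            for t, f in enumerate(frames):
                h, _ = np.histogram(f, bins=n_bins, range=(0.0, 1.0))
                h = h / h.sum()
                if prev is not None and np.minimum(h, prev).sum() < thresh:
                    bounds.append(t)
                prev = h
            return bounds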

  6. Space-Time Processing for Tactical Mobile Ad Hoc Networks

    DTIC Science & Technology

    2008-08-01

    vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet... modelling framework of capacity-delay tradeoffs: we have introduced the first unified modelling framework for the computation of fundamental limits of... modalities in wireless networks... multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT under

  7. Unified formalism for higher order non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-03-01

    This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.

  8. Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework

    NASA Astrophysics Data System (ADS)

    Hermawan; Hastarista, Fika

    2016-01-01

    The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to combine ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artefacts drawn from the various UML models and from Software Requirement Specification (SRS) documents. The modeling is applied to the development of an Enterprise Resource Planning (ERP) system. Because an ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is used as an advanced method to transform the MDA into application-module components efficiently and accurately. Finally, using the CAF yields good fulfilment of the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP), as well as high satisfaction with the functionality of the ERP software at PT. Iglas (Persero) Gresik.

  9. Family Systems Theory: A Unifying Framework for Codependence.

    ERIC Educational Resources Information Center

    Prest, Layne A.; Protinsky, Howard

    1993-01-01

    Considers addictions and construct of codependence. Offers critical review and synthesis of codependency literature, along with an intergenerational family systems framework for conceptualizing the relationship of the dysfunctional family to the construct of codependence. Presents theoretical basis for systemic clinical work and research in this…

  10. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.

  11. Communication between Brain Areas Based on Nested Oscillations

    PubMed Central

    Kastner, Sabine

    2017-01-01

    Unraveling how brain regions communicate is crucial for understanding how the brain processes external and internal information. Neuronal oscillations within and across brain regions have been proposed to play a crucial role in this process. Two main hypotheses have been suggested for routing of information based on oscillations, namely communication through coherence and gating by inhibition. Here, we propose a framework unifying these two hypotheses that is based on recent empirical findings. We discuss a theory in which communication between two regions is established by phase synchronization of oscillations at lower frequencies (<25 Hz), which serve as temporal reference frame for information carried by high-frequency activity (>40 Hz). Our framework, consistent with numerous recent empirical findings, posits that cross-frequency interactions are essential for understanding how large-scale cognitive and perceptual networks operate. PMID:28374013

  12. Concentration-driven models revisited: towards a unified framework to model settling tanks in water resource recovery facilities.

    PubMed

    Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar

    2017-02-01

    A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for a more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
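
    A representative form of the resulting PDEs, written here for illustration for the concentration C_k(z, t) of one particle class k over depth z, combines a hindered-settling flux with a compression term (v_k and D are constitutive functions of the total concentration C; the paper's exact formulation may differ):

        \frac{\partial C_k}{\partial t}
        + \frac{\partial}{\partial z} \bigl( v_k(C)\, C_k \bigr)
        = \frac{\partial}{\partial z} \left( D(C)\, \frac{\partial C_k}{\partial z} \right),
        \qquad C = \sum_k C_k

    In the dilute limit, D vanishes and v_k approaches a class-specific discrete settling velocity, while at high concentrations the D term reproduces compression settling.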

  13. Picking Deep Filter Responses for Fine-Grained Image Recognition (Open Access Author’s Manuscript)

    DTIC Science & Technology

    2016-12-16

    stages. Our method explores a unified framework based on two steps of deep filter response picking. The first picking step is to find distinctive... filters which respond to specific patterns significantly and consistently, and learn a set of part detectors via iteratively alternating between new... positive sample mining and part-model retraining. The second picking step is to pool deep filter responses via spatially weighted combination of Fisher

  14. Comparative Effectiveness of TI-84 Graphing Calculators on Algebra I and Geometry Outcomes: A Report of Randomized Experiments in the East Side Union High School District and San Diego Unified School District. Research Report

    ERIC Educational Resources Information Center

    Miller, Gloria I.; Jaciw, Andrew; Hoshiko, Brandon; Wei, Xin

    2007-01-01

    Texas Instruments has undertaken a research program with the goal of producing scientifically-based evidence of the effectiveness of graphing calculators and the "TI-Navigator"[TM] classroom networking system in the context of a professional development and curriculum framework. The program includes a two-year longitudinal study. The…

  15. Integrated pathway-based transcription regulation network mining and visualization based on gene expression profiles.

    PubMed

    Kibinge, Nelson; Ono, Naoaki; Horie, Masafumi; Sato, Tetsuo; Sugiura, Tadao; Altaf-Ul-Amin, Md; Saito, Akira; Kanaya, Shigehiko

    2016-06-01

    Conventionally, workflows examining transcription regulation networks from gene expression data involve distinct analytical steps. There is a need for pipelines that unify data mining and inference deduction into a single framework to enhance interpretation and hypothesis generation. We propose a workflow that merges network construction with gene expression data mining, focusing on regulation processes in the context of transcription-factor-driven gene regulation. The pipeline implements pathway-based modularization of expression profiles into functional units to improve biological interpretation. The integrated workflow was implemented as a web application (TransReguloNet) with functions that enable pathway visualization and comparison of transcription factor activity between the sample conditions defined in the experimental design. The pipeline merges differential expression, network construction, pathway-based abstraction, clustering and visualization. The framework was applied to the analysis of expression datasets related to lung, breast and prostate cancer. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia

    PubMed Central

    2012-01-01

    Background The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing a unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data, and its automatic processing. Results The following related ontologies have been developed for OpenTox: a) Toxicological ontology – listing the toxicological endpoints; b) Organs system and Effects ontology – addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology – representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology – representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink–ToxCast assays ontology and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines, and provide a flexible framework which allows building an arbitrary number of applications tailored to solving different problems by end users (e.g. toxicologists). Availability The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl, the ToxML - OWL conversion utility is an open source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/ PMID:22541598

  17. OpenTox predictive toxicology framework: toxicological ontology and semantic media wiki-based OpenToxipedia.

    PubMed

    Tcheremenskaia, Olga; Benigni, Romualdo; Nikolova, Ivelina; Jeliazkova, Nina; Escher, Sylvia E; Batke, Monika; Baier, Thomas; Poroikov, Vladimir; Lagunin, Alexey; Rautenberg, Micha; Hardy, Barry

    2012-04-24

    The OpenTox Framework, developed by the partners in the OpenTox project (http://www.opentox.org), aims at providing unified access to toxicity data, predictive models and validation procedures. Interoperability of resources is achieved using a common information model, based on the OpenTox ontologies, describing predictive algorithms, models and toxicity data. As toxicological data may come from different, heterogeneous sources, a deployed ontology, unifying the terminology and the resources, is critical for the rational and reliable organization of the data, and its automatic processing. The following related ontologies have been developed for OpenTox: a) Toxicological ontology - listing the toxicological endpoints; b) Organs system and Effects ontology - addressing organs, targets/examinations and effects observed in in vivo studies; c) ToxML ontology - representing semi-automatic conversion of the ToxML schema; d) OpenTox ontology - representation of OpenTox framework components: chemical compounds, datasets, types of algorithms, models and validation web services; e) ToxLink-ToxCast assays ontology and f) OpenToxipedia community knowledge resource on toxicology terminology. OpenTox components are made available through standardized REST web services, where every compound, data set, and predictive method has a unique resolvable address (URI), used to retrieve its Resource Description Framework (RDF) representation, or to initiate the associated calculations and generate new RDF-based resources. The services support the integration of toxicity and chemical data from various sources, the generation and validation of computer models for toxic effects, seamless integration of new algorithms and scientifically sound validation routines and provide a flexible framework, which allows building an arbitrary number of applications, tailored to solving different problems for end users (e.g. toxicologists). The OpenTox toxicological ontology projects may be accessed via the OpenTox ontology development page http://www.opentox.org/dev/ontology; the OpenTox ontology is available as OWL at http://opentox.org/api/1.1/opentox.owl, and the ToxML-OWL conversion utility is an open-source resource available at http://ambit.svn.sourceforge.net/viewvc/ambit/branches/toxml-utils/
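
    The resolvable-URI/RDF pattern described in this record can be exercised with any HTTP client via content negotiation; a minimal Python sketch follows (the URI is a placeholder, not a live OpenTox address):

      import requests

      def fetch_rdf(resource_uri: str) -> str:
          """GET the RDF/XML representation of an OpenTox-style resource URI."""
          resp = requests.get(resource_uri,
                              headers={"Accept": "application/rdf+xml"})
          resp.raise_for_status()
          return resp.text

      # Hypothetical compound URI following the scheme described above:
      # rdf_doc = fetch_rdf("https://example.org/opentox/compound/42")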

  18. A Unified Model of Geostrophic Adjustment and Frontogenesis

    NASA Astrophysics Data System (ADS)

    Taylor, John; Shakespeare, Callum

    2013-11-01

    Fronts, or regions with strong horizontal density gradients, are ubiquitous and dynamically important features of the ocean and atmosphere. In the ocean, fronts are associated with enhanced air-sea fluxes, turbulence, and biological productivity, while atmospheric fronts are associated with some of the most extreme weather events. Here, we describe a new mathematical framework for the formation of fronts, or frontogenesis. This framework unifies two classical problems in geophysical fluid dynamics, geostrophic adjustment and strain-driven frontogenesis, and provides a number of important extensions beyond previous efforts. The model solutions closely match numerical simulations during the early stages of frontogenesis and provide a means to describe the development of turbulence at mature fronts.

  19. Integrating diverse databases into a unified analysis framework: a Galaxy approach

    PubMed Central

    Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton

    2011-01-01

    Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past a relatively small number of central repositories served genomic data, an increasing number of distinct, specialized data repositories and resources have been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983

  20. Putting the School Interoperability Framework to the Test

    ERIC Educational Resources Information Center

    Mercurius, Neil; Burton, Glenn; Hopkins, Bill; Larsen, Hans

    2004-01-01

    The Jurupa Unified School District in Southern California recently partnered with Microsoft, Dell and the Zone Integration Group for the implementation of a School Interoperability Framework (SIF) database repository model throughout the district (Magner 2002). A two-week project--the Integrated District Education Applications System, better known…

  1. Generalized Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew

    2004-01-01

    A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…

  2. A Clustering-Based Approach to Enriching Code Foraging Environment.

    PubMed

    Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu

    2016-09-01

    Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation toward best shaping the code base for developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.

  3. Traits Without Borders: Integrating Functional Diversity Across Scales.

    PubMed

    Carmona, Carlos P; de Bello, Francesco; Mason, Norman W H; Lepš, Jan

    2016-05-01

    Owing to the conceptual complexity of functional diversity (FD), a multitude of different methods are available for measuring it, with most being operational at only a small range of spatial scales. This causes uncertainty in ecological interpretations and limits the potential to generalize findings across studies or compare patterns across scales. We solve this problem by providing a unified framework expanding on and integrating existing approaches. The framework, based on trait probability density (TPD), is the first to fully implement the Hutchinsonian concept of the niche as a probabilistic hypervolume in estimating FD. This novel approach could revolutionize FD-based research by allowing quantification of the various FD components from organismal to macroecological scales, and allowing seamless transitions between scales. Copyright © 2016 Elsevier Ltd. All rights reserved.
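
    As an illustration of estimating a trait probability density, a Gaussian kernel density estimate over a two-dimensional trait space can stand in for the TPD (a simplifying assumption for this sketch; the framework itself is more general):

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(1)
      # 200 individuals measured on two traits (e.g., height, leaf area)
      traits = rng.normal(loc=[2.0, 0.5], scale=[0.3, 0.1], size=(200, 2))

      tpd = gaussian_kde(traits.T)  # density over the trait hypervolume

      # Functional richness can be summarized as the area of trait space
      # holding density above a cutoff (here 5% of the maximum).
      grid = np.mgrid[1.0:3.0:50j, 0.0:1.0:50j].reshape(2, -1)
      density = tpd(grid)
      cell_area = (2.0 / 49) * (1.0 / 49)
      richness = np.count_nonzero(density > 0.05 * density.max()) * cell_area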

  4. Synergy of the Developed 6D BIM Framework and Conception of the nD BIM Framework and nD BIM Process Ontology

    ERIC Educational Resources Information Center

    O'Keeffe, Shawn Edward

    2013-01-01

    The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…

  5. Failure to Visualize and Describe Operations: The Evolution and Implementation of the Operational Framework

    DTIC Science & Technology

    2017-05-25

    Operations, and Unified Land Operations) and the US Army’s leader development model identifies how the education, training, and experience of field-grade...officers have failed in their incorporation of the framework because they lack the education, training, and experience for the use of the framework...education, training, and experience of field-grade officers at the division level have influenced their use of the operational framework. The cause for

  6. Parametric models to relate spike train and LFP dynamics with neural information processing.

    PubMed

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set in which significant correlations had previously been obtained only through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.

  7. Toward a unified approach to dose-response modeling in ecotoxicology.

    PubMed

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
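
    For concreteness, the four-parameter log-logistic model named in this record can be written as a short function (parameterization follows the common convention, with b the slope, c and d the lower and upper limits, and e the ED50):

      import numpy as np

      def log_logistic_4p(dose, b, c, d, e):
          """f(x) = c + (d - c) / (1 + exp(b * (log x - log e)))"""
          dose = np.asarray(dose, dtype=float)
          return c + (d - c) / (1.0 + np.exp(b * (np.log(dose) - np.log(e))))

      # Response falling from d=1.0 to c=0.1 with ED50 e=5.0 and slope b=2.0
      doses = np.array([0.1, 1.0, 5.0, 10.0, 100.0])
      response = log_logistic_4p(doses, b=2.0, c=0.1, d=1.0, e=5.0)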

  8. Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Ocampo, Cesar; Senent, Juan S.; Williams, Jacob

    2010-01-01

    The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, are discussed.

  9. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    PubMed

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. In the first level of the proposed hierarchical decision tree algorithm, fall detection is implemented using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
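
    A minimal sketch of the two-stage decision structure described here, with scikit-learn SVMs and random placeholder features standing in for the cumulants of the ACC signal (labels, feature dimensions, and data are illustrative, not the authors' pipeline):

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      X = rng.normal(size=(300, 8))             # placeholder cumulant features
      is_fall = rng.integers(0, 2, size=300)    # level 1 labels: fall vs non-fall
      fall_type = rng.integers(0, 3, size=300)  # level 2 labels: type of fall

      level1 = SVC(kernel="rbf").fit(X, is_fall)
      level2 = SVC(kernel="rbf").fit(X[is_fall == 1], fall_type[is_fall == 1])

      def classify(x):
          """Two-stage decision: detect a fall first, then classify its type."""
          x = x.reshape(1, -1)
          if level1.predict(x)[0] == 0:
              return "non-fall activity"
          return f"fall type {level2.predict(x)[0]}"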

  10. A general unified framework to assess the sampling variance of heritability estimates using pedigree or marker-based relationships.

    PubMed

    Visscher, Peter M; Goddard, Michael E

    2015-01-01

    Heritability is a population parameter of importance in evolution, plant and animal breeding, and human medical genetics. It can be estimated using pedigree designs and, more recently, using relationships estimated from markers. We derive the sampling variance of the estimate of heritability for a wide range of experimental designs, assuming that estimation is by maximum likelihood and that the resemblance between relatives is solely due to additive genetic variation. We show that well-known results for balanced designs are special cases of a more general unified framework. For pedigree designs, the sampling variance is inversely proportional to the variance of relationship in the pedigree and it is proportional to 1/N, whereas for population samples it is approximately proportional to 1/N^2, where N is the sample size. Variation in relatedness is a key parameter in the quantification of the sampling variance of heritability. Consequently, the sampling variance is high for populations with large recent effective population size (e.g., humans) because this causes low variation in relationship. However, even using human population samples, low sampling variance is possible with high N. Copyright © 2015 by the Genetics Society of America.
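
    The population-sample scaling stated here implies a simple back-of-envelope rule. Assuming var(h2_hat) ~ 2 / (N^2 * var(relatedness)), with var(relatedness) ~ 2e-5 for nominally unrelated individuals (constants follow the commonly cited approximation and should be treated as illustrative, not as quoted from this paper), the standard error falls as roughly 316/N:

      import math

      def se_h2_population(n, var_relatedness=2e-5):
          """Approximate SE of heritability estimated from N unrelated individuals
          (illustrative constants; see the caveat in the text above)."""
          return math.sqrt(2.0 / (n**2 * var_relatedness))

      for n in (1_000, 10_000, 100_000):
          print(n, round(se_h2_population(n), 4))
      # 1000 -> ~0.3162, 10000 -> ~0.0316, 100000 -> ~0.0032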

  11. A theoretical framework for antigay aggression: Review of established and hypothesized effects within the context of the general aggression model

    PubMed Central

    Parrott, Dominic J.

    2008-01-01

    Theory and research on antigay aggression have identified different motives that facilitate aggression based on sexual orientation. However, the individual and situational determinants of antigay aggression associated with these motivations have yet to be organized within a single theoretical framework. This limits researchers’ ability to organize existing knowledge, link that knowledge with related aggression theory, and guide the application of new findings. To address these limitations, this article argues for the use of an existing conceptual framework to guide thinking and generate new research in this area of study. Contemporary theories of antigay aggression, and empirical support for these theories, are reviewed and interpreted within the unifying framework of the general aggression model [Anderson, C.A. & Bushman, B.J. (2002). Human aggression. Annual Review of Psychology, 53, 27–51.]. It is concluded that this conceptual framework will facilitate investigation of individual and situational risk factors that may contribute to antigay aggression and guide the development of individual-level interventions. PMID:18355952

  12. Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research

    ERIC Educational Resources Information Center

    Fan, Xitao; Sun, Shaojing

    2014-01-01

    In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…

  13. Knowledge Discovery from Posts in Online Health Communities Using Unified Medical Language System.

    PubMed

    Chen, Donghua; Zhang, Runtong; Liu, Kecheng; Hou, Lei

    2018-06-19

    Patient-reported posts in Online Health Communities (OHCs) contain a wealth of valuable information that can help establish knowledge-based online support for patients. However, utilizing these posts to improve online patient services in the absence of appropriate medical and healthcare expert knowledge is difficult. Thus, we propose a comprehensive knowledge discovery method based on the Unified Medical Language System for the analysis of narrative posts in OHCs. First, we propose a domain-knowledge support framework for OHCs to provide a basis for post analysis. Second, we develop a Knowledge-Involved Topic Modeling (KI-TM) method to extract and expand explicit knowledge within the text. We propose four metrics, namely, explicit knowledge rate, latent knowledge rate, knowledge correlation rate, and perplexity, for the evaluation of the KI-TM method. Our experimental results indicate that our proposed method outperforms existing methods in terms of providing knowledge support. Our method enhances knowledge support for online patients and can help develop intelligent OHCs in the future.

  14. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions, called VICx, PDMx, and TOPMODELx, are also extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
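
    The classical SCS-CN runoff curve that these event-based models generalize is compact enough to state directly; a short implementation, using the conventional initial abstraction Ia = 0.2*S:

      def scs_cn_runoff(p_mm: float, cn: float) -> float:
          """Event runoff depth Q (mm) from storm rainfall P (mm): SCS-CN method."""
          s = 25400.0 / cn - 254.0   # potential retention S (mm) from curve number
          ia = 0.2 * s               # initial abstraction (conventional choice)
          if p_mm <= ia:
              return 0.0             # "prethreshold": all rainfall is abstracted
          return (p_mm - ia) ** 2 / (p_mm - ia + s)

      # Example: an 80 mm storm on a watershed with CN = 75 yields ~27 mm runoff
      q = scs_cn_runoff(80.0, 75.0)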

  15. An integrative model of auditory phantom perception: tinnitus as a unified percept of interacting separable subnetworks.

    PubMed

    De Ridder, Dirk; Vanneste, Sven; Weisz, Nathan; Londero, Alain; Schlee, Winnie; Elgoyhen, Ana Belen; Langguth, Berthold

    2014-07-01

    Tinnitus is considered to be an auditory phantom phenomenon, a persistent conscious percept of a salient memory trace, externally attributed, in the absence of a sound source. It is perceived as a phenomenological unified coherent percept, binding multiple separable clinical characteristics, such as its loudness, sidedness, type (pure tone, noise), associated distress and so on. A theoretical pathophysiological framework capable of explaining all these aspects in one model is highly needed. The model must incorporate both the deafferentation-based neurophysiological models and the dysfunctional noise canceling model, and propose a 'tinnitus core' subnetwork. The tinnitus core can be defined as the minimal set of brain areas that needs to be jointly activated (=subnetwork) for tinnitus to be consciously perceived, devoid of its affective components. The brain areas involved in the other separable characteristics of tinnitus can be retrieved by studies on spontaneous resting state magnetic and electrical activity in people with tinnitus, evaluated for the specific aspect investigated and controlled for other factors. By combining these functional imaging studies with neuromodulation techniques, some of the correlations are turned into causal relationships. From this, a heuristic pathophysiological framework is constructed, integrating the tinnitus perceptual core with the other tinnitus-related aspects. This phenomenological unified percept of tinnitus can be considered an emergent property of multiple, parallel, dynamically changing and partially overlapping subnetworks, each with a specific spontaneous oscillatory pattern and functional connectivity signature. Communication between these different subnetworks is proposed to occur at hubs, brain areas that are involved in multiple subnetworks simultaneously. These hubs can take part in each separable subnetwork at different frequencies. Communication between the subnetworks is proposed to occur at discrete oscillatory frequencies. As such, the brain uses multiple nonspecific networks in parallel, each with their own oscillatory signature, that adapt to the context to construct a unified percept possibly by synchronized activation integrated at hubs at discrete oscillatory frequencies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Physically Based Modeling and Simulation with Dynamic Spherical Volumetric Simplex Splines

    PubMed Central

    Tan, Yunhao; Hua, Jing; Qin, Hong

    2009-01-01

    In this paper, we present a novel computational modeling and simulation framework based on dynamic spherical volumetric simplex splines. The framework can handle the modeling and simulation of genus-zero objects with real physical properties. In this framework, we first develop an accurate and efficient algorithm to reconstruct the high-fidelity digital model of a real-world object with spherical volumetric simplex splines, which can accurately represent geometric, material, and other properties of the object simultaneously. With the tight coupling of Lagrangian mechanics, the dynamic volumetric simplex splines representing the object can accurately simulate its physical behavior because they unify the geometric and material properties in the simulation. The visualization can be directly computed from the object’s geometric or physical representation based on the dynamic spherical volumetric simplex splines during simulation without interpolation or resampling. We have applied the framework to the biomechanical simulation of brain deformations, such as brain shift during surgery and brain injury under blunt impact. We have compared our simulation results with the ground truth obtained through intra-operative magnetic resonance imaging and real biomechanical experiments. The evaluations demonstrate the excellent performance of our new technique. PMID:20161636

  17. Classifying clinical decision making: a unifying approach.

    PubMed

    Buckingham, C D; Adams, A

    2000-10-01

    This is the first of two linked papers exploring decision making in nursing which integrate research evidence from different clinical and academic disciplines. Currently there are many decision-making theories, each with their own distinctive concepts and terminology, and there is a tendency for separate disciplines to view their own decision-making processes as unique. Identifying good nursing decisions and where improvements can be made is therefore problematic, and this can undermine clinical and organizational effectiveness, as well as nurses' professional status. Within the unifying framework of psychological classification, the overall aim of the two papers is to clarify and compare terms, concepts and processes identified in a diversity of decision-making theories, and to demonstrate their underlying similarities. It is argued that the range of explanations used across disciplines can usefully be re-conceptualized as classification behaviour. This paper explores problems arising from multiple theories of decision making being applied to separate clinical disciplines. Attention is given to detrimental effects on nursing practice within the context of multidisciplinary health-care organizations and the changing role of nurses. The different theories are outlined and difficulties in applying them to nursing decisions highlighted. An alternative approach based on a general model of classification is then presented in detail to introduce its terminology and the unifying framework for interpreting all types of decisions. The classification model is used to provide the context for relating alternative philosophical approaches and to define decision-making activities common to all clinical domains. This may benefit nurses by improving multidisciplinary collaboration and weakening clinical elitism.

  18. Towards a Grand Unified Theory of sports performance.

    PubMed

    Glazier, Paul S

    2017-12-01

    Sports performance is generally considered to be governed by a range of interacting physiological, biomechanical, and psychological variables, amongst others. Despite sports performance being multi-factorial, however, the majority of performance-oriented sports science research has predominantly been monodisciplinary in nature, presumably due, at least in part, to the lack of a unifying theoretical framework required to integrate the various subdisciplines of sports science. In this target article, I propose a Grand Unified Theory (GUT) of sports performance (and, by elaboration, sports science) based around the constraints framework introduced originally by Newell (1986). A central tenet of this GUT is that, at both the intra- and inter-individual levels of analysis, patterns of coordination and control, which directly determine the performance outcome, emerge from the confluence of interacting organismic, environmental, and task constraints via the formation and self-organisation of coordinative structures. It is suggested that this GUT could be used to: foster interdisciplinary research collaborations; break down the silos that have developed in sports science and restore greater disciplinary balance to the field; promote a more holistic understanding of sports performance across all levels of analysis; increase explanatory power of applied research work; provide stronger rationale for data collection and variable selection; and direct the development of integrated performance monitoring technologies. This GUT could also provide a scientifically rigorous basis for integrating the subdisciplines of sports science in applied sports science support programmes adopted by high-performance agencies and national governing bodies for various individual and team sports. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Evaluating Comprehensive State Tobacco Prevention and Control Programs Using an Outcome Indicator Framework.

    PubMed

    Fulmer, Erika; Rogers, Todd; Glasgow, LaShawn; Brown, Susan; Kuiper, Nicole

    2018-03-01

    The outcome indicator framework helps tobacco prevention and control programs (TCPs) plan and implement theory-driven evaluations of their efforts to reduce and prevent tobacco use. Tobacco use is the single-most preventable cause of morbidity and mortality in the United States. The implementation of public health best practices by comprehensive state TCPs has been shown to prevent the initiation of tobacco use, reduce tobacco use prevalence, and decrease tobacco-related health care expenditures. Achieving and sustaining program goals require TCPs to evaluate the effectiveness and impact of their programs. To guide evaluation efforts by TCPs, the Centers for Disease Control and Prevention's Office on Smoking and Health developed an outcome indicator framework that includes a high-level logic model and evidence-based outcome indicators for each tobacco prevention and control goal area. In this article, we describe how TCPs and other community organizations can use the outcome indicator framework in their evaluation efforts. We also discuss how the framework is used at the national level to unify tobacco prevention and control efforts across varying state contexts, identify promising practices, and expand the public health evidence base.

  20. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine the navigation system's effect on the sizing of the integrated vehicle. The development of such a framework builds upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  1. Unified Behavior Framework for Discrete Event Simulation Systems

    DTIC Science & Technology

    2015-03-26

    ...a teleo-reactive architecture [11]. Teleo-Reactive Programs (TRPs) are composed of a list of rules, where each has a condition and an action. When the…

  2. Evolutionary game theory meets social science: is there a unifying rule for human cooperation?

    PubMed

    Rosas, Alejandro

    2010-05-21

    Evolutionary game theory has shown that human cooperation thrives in different types of social interactions with a prisoner's dilemma (PD) structure. Models treat the cooperative strategies within the different frameworks as discrete entities and sometimes even as contenders. Whereas strong reciprocity was acclaimed as superior to classic reciprocity for its ability to defeat defectors in public goods games, recent experiments and simulations show that costly punishment fails to promote cooperation in the indirect reciprocity (IR) and direct reciprocity (DR) games, where classic reciprocity succeeds. My aim is to show that cooperative strategies across frameworks are capable of a unified treatment, for they are governed by a common underlying rule or norm. An analysis of the reputation and action rules that govern some representative cooperative strategies both in models and in economic experiments confirms that the different frameworks share a conditional action rule and several reputation rules. The common conditional rule contains an option between costly punishment and withholding benefits that provides alternative enforcement methods against defectors. Depending on the framework, individuals can switch to the appropriate strategy and method of enforcement. The stability of human cooperation looks more promising if one mechanism controls successful strategies across frameworks. Published by Elsevier Ltd.

  3. Disaster Metrics: A Comprehensive Framework for Disaster Evaluation Typologies.

    PubMed

    Wong, Diana F; Spencer, Caroline; Boyd, Lee; Burkle, Frederick M; Archer, Frank

    2017-10-01

    Introduction The frequency of disasters is increasing around the world with more people being at risk. There is a moral imperative to improve the way in which disaster evaluations are undertaken and reported with the aim of reducing preventable mortality and morbidity in future events. Disasters are complex events and undertaking disaster evaluations is a specialized area of study at an international level. Hypothesis/Problem While some frameworks have been developed to support consistent disaster research and evaluation, they lack validation, consistent terminology, and standards for reporting across the different phases of a disaster. There is yet to be an agreed, comprehensive framework to structure disaster evaluation typologies. The aim of this paper is to outline an evolving comprehensive framework for disaster evaluation typologies. It is anticipated that this new framework will facilitate an agreement on identifying, structuring, and relating the various evaluations found in the disaster setting with a view to better understand the process, outcomes, and impacts of the effectiveness and efficiency of interventions. Research was undertaken in two phases: (1) a scoping review of the peer-reviewed and "grey" literature to identify current evaluation frameworks and typologies used in the disaster setting; and (2) development of a structure that includes the range of typologies identified in Phase One and suggests possible relationships in the disaster setting. No core, unifying framework to structure disaster evaluation and research was identified in the literature. The authors propose a "Comprehensive Framework for Disaster Evaluation Typologies" that identifies, structures, and suggests relationships for the various typologies detected. The proposed Comprehensive Framework for Disaster Evaluation Typologies outlines the different typologies of disaster evaluations that were identified in this study and brings them together into a single framework. This unique, unifying framework has relevance at an international level and is expected to benefit the disaster, humanitarian, and development sectors. The next step is to undertake a validation process that will include international leaders with experience in evaluation, in general, and disasters specifically. This work promotes an environment for constructive dialogue on evaluations in the disaster setting to strengthen the evidence base for interventions across the disaster spectrum. It remains a work in progress. Wong DF, Spencer C, Boyd L, Burkle FM Jr., Archer F. Disaster metrics: a comprehensive framework for disaster evaluation typologies. Prehosp Disaster Med. 2017;32(5):501-514.

  4. General System Theory: Toward a Conceptual Framework for Science and Technology Education for All.

    ERIC Educational Resources Information Center

    Chen, David; Stroup, Walter

    1993-01-01

    Suggests using general system theory as a unifying theoretical framework for science and technology education for all. Five reasons are articulated: the multidisciplinary nature of systems theory, the ability to engage complexity, the capacity to describe system dynamics, the ability to represent the relationship between microlevel and…

  5. Making Learning Personally Meaningful: A New Framework for Relevance Research

    ERIC Educational Resources Information Center

    Priniski, Stacy J.; Hecht, Cameron A.; Harackiewicz, Judith M.

    2018-01-01

    Personal relevance goes by many names in the motivation literature, stemming from a number of theoretical frameworks. Currently these lines of research are being conducted in parallel with little synthesis across them, perhaps because there is no unifying definition of the relevance construct within which this research can be situated. In this…

  6. Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2008-01-01

    The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…

  7. Enabling Curriculum Change in Physical Education: The Interplay between Policy Constructors and Practitioners

    ERIC Educational Resources Information Center

    MacLean, Justine; Mulholland, Rosemary; Gray, Shirley; Horrell, Andrew

    2015-01-01

    Background: Curriculum for Excellence, a new national policy initiative in Scottish Schools, provides a unified curricular framework for children aged 3-18. Within this framework, Physical Education (PE) now forms part of a collective alongside physical activity and sport, subsumed by the newly created curriculum area of "Health and…

  8. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
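
    The standard query pattern such a framework exposes, (lon, lat, depth) -> (Vp, Vs, density), can be sketched generically; the toy 1-D model and the Gardner-style density relation below are assumptions for illustration, not the UCVM API:

      from dataclasses import dataclass

      @dataclass
      class MaterialProperties:
          vp: float       # P-wave velocity (m/s)
          vs: float       # S-wave velocity (m/s)
          density: float  # density (kg/m^3)

      def query_model(lon: float, lat: float, depth_m: float) -> MaterialProperties:
          """Toy 1-D background model (ignores lon/lat): properties grow with depth."""
          vs = 500.0 + 0.6 * depth_m
          vp = 1.73 * vs                        # fixed Vp/Vs ratio (assumed)
          rho = 1740.0 * (vp / 1000.0) ** 0.25  # Gardner-style relation (assumed)
          return MaterialProperties(vp, vs, rho)

      props = query_model(-118.24, 34.05, 1000.0)  # a point under Los Angeles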

  9. A Unified Framework for the Infection Dynamics of Zoonotic Spillover and Spread.

    PubMed

    Lo Iacono, Giovanni; Cunningham, Andrew A; Fichet-Calvet, Elisabeth; Garry, Robert F; Grant, Donald S; Leach, Melissa; Moses, Lina M; Nichols, Gordon; Schieffelin, John S; Shaffer, Jeffrey G; Webb, Colleen T; Wood, James L N

    2016-09-01

    A considerable amount of disease is transmitted from animals to humans and many of these zoonoses are neglected tropical diseases. As outbreaks of SARS, avian influenza and Ebola have demonstrated, however, zoonotic diseases are serious threats to global public health and are not just problems confined to remote regions. There are two fundamental, and poorly studied, stages of zoonotic disease emergence: 'spillover', i.e. transmission of pathogens from animals to humans, and 'stuttering transmission', i.e. when limited human-to-human infections occur, leading to self-limiting chains of transmission. We developed a transparent, theoretical framework, based on a generalization of Poisson processes with memory of past human infections, that unifies these stages. Once we have quantified pathogen dynamics in the reservoir, with some knowledge of the mechanism of contact, the approach provides a tool to estimate the likelihood of spillover events. Comparisons with independent agent-based models demonstrate the ability of the framework to correctly estimate the relative contributions of human-to-human vs animal transmission. As an illustrative example, we applied our model to Lassa fever, a rodent-borne, viral haemorrhagic disease common in West Africa, for which data on human outbreaks were available. The approach developed here is general and applicable to a range of zoonoses. This kind of methodology is of crucial importance for the scientific, medical and public health communities working at the interface between animal and human diseases to assess the risk associated with the disease and to plan intervention and appropriate control measures. The Lassa case study revealed important knowledge gaps, and opportunities, arising from limited knowledge of the temporal patterns in reporting, abundance of, and infection prevalence in, the host reservoir.
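
    The two stages described here, spillover and stuttering transmission, can be sketched as a Poisson process seeding subcritical branching chains. Rates and R0 below are illustrative, and this plain Poisson process has no memory of past infections, unlike the paper's generalization:

      import numpy as np

      rng = np.random.default_rng(3)

      def stuttering_chain_size(r0: float) -> int:
          """Total human cases from one spillover: subcritical branching (R0 < 1)."""
          total, generation = 1, 1
          while generation > 0:
              generation = rng.poisson(r0 * generation)  # offspring of this generation
              total += generation
          return total

      spillovers = rng.poisson(lam=12.0)          # spillover events in one year
      chain_sizes = [stuttering_chain_size(0.6) for _ in range(spillovers)]
      secondary_cases = sum(chain_sizes) - spillovers  # human-to-human infections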

  10. A Unified Framework for the Infection Dynamics of Zoonotic Spillover and Spread

    PubMed Central

    Cunningham, Andrew A.; Fichet-Calvet, Elisabeth; Garry, Robert F.; Grant, Donald S.; Leach, Melissa; Moses, Lina M.; Nichols, Gordon; Schieffelin, John S.; Shaffer, Jeffrey G.; Webb, Colleen T.; Wood, James L. N.

    2016-01-01

    A considerable amount of disease is transmitted from animals to humans and many of these zoonoses are neglected tropical diseases. As outbreaks of SARS, avian influenza and Ebola have demonstrated, however, zoonotic diseases are serious threats to global public health and are not just problems confined to remote regions. There are two fundamental, and poorly studied, stages of zoonotic disease emergence: ‘spillover’, i.e. transmission of pathogens from animals to humans, and ‘stuttering transmission’, i.e. when limited human-to-human infections occur, leading to self-limiting chains of transmission. We developed a transparent, theoretical framework, based on a generalization of Poisson processes with memory of past human infections, that unifies these stages. Once we have quantified pathogen dynamics in the reservoir, with some knowledge of the mechanism of contact, the approach provides a tool to estimate the likelihood of spillover events. Comparisons with independent agent-based models demonstrate the ability of the framework to correctly estimate the relative contributions of human-to-human vs animal transmission. As an illustrative example, we applied our model to Lassa fever, a rodent-borne, viral haemorrhagic disease common in West Africa, for which data on human outbreaks were available. The approach developed here is general and applicable to a range of zoonoses. This kind of methodology is of crucial importance for the scientific, medical and public health communities working at the interface between animal and human diseases to assess the risk associated with the disease and to plan intervention and appropriate control measures. The Lassa case study revealed important knowledge gaps, and opportunities, arising from limited knowledge of the temporal patterns in reporting, abundance of, and infection prevalence in, the host reservoir. PMID:27588425

  11. Trichotomous processes in early memory development, aging, and neurocognitive impairment: a unified theory.

    PubMed

    Brainerd, C J; Reyna, V F; Howe, M L

    2009-10-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.

  12. Unified nonlinear analysis for nonhomogeneous anisotropic beams with closed cross sections

    NASA Technical Reports Server (NTRS)

    Atilgan, Ali R.; Hodges, Dewey H.

    1991-01-01

    A unified methodology for geometrically nonlinear analysis of nonhomogeneous, anisotropic beams is presented. A 2D cross-sectional analysis and a nonlinear 1D global deformation analysis are derived from the common framework of a 3D, geometrically nonlinear theory of elasticity. The only restrictions are that the strain and local rotation are small compared to unity and that warping displacements are small relative to the cross-sectional dimensions. It is concluded that the warping solutions can be affected by large deformation and that this could alter the incremental stiffness of the section. It is shown that sectional constants derived from the published, linear analysis can be used in the present nonlinear, 1D analysis governing the global deformation of the beam, which is based on intrinsic equations for nonlinear beam behavior. Excellent correlation is obtained with published experimental results for both isotropic and anisotropic beams undergoing large deflections.

  13. Dynamic Information Management and Exchange for Command and Control Applications, Modelling and Enforcing Category-Based Access Control via Term Rewriting

    DTIC Science & Technology

    2015-03-01

    a hotel and a hospital. 2. Event handler for emergency policies (item 2 above): this has been implemented in two UG projects, one project developed a...Workshop on Logical and Semantic Frameworks, with Applications, Brasilia, Brazil, September 2014. Electronic Notes in Theoretical Computer Science (to...Brasilia, Brazil, September 2014, 2015. [3] S. Barker. The next 700 access control models or a unifying meta-model? In SACMAT 2009, 14th ACM Symposium on

  14. Domain shape instabilities and dendrite domain growth in uniaxial ferroelectrics

    NASA Astrophysics Data System (ADS)

    Shur, Vladimir Ya.; Akhmatkhanov, Andrey R.

    2018-01-01

    The effects of domain wall shape instabilities and the formation of nanodomains in front of moving walls obtained in various uniaxial ferroelectrics are discussed. Special attention is paid to the formation of self-assembled nanoscale and dendrite domain structures under highly non-equilibrium switching conditions. All obtained results are considered in the framework of the unified kinetic approach to domain structure evolution based on the analogy with first-order phase transformation. This article is part of the theme issue ‘From atomistic interfaces to dendritic patterns’.

  15. Generalized Preconditioned Locally Harmonic Residual Eigensolver (GPLHR) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VECHARYNSKI, EUGENE; YANG, CHAO

    The software contains a MATLAB implementation of the Generalized Preconditioned Locally Harmonic Residual (GPLHR) method for solving standard and generalized non-Hermitian eigenproblems. The method is particularly useful for computing a subset of eigenvalues, and their eigen- or Schur vectors, closest to a given shift. The proposed method is based on block iterations and can take advantage of a preconditioner if it is available. It does not need to perform an exact shift-and-invert transformation. Standard and generalized eigenproblems are handled in a unified framework.
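
    For contrast, the usual SciPy route to "a subset of eigenvalues closest to a given shift" is ARPACK in shift-invert mode, which does perform the exact shift-and-invert transformation that GPLHR avoids:

      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import eigs

      n, sigma = 500, 2.5
      # Diagonal test matrix with known spectrum {1, 2, ..., n}
      A = diags([np.arange(1, n + 1, dtype=float)], [0], format="csc")

      # Shift-invert: the largest-magnitude eigenvalues of (A - sigma*I)^{-1}
      # correspond to the eigenvalues of A nearest sigma; this requires
      # factorizing A - sigma*I, which GPLHR is designed to avoid.
      vals, vecs = eigs(A, k=4, sigma=sigma, which="LM")
      # vals ~ {2, 3, 1, 4}: the four eigenvalues closest to the shift 2.5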

  16. High-Order Methods for Computational Fluid Dynamics: A Brief Review of Compact Differential Formulations on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Huynh, H. T.; Wang, Z. J.; Vincent, P. E.

    2013-01-01

    Popular high-order schemes with compact stencils for Computational Fluid Dynamics (CFD) include Discontinuous Galerkin (DG), Spectral Difference (SD), and Spectral Volume (SV) methods. The recently proposed Flux Reconstruction (FR) approach or Correction Procedure using Reconstruction (CPR) is based on a differential formulation and provides a unifying framework for these high-order schemes. Here we present a brief review of recent developments for the FR/CPR schemes as well as some pacing items.

  17. Constraints on single entity driven inflationary and radiation eras

    NASA Astrophysics Data System (ADS)

    Bouhmadi-López, Mariam; Chen, Pisin; Liu, Yen-Wei

    2012-07-01

    We present a model that attempts to fuse the inflationary era and the subsequent radiation-dominated era under a unified framework so as to provide a smooth transition between the two. The model is based on a modification of the generalized Chaplygin gas. We constrain the model observationally by mapping the primordial power spectrum of the scalar perturbations to the latest WMAP7 data. We also compute the spectrum of the primordial gravitational waves as would be measured today.
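
    For reference, the standard generalized Chaplygin gas relations that such a model modifies (the record's specific modification is not reproduced here; these are the textbook forms):

      % Generalized Chaplygin gas: equation of state and background density
      % evolution with scale factor a (A, B > 0 constants; standard forms)
      p = -\frac{A}{\rho^{\alpha}}, \qquad
      \rho(a) = \left[ A + \frac{B}{a^{3(1+\alpha)}} \right]^{\frac{1}{1+\alpha}}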

  18. Unifying practice schedules in the timescales of motor learning and performance.

    PubMed

    Verhoeven, F Martijn; Newell, Karl M

    2018-06-01

    In this article, we elaborate on a multiple timescales model of motor learning to examine the independent and integrated effects of massed and distributed practice schedules within and between sessions on the persistent (learning) and transient (warm-up, fatigue) processes of performance change. The timescales framework reveals the influence of practice distribution on four learning-related processes: the persistent processes of learning and forgetting, and the transient processes of warm-up decrement and fatigue. The superposition of the different processes of practice leads to a unified set of effects for massed and distributed practice within and between sessions in learning motor tasks. This analysis of the interaction between the duration of the interval between practice trials or sessions and the parameters of the introduced timescale model captures the unified influence of between-trial and between-session scheduling of practice on learning and performance. It provides a starting point for new theoretically based hypotheses and for scheduling practice so as to minimize the negative effects of warm-up decrement, fatigue and forgetting while exploiting the positive effects of learning and retention. Copyright © 2018 Elsevier B.V. All rights reserved.
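
    An illustrative superposition of the processes named here, with a persistent learning term and transient warm-up-decrement and fatigue terms that reset between sessions (the exponential forms and parameters are assumed for illustration, not taken from the article):

      import numpy as np

      def performance(trial, trial_in_session):
          """Sum of persistent and transient processes on a common scale."""
          learning = 1.0 - np.exp(-trial / 150.0)             # persistent gain
          warmup = -0.15 * np.exp(-trial_in_session / 3.0)    # early-session decrement
          fatigue = -0.002 * trial_in_session                 # builds within a session
          return learning + warmup + fatigue

      # Two 50-trial sessions; the between-session break resets the transients
      trials = np.arange(100)
      trial_in_session = trials % 50
      curve = performance(trials, trial_in_session)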

  19. Deep linear autoencoder and patch clustering-based unified one-dimensional coding of image and video

    NASA Astrophysics Data System (ADS)

    Li, Honggui

    2017-09-01

    This paper proposes a unified one-dimensional (1-D) coding framework for image and video, which relies on a deep neural network and image patch clustering. First, an improved K-means clustering algorithm for image patches is employed to obtain compact inputs for the deep artificial neural network. Second, for the purpose of best reconstructing the original image patches, a deep linear autoencoder (DLA), a linear version of the classical deep nonlinear autoencoder, is introduced to achieve the 1-D representation of image blocks. With a 1-D representation, the DLA is capable of attaining zero reconstruction error, which is impossible for classical nonlinear dimensionality reduction methods. Third, a unified 1-D coding infrastructure for image, intraframe, interframe, multiview video, three-dimensional (3-D) video, and multiview 3-D video is built by incorporating different categories of videos into the inputs of the patch clustering algorithm. Finally, simulation results show that the proposed methods simultaneously achieve higher compression ratios and peak signal-to-noise ratios than state-of-the-art methods under low-bitrate transmission.
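
    A compact sketch of the pipeline this record describes, with scikit-learn K-means over flattened patches and PCA standing in for the deep linear autoencoder (a linear autoencoder trained with squared error spans the PCA subspace, so PCA is a reasonable stand-in; the data are random placeholders for 8x8 patches):

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      patches = rng.random((2000, 64))  # 2000 flattened 8x8 image patches

      km = KMeans(n_clusters=16, n_init=10, random_state=0).fit(patches)

      # One compact linear code per cluster of similar patches
      codecs = {c: PCA(n_components=8).fit(patches[km.labels_ == c])
                for c in range(16)}

      def encode(patch):
          c = int(km.predict(patch.reshape(1, -1))[0])
          return c, codecs[c].transform(patch.reshape(1, -1))[0]

      def decode(c, code):
          return codecs[c].inverse_transform(code.reshape(1, -1))[0]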

  20. Retooling Institutional Support Infrastructure for Clinical Research

    PubMed Central

    Snyder, Denise C.; Brouwer, Rebecca N.; Ennis, Cory L.; Spangler, Lindsey L.; Ainsworth, Terry L.; Budinger, Susan; Mullen, Catherine; Hawley, Jeffrey; Uhlenbrauck, Gina; Stacy, Mark

    2016-01-01

    Clinical research activities at academic medical centers are challenging to oversee. Without effective research administration, a continually evolving set of regulatory and institutional requirements can divert investigator and study team attention away from scientific gain, study conduct, and patient safety. However, even when the need for research administration is recognized, there can be struggles over what form it should take. Central research administration may be viewed negatively, with individual groups preferring to maintain autonomy over processes. Conversely, a proliferation of individualized approaches across an institution can create inefficiencies or invite risk. This article describes experiences establishing a unified research support office at the Duke University School of Medicine based on a framework of customer support. The Duke Office of Clinical Research was formed in 2012 with a vision that research administration at academic medical centers should help clinical investigators navigate the complex research environment and operationalize research ideas. The office provides an array of services that have received high satisfaction ratings. The authors describe the ongoing culture change necessary for success of the unified research support office. Lessons learned from implementation of the Duke Office of Clinical Research may serve as a model for other institutions undergoing a transition to unified research support. PMID:27125563

  1. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness must be sufficient to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of a biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey of biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was found that the phenotype robustness criterion for biological networks at different levels requires intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this holds, phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through the corresponding phenotype robustness criterion from a systematic point of view. PMID:23515240

  2. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-off on Phenotype Robustness in Biological Networks Part I: Gene Regulatory Networks in Systems and Evolutionary Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties observed in biological systems at different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness must be sufficient to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, genetic robustness for buffering genetic variations, and environmental robustness for resisting environmental disturbances. With this, the phenotypic stability of a biological network can be maintained, thus guaranteeing phenotype robustness. This paper presents a survey of biological systems and then develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation in systems and evolutionary biology. Further, from the unifying mathematical framework, it was found that the phenotype robustness criterion for biological networks at different levels requires intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness. When this holds, phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in systems and evolutionary biology can also be investigated through the corresponding phenotype robustness criterion from a systematic point of view.
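
    Written out, the phenotype robustness criterion quoted in both records reads as follows (the symbols are ours, not the authors'):

        % Phenotype robustness criterion (notation ours): with r_i, r_g, r_e
        % the intrinsic, genetic, and environmental robustness demands and
        % r_n the network robustness the system can supply,
        \[
          r_i + r_g + r_e \;\leqq\; r_n ,
        \]
        % and phenotype robustness is maintained exactly when the inequality
        % holds.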

  3. A unifying framework for marginalized random intercept models of correlated binary outcomes

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.

    2013-01-01

    We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples is given to illustrate the concepts. PMID:25342871
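
    The latent-threshold construction behind these copula models is easy to see by simulation. A minimal Python sketch (ours, not the paper's; the intercept scale and mean are arbitrary): draw one shared normal intercept per cluster, add independent standard-normal latent noise, and threshold at zero, which induces an exchangeable correlation structure among the binary outcomes.

        # Toy illustration (not from the paper): correlated binary outcomes
        # via a latent threshold model with a shared normal random intercept.
        import numpy as np

        rng = np.random.default_rng(0)
        n_clusters, n_per, mu, sigma_u = 100_000, 4, -0.3, 1.0

        u = rng.normal(0.0, sigma_u, size=(n_clusters, 1))    # shared intercept
        eps = rng.normal(0.0, 1.0, size=(n_clusters, n_per))  # latent noise
        y = (mu + u + eps > 0).astype(int)                    # threshold at 0

        # Marginal success probability is Phi(mu / sqrt(1 + sigma_u^2)).
        print("marginal P(Y=1):", y.mean())
        # Exchangeability: every within-cluster pair has the same correlation.
        print(np.corrcoef(y[:, 0], y[:, 1])[0, 1],
              np.corrcoef(y[:, 2], y[:, 3])[0, 1])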

  4. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies

    PubMed Central

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.

    2016-01-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947

  5. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies.

    PubMed

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A

    2016-08-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving "live partial-area taxonomies" is demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.
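
    The structural-similarity notion that partial-area taxonomies summarize, namely concepts sharing the same set of outgoing relationship types, can be sketched in a few lines. The Python sketch below is ours, not the OAF implementation, and the miniature concept table is invented for illustration.

        # Toy sketch (ours): group ontology concepts by their exact set of
        # outgoing relationship types, the grouping a partial-area taxonomy
        # summarizes at the "area" level.
        from collections import defaultdict

        concepts = {  # concept -> relationship types (invented example)
            "Appendicitis": {"finding-site", "associated-morphology"},
            "Gastritis":    {"finding-site", "associated-morphology"},
            "Appendectomy": {"method", "procedure-site"},
            "Fracture":     {"finding-site"},
        }

        areas = defaultdict(list)
        for concept, rel_types in concepts.items():
            areas[frozenset(rel_types)].append(concept)

        for rel_types, members in areas.items():
            print(sorted(rel_types), "->", sorted(members))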

  6. North American Science Symposium: Toward a unified framework for inventorying and monitoring forest ecosystem resources

    Treesearch

    Celedonio Aguirre-Bravo; Carlos Rodriguez Franco

    1999-01-01

    The general objective of this Symposium was to build on the best science and technology available to assure that the data and information produced in future inventory and monitoring programs are comparable, quality assured, available, and adequate for their intended purposes, thereby providing a reliable framework for characterization, assessment, and management of...

  7. A high-resolution bioclimate map of the world: a unifying framework for global biodiversity research and monitoring

    USGS Publications Warehouse

    Metzger, Marc J.; Bunce, Robert G.H.; Jongman, Rob H.G.; Sayre, Roger G.; Trabucco, Antonio; Zomer, Robert

    2013-01-01

    Main conclusions: The GEnS provides a robust spatial analytical framework for the aggregation of local observations, identification of gaps in current monitoring efforts and systematic design of complementary and new monitoring and research. The dataset is available for non-commercial use through the GEO portal (http://www.geoportal.org).

  8. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. Fresno Unified School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  9. Teaching Introductory Business Statistics Using the DCOVA Framework

    ERIC Educational Resources Information Center

    Levine, David M.; Stephan, David F.

    2011-01-01

    Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…

  10. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. Fresno Unified School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  11. Locomotion Dynamics for Bio-inspired Robots with Soft Appendages: Application to Flapping Flight and Passive Swimming

    NASA Astrophysics Data System (ADS)

    Boyer, Frédéric; Porez, Mathieu; Morsli, Ferhat; Morel, Yannick

    2017-08-01

    In animal locomotion, whether in fish or in flying insects, the use of flexible terminal organs or appendages greatly improves locomotion performance (thrust and lift). In this article, we propose a general unified framework for modeling and simulating the (bio-inspired) locomotion of robots using soft organs. The proposed approach is based on the model of Mobile Multibody Systems (MMS). The distributed flexibilities are modeled according to two major approaches: the Floating Frame Approach (FFA) and the Geometrically Exact Approach (GEA). Encompassing these two approaches in the Newton-Euler modeling formalism of robotics, this article proposes a unique modeling framework suited to fast numerical integration of the dynamics of an MMS in both the FFA and the GEA. This general framework is applied to two illustrative examples drawn from bio-inspired locomotion: passive swimming in a von Kármán vortex street, and hovering flight with flexible flapping wings.

  12. ContextProvider: Context awareness for medical monitoring applications.

    PubMed

    Mitchell, Michael; Meyers, Christopher; Wang, An-I Andy; Tyson, Gary

    2011-01-01

    Smartphones are sensor-rich and Internet-enabled. With their on-board sensors, web services, social media, and external biosensors, smartphones can provide contextual information about the device, user, and environment, thereby enabling the creation of rich, biologically driven applications. We introduce ContextProvider, a framework that offers a unified, queryable interface to contextual data on the device. Unlike other context-based frameworks, ContextProvider offers interactive user feedback, self-adaptive sensor polling, and minimal reliance on third-party infrastructure. ContextProvider also allows for rapid development of new context- and bio-aware applications. Evaluation of ContextProvider shows that an additional monitoring sensor can be incorporated into the framework with fewer than 100 lines of Java code. With adaptive sensor monitoring, power consumption per sensor can be reduced to as little as 1% overhead. Finally, through the use of context, the accuracy of data interpretation can be improved by up to 80%.
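
    The self-adaptive sensor polling credited for the low overhead can be sketched as a simple feedback rule: lengthen the polling interval while readings are stable, shorten it when they change. The Python sketch below is ours, not ContextProvider's code, and the back-off factor, bounds, and toy sensor are invented.

        # Toy sketch (ours): adaptive polling that backs off while a sensor
        # is stable and resets to fast polling when the reading changes.
        import random

        MIN_INTERVAL, MAX_INTERVAL = 0.5, 16.0    # seconds (invented bounds)

        def read_sensor():
            return 21.0 + random.choice([0.0, 0.0, 0.0, 1.5])  # mostly stable

        interval, last = MIN_INTERVAL, None
        for _ in range(10):
            value = read_sensor()
            if last is not None and abs(value - last) < 0.1:
                interval = min(interval * 2, MAX_INTERVAL)  # stable: back off
            else:
                interval = MIN_INTERVAL                     # change: poll fast
            last = value
            print(f"value={value:.1f}, next poll in {interval:.1f}s")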

  13. BIRCH: a user-oriented, locally-customizable, bioinformatics system.

    PubMed

    Fristensky, Brian

    2007-02-09

    Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.

  14. BIRCH: A user-oriented, locally-customizable, bioinformatics system

    PubMed Central

    Fristensky, Brian

    2007-01-01

    Background: Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. Results: BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. Conclusion: BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere. PMID:17291351

  15. Qa-1/HLA-E-restricted regulatory CD8+ T cells and self-nonself discrimination: an essay on peripheral T-cell regulation.

    PubMed

    Jiang, Hong; Chess, Leonard

    2008-11-01

    By discriminating self from nonself and controlling the magnitude and class of immune responses, the immune system mounts effective immunity against virtually any foreign antigens but avoids harmful immune responses to self. These are two equally important and related but distinct processes, which function in concert to ensure an optimal function of the immune system. Immunologically relevant clinical problems often occur because of failure of either process, especially the former. Currently, there is no unified conceptual framework to characterize the precise relationship between thymic negative selection and peripheral immune regulation, which is the basis for understanding self-nonself discrimination versus control of magnitude and class of immune responses. In this article, we explore a novel hypothesis of how the immune system discriminates self from nonself in the periphery during adaptive immunity. This hypothesis permits rational analysis of various seemingly unrelated biomedical problems inherent in immunologic disorders that cannot be uniformly interpreted by any currently existing paradigms. The proposed hypothesis is based on a unified conceptual framework of the "avidity model of peripheral T-cell regulation" that we originally proposed and tested, in both basic and clinical immunology, to understand how the immune system achieves self-nonself discrimination in the periphery.

  16. Visuomotor control, eye movements, and steering: A unified approach for incorporating feedback, feedforward, and internal models.

    PubMed

    Lappi, Otto; Mole, Callum

    2018-06-11

    The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks. It is based on recent empirical research, in particular on driver eye movements, which poses challenges for existing accounts of how we visually steer a course. They first analyze how the ideas of feedback and feedforward processes and internal models are treated in control-theoretic steering models within vision science and engineering, which share an underlying architecture but have historically developed in very separate ways. The authors then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience, to better understand the skill and gaze strategies involved. They then propose a conceptual model that (a) gives a unified account of the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of these (a-c) has been considered before (also in the context of driving), integrating them into a single framework and the authors' multiple waypoint identification hypothesis within that framework are novel. The proposed hypothesis is relevant to all forms of visually guided locomotion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
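
    The feedback/feedforward split at the heart of this account can be made concrete with a toy steering law: a feedforward term steers for the previewed path curvature, while feedback terms correct residual lateral and heading error. The Python sketch below is ours, not the authors' model; the gains, wheelbase, and error conventions are invented.

        # Toy sketch (ours): steering as feedforward on previewed curvature
        # plus feedback on lateral and heading errors.
        import math

        def steering_command(path_curvature, lateral_error, heading_error,
                             k_e=0.5, k_h=1.2, wheelbase=2.7):
            feedforward = math.atan(wheelbase * path_curvature)  # track curve
            feedback = -k_e * lateral_error - k_h * heading_error
            return feedforward + feedback

        # Entering a left curve (curvature 0.02 1/m) while 0.3 m off the path:
        print(steering_command(0.02, lateral_error=0.3, heading_error=-0.05))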

  17. Beyond Containment and Deterrence: A Security Framework for Europe in the 21st Century

    DTIC Science & Technology

    1990-04-02

    decades of the 21st Century in Europe, and examines... Poland, and parts of France and Russia, but it did not truly unify Germany. Bismarck unified only parts of Germany which he could constrain under... Europe, Central Europe, the Balkans, and the Soviet Union. Central Europe includes West Germany, East Germany, Austria, Czechoslovakia, Poland, and

  18. Towards a Unified Description of the Electroweak Nuclear Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benhar, Omar; Lovato, Alessandro

    2015-06-01

    We briefly review the growing efforts to set up a unified framework for the description of neutrino interactions with atomic nuclei and nuclear matter, applicable in the broad kinematical region corresponding to neutrino energies ranging between a few MeV and a few GeV. The emerging picture suggests that the formalism of nuclear many-body theory (NMBT) can be exploited to obtain the neutrino-nucleus cross sections needed for both the interpretation of oscillation signals and simulations of neutrino transport in compact stars.

  19. Quantum and Information Thermodynamics: A Unifying Framework Based on Repeated Interactions

    NASA Astrophysics Data System (ADS)

    Strasberg, Philipp; Schaller, Gernot; Brandes, Tobias; Esposito, Massimiliano

    2017-04-01

    We expand the standard thermodynamic framework of a system coupled to a thermal reservoir by considering a stream of independently prepared units repeatedly put into contact with the system. These units can be in any nonequilibrium state and interact with the system with an arbitrary strength and duration. We show that this stream constitutes an effective resource of nonequilibrium free energy, and we identify the conditions under which it behaves as a heat, work, or information reservoir. We also show that this setup provides a natural framework to analyze information erasure ("Landauer's principle") and feedback-controlled systems ("Maxwell's demon"). In the limit of a short system-unit interaction time, we further demonstrate that this setup can be used to provide a thermodynamically sound interpretation to many effective master equations. We discuss how nonautonomously driven systems, micromasers, lasing without inversion and the electronic Maxwell demon can be thermodynamically analyzed within our framework. While the present framework accounts for quantum features (e.g., squeezing, entanglement, coherence), we also show that quantum resources do not offer any advantage compared to classical ones in terms of the maximum extractable work.
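
    In standard notation (ours; the paper's statement is more general), the kind of second-law bound such a framework delivers can be written down directly: with a single reservoir at temperature T, the work extracted per system-unit interaction is limited by the drop in nonequilibrium free energy of system plus unit.

        % Schematic second-law bound (notation ours): the free-energy drop of
        % system S plus incoming unit U limits the extractable work,
        \[
          W_{\mathrm{ext}} \;\le\; -\,\Delta F(\rho_S) - \Delta F(\rho_U),
          \qquad F(\rho) = \operatorname{tr}(\rho H) - T\, S(\rho),
        \]
        % so a stream of nonequilibrium units acts as a free-energy resource;
        % a unit whose energy is unchanged but whose entropy rises behaves as
        % a pure information reservoir.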

  20. Unified Desktop for Monitoring & Control Applications - The Open Navigator Framework Applied for Control Centre and EGSE Applications

    NASA Astrophysics Data System (ADS)

    Brauer, U.

    2007-08-01

    The Open Navigator Framework (ONF) was developed to provide a unified and scalable platform for user interface integration. The main objective for the framework was to raise usability of monitoring and control consoles and to provide a reuse of software components in different application areas. ONF is currently applied for the Columbus onboard crew interface, the commanding application for the Columbus Control Centre, the Columbus user facilities specialized user interfaces, the Mission Execution Crew Assistant (MECA) study and EADS Astrium internal R&D projects. ONF provides a well documented and proven middleware for GUI components (Java plugin interface, simplified concept similar to Eclipse). The overall application configuration is performed within a graphical user interface for layout and component selection. The end-user does not have to work in the underlying XML configuration files. ONF was optimized to provide harmonized user interfaces for monitoring and command consoles. It provides many convenience functions designed together with flight controllers and onboard crew: user defined workspaces, incl. support for multi screens efficient communication mechanism between the components integrated web browsing and documentation search &viewing consistent and integrated menus and shortcuts common logging and application configuration (properties) supervision interface for remote plugin GUI access (web based) A large number of operationally proven ONF components have been developed: Command Stack & History: Release of commands and follow up the command acknowledges System Message Panel: Browse, filter and search system messages/events Unified Synoptic System: Generic synoptic display system Situational Awareness : Show overall subsystem status based on monitoring of key parameters System Model Browser: Browse mission database defintions (measurements, commands, events) Flight Procedure Executor: Execute checklist and logical flow interactive procedures Web Browser : Integrated browser reference documentation and operations data Timeline Viewer: View master timeline as Gantt chart Search: Local search of operations products (e.g. documentation, procedures, displays) All GUI components access the underlying spacecraft data (commanding, reporting data, events, command history) via a common library providing adaptors for the current deployments (Columbus MCS, Columbus onboard Data Management System, Columbus Trainer raw packet protocol). New Adaptors are easy to develop. Currently an adaptor to SCOS 2000 is developed as part of a study for the ESTEC standardization section ("USS for ESTEC Reference Facility").

  1. Information Geometry for Landmark Shape Analysis: Unifying Shape Representation and Deformation

    PubMed Central

    Peter, Adrian M.; Rangarajan, Anand

    2010-01-01

    Shape matching plays a prominent role in the comparison of similar structures. We present a unifying framework for shape matching that uses mixture models to couple both the shape representation and deformation. The theoretical foundation is drawn from information geometry wherein information matrices are used to establish intrinsic distances between parametric densities. When a parameterized probability density function is used to represent a landmark-based shape, the modes of deformation are automatically established through the information matrix of the density. We first show that given two shapes parameterized by Gaussian mixture models (GMMs), the well-known Fisher information matrix of the mixture model is also a Riemannian metric (actually, the Fisher-Rao Riemannian metric) and can therefore be used for computing shape geodesics. The Fisher-Rao metric has the advantage of being an intrinsic metric and invariant to reparameterization. The geodesic computed using this metric establishes an intrinsic deformation between the shapes, thus unifying both shape representation and deformation. A fundamental drawback of the Fisher-Rao metric is that it is not available in closed form for the GMM. Consequently, shape comparisons are computationally very expensive. To address this, we develop a new Riemannian metric based on generalized ϕ-entropy measures. In sharp contrast to the Fisher-Rao metric, the new metric is available in closed form. Geodesic computations using the new metric are considerably more efficient. We validate the performance and discriminative capabilities of these new information geometry-based metrics by pairwise matching of corpus callosum shapes. We also study the deformations of fish shapes that have various topological properties. A comprehensive comparative analysis is also provided using other landmark-based distances, including the Hausdorff distance, the Procrustes metric, landmark-based diffeomorphisms, and the bending energies of the thin-plate spline (TPS) and Wendland splines. PMID:19110497
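
    For reference, the Fisher information matrix that the paper promotes to a Riemannian (Fisher-Rao) metric on the density parameters is the standard one:

        % Fisher information matrix / Fisher-Rao metric (standard definition):
        \[
          g_{ij}(\theta)
          = \mathbb{E}_{p(x \mid \theta)}\!\left[
              \frac{\partial \log p(x \mid \theta)}{\partial \theta_i}\,
              \frac{\partial \log p(x \mid \theta)}{\partial \theta_j}
            \right],
          \qquad
          ds^2 = \sum_{i,j} g_{ij}(\theta)\, d\theta_i \, d\theta_j ,
        \]
        % shape geodesics minimize path length under ds^2; for Gaussian
        % mixtures the expectation has no closed form, which is what
        % motivates the paper's closed-form phi-entropy metric.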

  2. Pattern matching through Chaos Game Representation: bridging numerical and discrete data structures for biological sequence analysis

    PubMed Central

    2012-01-01

    Background: Chaos Game Representation (CGR) is an iterated function that bijectively maps discrete sequences into a continuous domain. As a result, discrete sequences can be the object of statistical and topological analyses otherwise reserved to numerical systems. Characteristically, CGR coordinates of substrings sharing an L-long suffix will be located within 2^-L distance of each other. In the two decades since its original proposal, CGR has been generalized beyond its original focus on genomic sequences and has been successfully applied to a wide range of problems in bioinformatics. This report explores the possibility that it can be further extended to approach algorithms that rely on discrete, graph-based representations. Results: The exploratory analysis described here consisted of selecting foundational string problems and refactoring them using CGR-based algorithms. We found that CGR can take the role of suffix trees and emulate sophisticated string algorithms, efficiently solving exact and approximate string matching problems such as finding all palindromes and tandem repeats, and matching with mismatches. The common feature of these problems is that they use longest common extension (LCE) queries as subtasks of their procedures, which we show to have a constant-time solution with CGR. Additionally, we show that CGR can be used as a rolling hash function within the Rabin-Karp algorithm. Conclusions: The analysis of biological sequences relies on algorithmic foundations facing mounting challenges, both logistic (performance) and analytical (lack of a unifying mathematical framework). CGR is found to provide the latter and to promise the former: graph-based data structures for sequence analysis operations are entailed by the numerical data structures produced by CGR maps, providing a unifying analytical framework for a diversity of pattern matching problems. PMID:22551152
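
    The CGR map itself is a two-line recursion: each symbol pulls the current point halfway toward that symbol's corner of the unit square. The Python sketch below is ours (corner assignments vary across the literature):

        # Minimal Chaos Game Representation (one common corner convention):
        # each symbol moves the point halfway toward its corner, so
        # substrings sharing an L-long suffix land within 2^-L of each other.
        CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0),
                   "G": (1.0, 1.0), "T": (1.0, 0.0)}

        def cgr(seq, start=(0.5, 0.5)):
            x, y = start
            points = []
            for base in seq:
                cx, cy = CORNERS[base]
                x, y = (x + cx) / 2.0, (y + cy) / 2.0
                points.append((x, y))
            return points

        # Two sequences ending in the same 4-long suffix "GATT" land close:
        p, q = cgr("AACGATT")[-1], cgr("TTGGATT")[-1]
        print(p, q)
        print(abs(p[0] - q[0]) <= 2**-4 and abs(p[1] - q[1]) <= 2**-4)  # True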

  3. Steepest entropy ascent model for far-nonequilibrium thermodynamics: Unified implementation of the maximum entropy production principle

    NASA Astrophysics Data System (ADS)

    Beretta, Gian Paolo

    2014-10-01

    By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
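
    Schematically (our simplified rendering, not the paper's full formalism), SEA dynamics selects the state velocity of steepest entropy ascent compatible with the conservation constraints, measured with respect to a metric G:

        % Schematic SEA evolution law (simplified; notation ours): with
        % entropy S, metric G, and conserved functionals C_k enforced by
        % multipliers lambda_k chosen so that dC_k/dt = 0,
        \[
          \dot{x} \;=\; \frac{1}{\tau}\, G^{-1}\!\left(
              \nabla S \;-\; \sum_k \lambda_k \nabla C_k \right),
        \]
        % which makes dS/dt >= 0 automatic: dS/dt equals the G^{-1}-quadratic
        % form of the constrained entropy gradient, and near equilibrium the
        % metric reduces to Onsager's generalized resistivity, as the
        % abstract notes.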

  4. War-gaming application for future space systems acquisition: MATLAB implementation of war-gaming acquisition models and simulation results

    NASA Astrophysics Data System (ADS)

    Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.

    2017-05-01

    The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy that combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian acquisition war-gaming approach using Monte Carlo simulations, a numerical analysis technique to account for uncertainty in decision making; these simulations model the PTB development and acquisition processes, and the paper details the implementation procedure and the interactions between the games.
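
    The Monte Carlo step is plain uncertainty propagation. Although the paper's programs are MATLAB, an illustrative Python sketch of the idea follows; every distribution, parameter, and threshold below is invented.

        # Toy Monte Carlo sketch (ours, not the paper's MATLAB code):
        # propagate uncertain cost drivers through a total-ownership-cost
        # model and estimate the chance a PTB strategy stays under budget.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        dev_cost = rng.lognormal(mean=4.0, sigma=0.3, size=n)   # $M, invented
        unit_cost = rng.normal(loc=12.0, scale=2.0, size=n)     # $M per unit
        ops_cost = rng.gamma(shape=9.0, scale=5.0, size=n)      # $M, lifetime
        n_units, budget = 8, 250.0

        toc = dev_cost + n_units * unit_cost + ops_cost         # total cost
        print(f"mean TOC = {toc.mean():.1f} $M")
        print(f"P(under budget) = {(toc < budget).mean():.3f}")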

  5. A concept ideation framework for medical device design.

    PubMed

    Hagedorn, Thomas J; Grosse, Ian R; Krishnamurty, Sundar

    2015-06-01

    Medical device design is a challenging process, often requiring collaboration between medical and engineering domain experts. This collaboration can be best institutionalized through systematic knowledge transfer between the two domains coupled with effective knowledge management throughout the design innovation process. Toward this goal, we present the development of a semantic framework for medical device design that unifies a large medical ontology with detailed engineering functional models along with the repository of design innovation information contained in the US Patent Database. As part of our development, existing medical, engineering, and patent document ontologies were modified and interlinked to create a comprehensive medical device innovation and design tool with appropriate properties and semantic relations to facilitate knowledge capture, enrich existing knowledge, and enable effective knowledge reuse for different scenarios. The result is a Concept Ideation Framework for Medical Device Design (CIFMeDD). Key features of the resulting framework include function-based searching and automated inter-domain reasoning to uniquely enable identification of functionally similar procedures, tools, and inventions from multiple domains based on simple semantic searches. The significance and usefulness of the resulting framework for aiding in conceptual design and innovation in the medical realm are explored via two case studies examining medical device design problems. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. An optimization-based framework for anisotropic simplex mesh adaptation

    NASA Astrophysics Data System (ADS)

    Yano, Masayuki; Darmofal, David L.

    2012-09-01

    We present a general framework for anisotropic h-adaptation of simplex meshes. Given a discretization and any element-wise, localizable error estimate, our adaptive method iterates toward a mesh that minimizes error for a given number of degrees of freedom. Utilizing mesh-metric duality, we consider a continuous optimization problem of the Riemannian metric tensor field that provides an anisotropic description of element sizes. First, our method performs a series of local solves to survey the behavior of the local error function. This information is then synthesized using an affine-invariant tensor manipulation framework to reconstruct an approximate gradient of the error function with respect to the metric tensor field. Finally, we perform gradient descent in the metric space to drive the mesh toward optimality. The method is first demonstrated to produce optimal anisotropic meshes minimizing the L2 projection error for a pair of canonical problems containing a singularity and a singular perturbation. The effectiveness of the framework is then demonstrated in the context of output-based adaptation for the advection-diffusion equation using a high-order discontinuous Galerkin discretization and the dual-weighted residual (DWR) error estimate. The method presented provides a unified framework for optimizing both the element size and anisotropy distribution using an a posteriori error estimate and enables efficient adaptation of anisotropic simplex meshes for high-order discretizations.
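
    The mesh-metric duality the method relies on can be stated in one line: a Riemannian metric field M(x) prescribes target edge lengths, and the adapted mesh is the one whose edges are unit length under M. In symbols (notation ours):

        % Metric-based edge length (notation ours): an edge vector e along
        % the parameterized segment x(t) has length
        \[
          \ell_M(e) \;=\; \int_0^1
            \sqrt{\, e^{\mathsf T} M\big(x(t)\big)\, e \,}\; dt ,
        \]
        % and a mesh conforms to M when every edge has ell_M(e) close to 1;
        % optimizing the field M therefore optimizes element size and
        % anisotropy simultaneously, which is the continuous problem the
        % paper's gradient descent attacks.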

  7. Standards-based sensor interoperability and networking SensorWeb: an overview

    NASA Astrophysics Data System (ADS)

    Bolling, Sam

    2012-06-01

    The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR including near-real-time, cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard for users to discover, access, observe, subscribe to, and task sensors. The long-term goal of the SensorWeb initiative is to establish an open, commercial standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.


  8. A unified framework for the evaluation of surrogate endpoints in mental-health clinical trials.

    PubMed

    Molenberghs, Geert; Burzykowski, Tomasz; Alonso, Ariel; Assam, Pryseley; Tilahun, Abel; Buyse, Marc

    2010-06-01

    For a number of reasons, surrogate endpoints are considered instead of the so-called true endpoint in clinical studies, especially when such endpoints can be measured earlier and/or with less burden for patient and experimenter. Surrogate endpoints may also occur more frequently than their standard counterparts. For these reasons, it is not surprising that the use of surrogate endpoints in clinical practice is increasing. Building on the seminal work of Prentice (1) and Freedman et al. (2), Buyse et al. (3) framed the evaluation exercise within a meta-analytic setting, in an effort to overcome difficulties that necessarily surround evaluation efforts based on a single trial. In this article, we review the meta-analytic approach for continuous outcomes, discuss extensions to non-normal and longitudinal settings, as well as proposals to unify the somewhat disparate collection of validation measures currently on the market. Implications for design and for predicting the effect of treatment in a new trial, based on the surrogate, are discussed. A case study in schizophrenia is analysed.

  9. Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding

    PubMed Central

    Li, Xin; Guo, Rui; Chen, Chao

    2014-01-01

    Sparse coding is an emerging method that has been successfully applied to both robust object tracking and recognition in the vision literature. In this paper, we propose to explore a sparse coding-based approach toward joint object tracking-and-recognition and explore its potential in the analysis of forward-looking infrared (FLIR) video to support nighttime machine vision systems. A key technical contribution of this work is to unify existing sparse coding-based approaches toward tracking and recognition under the same framework, so that they can benefit from each other in a closed loop. On the one hand, tracking the same object through temporal frames allows us to achieve improved recognition performance through dynamic updating of the template/dictionary and combining of multiple recognition results; on the other hand, the recognition of individual objects facilitates the tracking of multiple objects (i.e., walking pedestrians), especially in the presence of occlusion within a crowded environment. We report experimental results on both the CASIA Pedestrian Database and our own collected FLIR video database to demonstrate the effectiveness of the proposed joint tracking-and-recognition approach. PMID:24961216
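
    The recognition half follows the standard sparse representation-based classification recipe: sparse-code the probe over a dictionary of class-labelled templates, then assign the class whose atoms explain it with the least residual. The Python sketch below is ours; random vectors stand in for FLIR templates, and all sizes are invented.

        # Toy SRC sketch (ours): sparse-code a probe over labelled template
        # atoms, then classify by smallest class-wise reconstruction residual.
        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        rng = np.random.default_rng(1)
        d, per_class, classes = 64, 20, 3
        D = rng.normal(size=(d, per_class * classes))   # template dictionary
        D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
        labels = np.repeat(np.arange(classes), per_class)

        probe = D[:, 5] + 0.05 * rng.normal(size=d)     # noisy class-0 probe

        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5,
                                        fit_intercept=False).fit(D, probe)
        alpha = omp.coef_
        residuals = [np.linalg.norm(probe
                                    - D[:, labels == c] @ alpha[labels == c])
                     for c in range(classes)]
        print(np.round(residuals, 3), "->", int(np.argmin(residuals)))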

  10. Predicting medical staff intention to use an online reporting system with modified unified theory of acceptance and use of technology.

    PubMed

    Chang, I-Chiu; Hsu, Hui-Mei

    2012-01-01

    Barriers to reporting incident events using an online information system (IS) may be different from those of a paper-based reporting system. The nationwide online Patient-Safety Reporting System (PSRS) contains a value judgment behind use of the system, similar to the Value of Perceived Consequence (VPC), which is seldom discussed in IS applications of other disciplines. This study developed a more adequate research framework by integrating the VPC construct into the well-known Unified Theory of Acceptance and Use of Technology (UTAUT) model as a theoretical base to explore the predictors of medical staff's intention to use the online PSRS. The results showed that management support was an important factor influencing medical staff's intention to use the PSRS. The effects of factors such as performance expectancy and perceived positive and negative consequences on medical staff's intention to use the PSRS were moderated by gender, age, experience, and occupation. The results proved that the modified UTAUT model is significant and useful in predicting medical staff's intention to use the nationwide online PSRS.

  11. An ontology for component-based models of water resource systems

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Goodall, Jonathan L.

    2013-08-01

    Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
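
    An ontology fragment of this kind can be prototyped directly in OWL with rdflib. The Python sketch below is ours; the namespace URL and all class and property names are invented placeholders, not actual WRC ontology terms.

        # Toy sketch (ours; names and namespace are invented placeholders):
        # describe a model component and one of its outputs in OWL via rdflib.
        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import OWL, RDF, RDFS

        WRC = Namespace("http://example.org/wrc#")   # hypothetical namespace
        g = Graph()
        g.bind("wrc", WRC)

        g.add((WRC.ModelComponent, RDF.type, OWL.Class))
        g.add((WRC.ExchangeItem, RDF.type, OWL.Class))
        g.add((WRC.hasOutput, RDF.type, OWL.ObjectProperty))
        g.add((WRC.hasOutput, RDFS.domain, WRC.ModelComponent))
        g.add((WRC.hasOutput, RDFS.range, WRC.ExchangeItem))

        g.add((WRC.SnowMeltModel, RDF.type, WRC.ModelComponent))
        g.add((WRC.MeltwaterFlux, RDF.type, WRC.ExchangeItem))
        g.add((WRC.SnowMeltModel, WRC.hasOutput, WRC.MeltwaterFlux))
        g.add((WRC.SnowMeltModel, RDFS.label, Literal("Snow melt component")))

        print(g.serialize(format="turtle"))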

  12. Re-engineering the Federal planning process: A total Federal planning strategy, integrating NEPA with modern management tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eccleston, C.H.

    1997-09-05

    The National Environmental Policy Act (NEPA) of 1969 was established by Congress more than a quarter of a century ago, yet there is a surprising lack of specific tools, techniques, and methodologies for effectively implementing these regulatory requirements. Lack of professionally accepted techniques is a principal factor responsible for many inefficiencies. Often, decision makers do not fully appreciate or capitalize on the true potential which NEPA provides as a platform for planning future actions. New approaches and modern management tools must be adopted to fully achieve NEPA's mandate. A new strategy, referred to as Total Federal Planning, is proposed for unifying large-scale federal planning efforts under a single, systematic, structured, and holistic process. Under this approach, the NEPA planning process provides a unifying framework for integrating all early environmental and nonenvironmental decision-making factors into a single comprehensive planning process. To promote effectiveness and efficiency, modern tools and principles from the disciplines of Value Engineering, Systems Engineering, and Total Quality Management are incorporated. Properly integrated and implemented, these planning tools provide the rigorous, structured, and disciplined framework essential in achieving effective planning. Ultimately, the goal of a Total Federal Planning strategy is to construct a unified and interdisciplinary framework that substantially improves decision-making, while reducing the time, cost, redundancy, and effort necessary to comply with environmental and other planning requirements. At a time when Congress is striving to re-engineer the governmental framework, apparatus, and process, a Total Federal Planning philosophy offers a systematic approach for uniting the disjointed and often convoluted planning process currently used by most federal agencies. Potentially this approach has widespread implications in the way federal planning is approached.

  13. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  14. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. San Diego Unified School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  15. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. Los Angeles Unified School District. Grade 4, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  16. The Nation's Report Card Science 2009 Trial Urban District Snapshot Report. Los Angeles Unified School District. Grade 8, Public Schools

    ERIC Educational Resources Information Center

    National Center for Education Statistics, 2011

    2011-01-01

    Guided by a new framework, the National Assessment of Educational Progress (NAEP) science assessment was updated in 2009 to keep the content current with key developments in science, curriculum standards, assessments, and research. The 2009 framework organizes science content into three broad content areas. Physical science includes concepts…

  17. A unifying framework for quantifying the nature of animal interactions.

    PubMed

    Potts, Jonathan R; Mokross, Karl; Lewis, Mark A

    2014-07-06

    Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
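
    In the step selection setting the paper extends, the movement kernel is conventionally written as follows (standard form; notation ours, with the paper's contribution being to insert animal-animal interaction terms into the weighting function):

        % Step selection function (standard form; notation ours): the
        % probability of stepping from x to x' given environmental
        % covariates Z,
        \[
          f(x' \mid x) \;=\;
          \frac{\phi(x' \mid x)\, w\big(Z(x')\big)}
               {\int \phi(y \mid x)\, w\big(Z(y)\big)\, dy} ,
          \qquad w(Z) = \exp(\beta \cdot Z),
        \]
        % with phi the resource-independent step kernel (step lengths and
        % turning angles) and w the resource selection weighting.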

  18. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  19. 40 CFR 300.105 - General organization concepts.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... capabilities. (b) Three fundamental kinds of activities are performed pursuant to the NCP: (1) Preparedness....205(c). (d) The basic framework for the response management structure is a system (e.g., a unified...

  20. Study designs for identification of rare disease variants in complex diseases: the utility of family-based designs.

    PubMed

    Ionita-Laza, Iuliana; Ottman, Ruth

    2011-11-01

    The recent progress in sequencing technologies makes possible large-scale medical sequencing efforts to assess the importance of rare variants in complex diseases. The results of such efforts depend heavily on the use of efficient study designs and analytical methods. We introduce here a unified framework for association testing of rare variants in family-based designs or designs based on unselected affected individuals. This framework allows us to quantify the enrichment in rare disease variants in families containing multiple affected individuals and to investigate the optimal design of studies aiming to identify rare disease variants in complex traits. We show that for many complex diseases with small values for the overall sibling recurrence risk ratio, such as Alzheimer's disease and most cancers, sequencing affected individuals with a positive family history of the disease can be extremely advantageous for identifying rare disease variants. In contrast, for complex diseases with large values of the sibling recurrence risk ratio, sequencing unselected affected individuals may be preferable.
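
    The design criterion here turns on the sibling recurrence risk ratio, whose standard definition is:

        % Sibling recurrence risk ratio (standard definition):
        \[
          \lambda_s \;=\; \frac{K_s}{K},
        \]
        % where K_s is the disease risk for siblings of an affected
        % individual and K is the population prevalence; per the abstract,
        % small lambda_s favors sequencing cases with a positive family
        % history, while large lambda_s can favor unselected cases.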

  1. Next Generation Extended Lagrangian Quantum-based Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Negre, Christian

    2017-06-01

    A new framework for extended Lagrangian first-principles molecular dynamics simulations is presented, which overcomes shortcomings of regular, direct Born-Oppenheimer molecular dynamics, while maintaining important advantages of the unified extended Lagrangian formulation of density functional theory pioneered by Car and Parrinello three decades ago. The new framework allows, for the first time, energy-conserving, linear-scaling Born-Oppenheimer molecular dynamics simulations, which are necessary to study larger and more realistic systems over longer simulation times than previously possible. Expensive self-consistent-field optimizations are avoided, and normal integration time steps of regular, direct Born-Oppenheimer molecular dynamics can be used. Linear-scaling electronic structure theory is presented using a graph-based approach that is ideal for parallel calculations on hybrid computer platforms. For the first time, quantum-based Born-Oppenheimer molecular dynamics simulation is becoming a practically feasible approach for simulations of 100,000+ atoms, representing a competitive alternative to classical polarizable force field methods. In collaboration with Anders Niklasson, Los Alamos National Laboratory.
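
    The extended-Lagrangian idea can be summarized schematically (our simplified rendering of XL-BOMD, not the talk's exact equations): an auxiliary electronic degree of freedom n is propagated alongside the nuclei by a harmonic restraint toward the ground-state density, removing the per-step SCF optimization.

        % Schematic XL-BOMD equations of motion (simplified; notation ours):
        % nuclei R_I and auxiliary density n evolve as
        \[
          M_I \ddot{R}_I = -\,\frac{\partial U[R;\, n]}{\partial R_I},
          \qquad
          \ddot{n} = \omega^2 \big( \rho[n] - n \big),
        \]
        % where rho[n] approximates the self-consistent ground-state density
        % and omega sets the curvature of the harmonic backbone; n shadows
        % the Born-Oppenheimer solution without SCF iteration at each step.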

  2. Multichannel blind iterative image restoration.

    PubMed

    Sroubek, Filip; Flusser, Jan

    2003-01-01

    Blind image deconvolution is required in many applications of microscopy imaging, remote sensing, and astronomical imaging. Unfortunately, in a single-channel framework, serious conceptual and numerical problems are often encountered. Very recently, an eigenvector-based method (EVAM) was proposed for a multichannel framework which perfectly determines the convolution masks in a noise-free environment, provided a channel disparity condition, called co-primeness, is satisfied. We propose a novel iterative algorithm based on recent anisotropic denoising techniques of total variation and a Mumford-Shah functional with the EVAM restoration condition included. A linearization scheme of half-quadratic regularization together with a cell-centered finite difference discretization scheme is used in the algorithm and provides a unified approach to the solution of total variation or Mumford-Shah. The algorithm performs well even on very noisy images and does not require an exact estimation of mask orders. We demonstrate the capabilities of the algorithm on synthetic data. Finally, the algorithm is applied to defocused images taken with a digital camera and to data from astronomical ground-based observations of the Sun.
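
    The multichannel setting removes the single-channel ambiguity through the cross-relation behind EVAM: in the noise-free case g_k = u * h_k, so h_j * g_k = h_k * g_j for every channel pair. The regularized problem the algorithm addresses can then be written as (notation ours):

        % Regularized multichannel blind deconvolution (notation ours): with
        % observed channels g_k, shared image u, and per-channel masks h_k,
        \[
          \min_{u,\{h_k\}} \;\sum_k \big\| h_k * u - g_k \big\|_2^2
          \;+\; \lambda\, \mathrm{TV}(u)
          \;+\; \gamma \sum_{j<k} \big\| h_j * g_k - h_k * g_j \big\|_2^2 ,
        \]
        % where the last term enforces the EVAM cross-relation (exact for
        % co-prime, noise-free channels) and TV(u) may be swapped for a
        % Mumford-Shah term within the same half-quadratic scheme.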

  3. Probabilistic Graphical Model Representation in Phylogenetics

    PubMed Central

    Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.

    2014-01-01

    Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well suited to teaching statistical models, facilitating communication among phylogeneticists, and developing generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
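
    The core idea, factorizing a joint distribution into per-node conditionals that each depend only on their parents, can be illustrated with a toy three-level hierarchy. The model below is invented for illustration and is not one of the paper's phylogenetic examples.

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_model(n_branches=5):
            # ancestral sampling: each node is drawn given only its parents
            rate = rng.gamma(shape=2.0, scale=0.5)                      # root node
            lengths = rng.exponential(1.0 / rate, size=n_branches)      # | rate
            tips = rng.poisson(10.0 * lengths)                          # | lengths
            return {"rate": rate, "branch_lengths": lengths, "tips": tips}

        # the joint factorizes as p(rate) * p(lengths | rate) * p(tips | lengths),
        # which is exactly the structure a graphical model records
        print(sample_model())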

  4. Using enterprise architecture to analyse how organisational structure impact motivation and learning

    NASA Astrophysics Data System (ADS)

    Närman, Pia; Johnson, Pontus; Gingnell, Liv

    2016-06-01

    When technology, environment, or strategies change, organisations need to adjust their structures accordingly. These structural changes do not always enhance organisational performance as intended, partly because organisational developers do not understand how structural changes affect performance. This article presents a model-based framework for quantitative analysis of the effect of organisational structure on organisational performance in terms of employee motivation and learning. The model is based on Mintzberg's work on organisational structure. The quantitative analysis is formalised using the Object Constraint Language (OCL) and the Unified Modelling Language (UML) and implemented in an enterprise architecture tool.

  5. Generalized Stefan-Boltzmann Law

    NASA Astrophysics Data System (ADS)

    Montambaux, Gilles

    2018-03-01

    We reconsider L. Boltzmann's thermodynamic derivation of the Stefan law and generalize it to various physical systems whose chemical potential vanishes. Because it relies only on classical arguments, and is therefore independent of the quantum statistics, the derivation applies equally to the saturated Bose gas in various geometries and to a "compensated" Fermi gas near a neutrality point, such as a gas of Weyl fermions. It unifies within the same framework the thermodynamics of many different non-interacting bosonic or fermionic gases which were until now described in completely different contexts.
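
    For readers who want the one-line version of the argument, here is a sketch assuming an equation of state p = u/w with constant w (w = 3 for the photon gas); this is a schematic reading of the abstract, not the paper's derivation:

        \left(\frac{\partial U}{\partial V}\right)_T
            = T\left(\frac{\partial p}{\partial T}\right)_V - p,
        \qquad U = u(T)\,V,\qquad p = \frac{u}{w}
        \;\Longrightarrow\;
        u = \frac{T}{w}\frac{du}{dT} - \frac{u}{w}
        \;\Longrightarrow\;
        T\frac{du}{dT} = (w+1)\,u
        \;\Longrightarrow\;
        u \propto T^{\,w+1}.

    Setting w = 3 recovers the Stefan law u ∝ T^4; other values of w describe gases with a different pressure-energy relation, which is how the derivation extends beyond photons.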

  6. Cognitive Hypnotherapy as a Transdiagnostic Protocol for Emotional Disorders.

    PubMed

    Alladin, Assen; Amundson, Jon

    2016-01-01

    This article describes cognitive hypnotherapy (CH), an integrative treatment that provides an evidence-based framework for synthesizing clinical practice and research. CH combines hypnotherapy with cognitive-behavior therapy in the management of emotional disorders. This blended version of clinical practice meets criteria for an assimilative model of integrative psychotherapy, which incorporates both theory and empirical findings. Issues related to (a) additive effect of hypnosis in treatment, (b) transdiagnostic consideration, and (c) unified treatment protocols in the treatment of emotional disorders are considered in light of cognitive hypnotherapy.

  7. The kinetic origin of delayed yielding in metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Y. F.; Liu, X. D.; Wang, S.

    2016-06-20

    Recent experiments showed that irreversible structural change or plasticity can occur in metallic glasses (MGs) even within the apparent elastic limit after a sufficiently long waiting time. To explain this phenomenon, a stochastic shear transformation model based on a unified rate theory is developed to predict delayed yielding in MGs, and it is validated through extensive atomistic simulations carried out on different MGs. On a fundamental level, an analytic framework is established in this work that links time, stress, and temperature together into a general yielding criterion for MGs.
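
    The time-stress-temperature link can be pictured with a schematic transition-state estimate of the waiting time; the barrier form, exponent, and all numbers below are illustrative assumptions, not the authors' model.

        import numpy as np

        kB = 8.617e-5                         # Boltzmann constant, eV/K

        def waiting_time(sigma, T, dG0=1.0, sigma_c=2.0, t0=1e-12, alpha=1.5):
            # thermally activated shear transformations: the barrier shrinks
            # as the applied stress approaches the athermal strength sigma_c
            barrier = dG0 * (1.0 - sigma / sigma_c) ** alpha      # eV
            return t0 * np.exp(barrier / (kB * T))                # seconds

        # below the apparent elastic limit, yielding still occurs -- just later
        for sigma in (1.0, 1.5, 1.9):                             # GPa, hypothetical
            print(sigma, waiting_time(sigma, T=300.0))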

  8. A unified material decomposition framework for quantitative dual- and triple-energy CT imaging.

    PubMed

    Zhao, Wei; Vernekohl, Don; Han, Fei; Han, Bin; Peng, Hao; Yang, Yong; Xing, Lei; Min, James K

    2018-04-21

    Many clinical applications depend critically on the accurate differentiation and classification of different types of materials in patient anatomy. This work introduces a unified framework for accurate nonlinear material decomposition and applies it, for the first time, to the concept of triple-energy CT (TECT) for enhanced material differentiation and classification, as well as to dual-energy CT (DECT). We express the polychromatic projection in terms of a linear combination of line integrals of material-selective images. The material decomposition is then turned into a problem of minimizing the least-squares difference between measured and estimated CT projections. The optimization problem is solved iteratively by updating the line integrals. The proposed technique is evaluated using several numerical phantom measurements under different scanning protocols. The triple-energy data acquisition is implemented at the scales of micro-CT and clinical CT imaging with a commercial "TwinBeam" dual-source DECT configuration and a fast kV-switching DECT configuration. Material decomposition and quantitative comparison with a photon counting detector, and in the presence of a bow-tie filter, are also performed. The proposed method provides quantitative material- and energy-selective images for realistic configurations of both DECT and TECT measurements. Compared to the polychromatic kV CT images, virtual monochromatic images show superior image quality. For the mouse phantom, quantitative measurements show that the differences between gadodiamide and iodine concentrations obtained using TECT and idealized photon counting CT (PCCT) are smaller than 8 and 1 mg/mL, respectively. TECT outperforms DECT for multicontrast CT imaging and is robust with respect to spectrum estimation. For the thorax phantom, the differences between the concentrations of the contrast map and the corresponding true reference values are smaller than 7 mg/mL for all of the realistic configurations. A unified framework for both DECT and TECT imaging has been established for the accurate extraction of material compositions using currently available commercial DECT configurations. The novel technique is promising to provide an urgently needed solution for several CT-based diagnostic and therapy applications, especially for the diagnosis of cardiovascular and abdominal diseases where multicontrast imaging is involved. © 2018 American Association of Physicists in Medicine.
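
    The optimization structure of the decomposition step can be sketched in a few lines. The spectra and attenuation curves below are made-up placeholders standing in for calibrated quantities, so this illustrates the least-squares formulation only, not the authors' implementation.

        import numpy as np
        from scipy.optimize import least_squares

        E = np.linspace(20.0, 120.0, 101)              # keV grid
        mu = np.stack([200.0 / E**2, 5.0 / E])         # toy attenuation curves, 2 materials
        # three toy spectra standing in for the three TECT measurements
        spectra = [np.exp(-0.5 * ((E - e0) / 12.0) ** 2) for e0 in (50.0, 70.0, 90.0)]
        spectra = [s / s.sum() for s in spectra]

        def projections(A):
            # polychromatic log-projections for line integrals A (one per material)
            att = np.exp(-mu.T @ A)                    # transmission at each energy
            return np.array([-np.log(s @ att) for s in spectra])

        A_true = np.array([1.5, 0.8])
        p_meas = projections(A_true)                   # noiseless toy measurement
        fit = least_squares(lambda A: projections(A) - p_meas,
                            x0=np.array([1.0, 1.0]), bounds=(0.0, np.inf))
        print(fit.x)                                   # recovers A_true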

  9. A framework for streamlining research workflow in neuroscience and psychology

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment-building, analysis and manuscript-preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for the automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers. PMID:24478691

  10. Reasons to Conserve Nature.

    PubMed

    Pearson, Richard G

    2016-05-01

    Is it sufficient to base arguments for conservation on the intrinsic value of nature, regardless of the services and economic benefits that biodiversity provides for humans? This question underlies much recent debate that has been at times acrimonious and has led to calls for a more inclusive approach to conservation. Yet melding different ideologies within a unified conceptual framework has proven difficult. Here I describe an approach that recognizes the importance of the level of biological organization and spatial extent in determining the strength of alternative arguments for why we should conserve nature. I argue that the framework helps reconcile contrasting viewpoints and brings clarity to when different conservation management approaches (for instance, regulation versus monetary valuation) are most appropriate. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  11. A Unified Approach to Model-Based Planning and Execution

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Norvig, Peter (Technical Monitor)

    2000-01-01

    Writing autonomous software is complex, requiring the coordination of functionally and technologically diverse software modules. System and mission engineers must rely on specialists familiar with the different software modules to translate requirements into application software. Also, each module often encodes the same requirement in different forms. The results are high costs and reduced reliability due to the difficulty of tracking discrepancies in these encodings. In this paper we describe an approach to planning and execution that we believe provides a unified representational and computational framework for an autonomous agent. We identify the four main components whose interplay provides the basis for the agent's autonomous behavior: the domain model, the plan database, the plan running module, and the planner modules. This representational and problem-solving approach can be applied at all levels of the architecture of a complex agent, such as Remote Agent. In the rest of the paper we briefly describe the Remote Agent architecture. The new agent architecture proposed here aims at achieving the full Remote Agent functionality. We then give the fundamental ideas behind the new agent architecture and point out some implications of the structure of the architecture, mainly in the areas of reactivity and the interaction between reactive and deliberative decision making. We conclude with related work and current status.

  12. LIFE CYCLE ENGINEERING GUIDELINES

    EPA Science Inventory

    This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and usable as possible, a unifying LCE framework is presented. Subsequent topics ...

  13. Unified approach to redshift in cosmological/black hole spacetimes and synchronous frame

    NASA Astrophysics Data System (ADS)

    Toporensky, A. V.; Zaslavskii, O. B.; Popov, S. B.

    2018-01-01

    Usually, the interpretation of redshift in static spacetimes (for example, near black holes) is opposed to that in cosmology. In this methodological note, we show that both explanations are unified in a natural picture. This is achieved if, considering the static spacetime, one (i) makes a transition to a synchronous frame, and (ii) returns to the original frame by means of a local Lorentz boost. To reach our goal, we consider a rather general class of spherically symmetric spacetimes. In doing so, we construct frames that generalize the well-known Lemaître and Painlevé-Gullstrand ones and elucidate the relation between them. This helps us to understand, in a unifying approach, how gravitation reveals itself in different branches of general relativity. This framework can be useful for general relativity university courses.

  14. Impact of Beads and Drops on a Repellent Solid Surface: A Unified Description

    NASA Astrophysics Data System (ADS)

    Arora, S.; Fromental, J.-M.; Mora, S.; Phou, Ty; Ramos, L.; Ligoure, C.

    2018-04-01

    We investigate freely expanding sheets formed by ultrasoft gel beads, and liquid and viscoelastic drops, produced by the impact of the bead or drop on a silicon wafer covered with a thin layer of liquid nitrogen that suppresses viscous dissipation thanks to an inverse Leidenfrost effect. Our experiments show a unified behavior for the impact dynamics that holds for solids, liquids, and viscoelastic fluids and that we rationalize by properly taking into account elastocapillary effects. In this framework, the classical impact dynamics of solids and liquids, as far as viscous dissipation is negligible, appears as the asymptotic limits of a universal theoretical description. A novel material-dependent characteristic velocity that includes both capillary and bulk elasticity emerges from this unified description of the physics of impact.

  15. SCIFIO: an extensible framework to support scientific image formats.

    PubMed

    Hiner, Mark C; Rueden, Curtis T; Eliceiri, Kevin W

    2016-12-07

    No gold standard exists in the world of scientific image acquisition; a proliferation of instruments each with its own proprietary data format has made out-of-the-box sharing of that data nearly impossible. In the field of light microscopy, the Bio-Formats library was designed to translate such proprietary data formats to a common, open-source schema, enabling sharing and reproduction of scientific results. While Bio-Formats has proved successful for microscopy images, the greater scientific community was lacking a domain-independent framework for format translation. SCIFIO (SCientific Image Format Input and Output) is presented as a freely available, open-source library unifying the mechanisms of reading and writing image data. The core of SCIFIO is its modular definition of formats, the design of which clearly outlines the components of image I/O to encourage extensibility, facilitated by the dynamic discovery of the SciJava plugin framework. SCIFIO is structured to support coexistence of multiple domain-specific open exchange formats, such as Bio-Formats' OME-TIFF, within a unified environment. SCIFIO is a freely available software library developed to standardize the process of reading and writing scientific image formats.

  16. Evolutionary dynamics of tree invasions: complementing the unified framework for biological invasions.

    PubMed

    Zenni, Rafael Dudeque; Dickie, Ian A; Wingfield, Michael J; Hirsch, Heidi; Crous, Casparus J; Meyerson, Laura A; Burgess, Treena I; Zimmermann, Thalita G; Klock, Metha M; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J

    2016-12-30

    Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics, and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand, and manage biological invasions. Published by Oxford University Press on behalf of the Annals of Botany Company.

  17. Evolutionary dynamics of tree invasions: complementing the unified framework for biological invasions

    PubMed Central

    Dickie, Ian A.; Wingfield, Michael J.; Hirsch, Heidi; Crous, Casparus J.; Meyerson, Laura A.; Burgess, Treena I.; Zimmermann, Thalita G.; Klock, Metha M.; Siemann, Evan; Erfmeier, Alexandra; Aragon, Roxana; Montti, Lia; Le Roux, Johannes J.

    2017-01-01

    Evolutionary processes greatly impact the outcomes of biological invasions. An extensive body of research suggests that invasive populations often undergo phenotypic and ecological divergence from their native sources. Evolution also operates at different and distinct stages during the invasion process. Thus, it is important to incorporate evolutionary change into frameworks of biological invasions because it allows us to conceptualize how these processes may facilitate or hinder invasion success. Here, we review such processes, with an emphasis on tree invasions, and place them in the context of the unified framework for biological invasions. The processes and mechanisms described are pre-introduction evolutionary history, sampling effect, founder effect, genotype-by-environment interactions, admixture, hybridization, polyploidization, rapid evolution, epigenetics and second-genomes. For the last, we propose that co-evolved symbionts, both beneficial and harmful, which are closely physiologically associated with invasive species, contain critical genetic traits that affect the evolutionary dynamics of biological invasions. By understanding the mechanisms underlying invasion success, researchers will be better equipped to predict, understand and manage biological invasions. PMID:28039118

  18. Managing urban water systems with significant adaptation deficits - a unified framework for secondary cities

    NASA Astrophysics Data System (ADS)

    Pathirana, A.; Radhakrishnan, M.; Zevenbergen, C.; Quan, N. H.

    2016-12-01

    Addressing the shortcomings of urban systems (the 'adaptation deficit') and the shortcomings of the response to climate change (the 'adaptation gap') are both major challenges in maintaining the livability and sustainability of cities. However, adaptation actions of type I (addressing adaptation deficits) and type II (addressing adaptation gaps) often compete and conflict with each other in the secondary cities of the global south. Extending the concept of the environmental Kuznets curve, this paper argues that a unified framework calling for synergistic action on type I and type II adaptation is essential if these cities are to maintain their livability, sustainability and resilience in the face of extreme rates of urbanization and the rapid onset of climate change. The proposed framework is demonstrated in Can Tho, Vietnam, where there are significant adaptation deficits due to rapid urbanisation and adaptation gaps due to climate change and socio-economic changes. The analysis in Can Tho reveals a lack of integration between type I and type II measures, which could be overcome by closer coordination among stakeholders in planning, prioritising and implementing the adaptation measures.

  19. Unified framework for automated iris segmentation using distantly acquired face images.

    PubMed

    Tan, Chun-Wei; Kumar, Ajay

    2012-09-01

    Remote human identification using iris biometrics has important civilian and surveillance applications, and its success requires the development of a robust segmentation algorithm to automatically extract the iris region. This paper presents a new iris segmentation framework which can robustly segment iris images acquired under near-infrared or visible illumination. The proposed approach exploits multiple higher-order local pixel dependencies to robustly classify the eye-region pixels into iris and noniris regions. Face and eye detection modules have been incorporated in the unified framework to automatically provide the localized eye region from the facial image for iris segmentation. We develop robust postprocessing operations to effectively mitigate the noisy pixels caused by misclassification. Experimental results presented in this paper suggest significant improvements in the average segmentation errors over previously proposed approaches, i.e., 47.5%, 34.1%, and 32.6% on the UBIRIS.v2, FRGC, and CASIA.v4 at-a-distance databases, respectively. The usefulness of the proposed approach is also ascertained from recognition experiments on three different publicly available databases.

  20. Leaving behind our preparadigmatic past: Professional psychology as a unified clinical science.

    PubMed

    Melchert, Timothy P

    2016-09-01

    The behavioral and neurosciences have made remarkable progress recently in advancing the scientific understanding of human psychology. Though research in many areas is still in its early stages, knowledge of many psychological processes is now firmly grounded in experimental tests of falsifiable theories and supports a unified, paradigmatic understanding of human psychology that is thoroughly consistent with the rest of the natural sciences. This new body of knowledge poses critical questions for professional psychology, which still often relies on the traditional theoretical orientations and other preparadigmatic practices for guiding important aspects of clinical education and practice. This article argues that professional psychology needs to systematically transition to theoretical frameworks and a curriculum that are based on an integrated scientific understanding of human psychology. Doing so would be of historic importance for the field and would result in major changes to professional psychology education and practice. It would also allow the field to emerge as a true clinical science. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Adaptive evolutionary conservation: towards a unified concept for defining conservation units.

    PubMed

    Fraser, D J; Bernatchez, L

    2001-12-01

    Recent years have seen a debate over various methods that could objectively prioritize conservation value below the species level. Most prominent among these has been the evolutionarily significant unit (ESU). We reviewed ESU concepts with the aim of proposing a more unified concept that would reconcile opposing views. Like species concepts, conflicting ESU concepts are all essentially aiming to define the same thing: segments of species whose divergence can be measured or evaluated by putting differential emphasis on the role of evolutionary forces at varied temporal scales. Thus, differences between ESU concepts lie more in the criteria used to define ESUs than in their fundamental essence. We provide a context-based framework for delineating ESUs which circumvents much of this debate. Rather than becoming embroiled in a muddled debate over an optimal criterion, the key to a solution is accepting that different criteria will work better in some situations than in others, and that they can be used alone or in combination depending on the situation. These assertions constitute the impetus behind adaptive evolutionary conservation.

  2. Simultaneous CT-MRI Reconstruction for Constrained Imaging Geometries using Structural Coupling and Compressive Sensing

    PubMed Central

    Xi, Yan; Zhao, Jun; Bennett, James R.; Stacy, Mitchel R.; Sinusas, Albert J.; Wang, Ge

    2016-01-01

    Objective: A unified reconstruction framework is presented for simultaneous CT-MRI reconstruction. Significance: Combined CT-MRI imaging has the potential for improved results in existing preclinical and clinical applications, as well as opening novel research directions for future applications. Methods: In an ideal CT-MRI scanner, CT and MRI acquisitions would occur simultaneously, and hence would be inherently registered in space and time. Alternatively, separately acquired CT and MRI scans can be fused to simulate an instantaneous acquisition. In this study, structural coupling and compressive sensing techniques are combined to unify CT and MRI reconstructions. A bidirectional image estimation method is proposed to connect images from the different modalities. Hence, CT and MRI data serve as prior knowledge for each other, yielding better CT and MRI image reconstruction than could be achieved with separate reconstructions. Results: Our integrated reconstruction methodology is demonstrated with numerical-phantom and real-dataset experiments and has yielded promising results. PMID:26672028

  3. Information Object Definition–based Unified Modeling Language Representation of DICOM Structured Reporting

    PubMed Central

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804

  4. Continuous structural evolution of calcium carbonate particles: a unifying model of copolymer-mediated crystallization.

    PubMed

    Kulak, Alex N; Iddon, Peter; Li, Yuting; Armes, Steven P; Cölfen, Helmut; Paris, Oskar; Wilson, Rory M; Meldrum, Fiona C

    2007-03-28

    Two double-hydrophilic block copolymers, each comprising a nonionic block and an anionic block bearing pendent aromatic sulfonate groups, were used as additives to modify the crystallization of CaCO3. Marked morphological changes in the CaCO3 particles were observed depending on the reaction conditions used. A poly(ethylene oxide)-b-poly(sodium 4-styrenesulfonate) diblock copolymer was particularly versatile in effecting a morphological change in calcite particles, and a continuous structural transition in the product particles from polycrystalline to mesocrystal to single crystal was observed with variation in the calcium concentration. The existence of this structural sequence provides unique insight into the mechanism of polymer-mediated crystallization. We propose that it reflects continuity in the crystallization mechanism itself, spanning the limits from nonoriented aggregation of nanoparticles to classical ion-by-ion growth. The various pathways to polycrystalline, mesocrystal, and single-crystal particles, which had previously been considered to be distinct, therefore all form part of a unifying crystallization framework based on the aggregation of precursor subunits.

  5. Word-level language modeling for P300 spellers based on discriminative graphical models

    NASA Astrophysics Data System (ADS)

    Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat

    2015-04-01

    Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
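
    The word-level idea reduces to Bayes' rule over a limited vocabulary: combine per-letter classifier scores with a word prior. The vocabulary, priors, and scores below are toy values (classifier outputs are treated as log-likelihoods), and the real system performs inference in a graphical model rather than brute-force enumeration.

        import numpy as np

        vocab = {"cat": 0.5, "car": 0.3, "cap": 0.2}       # hypothetical word priors
        letters = "abcdefghijklmnopqrstuvwxyz"

        rng = np.random.default_rng(0)
        scores = rng.normal(size=(3, 26))                  # per-position letter scores
        scores[0, letters.index("c")] += 3.0               # strong evidence for 'c'
        scores[1, letters.index("a")] += 3.0               # strong evidence for 'a'
        scores[2, letters.index("r")] += 1.0               # weak evidence for 'r'

        # word posterior ~ prior(word) * prod_t p(EEG_t | letter_t)
        logpost = {w: np.log(pr) + sum(scores[t, letters.index(w[t])] for t in range(3))
                   for w, pr in vocab.items()}
        m = max(logpost.values())
        unnorm = {w: np.exp(v - m) for w, v in logpost.items()}
        total = sum(unnorm.values())
        print({w: v / total for w, v in unnorm.items()})
        # the word-level coupling is what lets evidence about one letter
        # revise the belief about the letters around it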

  6. A Unified Framework for Periodic, On-Demand, and User-Specified Software Information

    NASA Technical Reports Server (NTRS)

    Kolano, Paul Z.

    2004-01-01

    Although grid computing can increase the number of resources available to a user, not all resources on the grid may have a software environment suitable for running a given application. To provide users with the necessary assistance for selecting resources with compatible software environments and/or for automatically establishing such environments, it is necessary to have an accurate source of information about the software installed across the grid. This paper presents a new OGSI-compliant software information service that has been implemented as part of NASA's Information Power Grid project. This service is built on top of a general framework for reconciling information from periodic, on-demand, and user-specified sources. Information is retrieved using standard XPath queries over a single unified namespace, independent of the information's source. Two consumers of the provided software information, the IPG Resource Broker and the IPG Neutralization Service, are briefly described.
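
    The consumption model is easy to picture: whatever the source (periodic crawl, on-demand probe, or user registration), consumers see a single document tree and query it with XPath. The XML layout below is a hypothetical stand-in, not the service's actual schema.

        import xml.etree.ElementTree as ET

        doc = ET.fromstring("""
        <software>
          <host name="node1">
            <package name="gcc" version="3.2"/>
            <package name="mpich" version="1.2.5"/>
          </host>
          <host name="node2">
            <package name="gcc" version="2.95"/>
          </host>
        </software>
        """)

        # one XPath-style query, independent of where the records came from
        for host in doc.findall(".//host"):
            for pkg in host.findall("package[@name='gcc']"):
                print(host.get("name"), pkg.get("version"))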

  7. Semantically enabled image similarity search

    NASA Astrophysics Data System (ADS)

    Casterline, May V.; Emerick, Timothy; Sadeghi, Kolia; Gosse, C. A.; Bartlett, Brent; Casey, Jason

    2015-05-01

    Georeferenced data of various modalities are increasingly available for intelligence and commercial use; however, effectively exploiting these sources demands a unified data space capable of capturing the unique contribution of each input. This work presents a suite of software tools for representing geospatial vector data and overhead imagery in a shared high-dimensional vector or "embedding" space that supports fused learning and similarity search across dissimilar modalities. While the approach is suitable for fusing arbitrary input types, including free text, the present work exploits the obvious but computationally difficult relationship between GIS and overhead imagery. GIS provides temporally smoothed but information-limited content, while overhead imagery provides an information-rich but temporally limited perspective. This processing framework includes some important extensions of concepts in the literature but, more critically, presents a means to accomplish them as a unified framework at scale on commodity cloud architectures.
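
    Once heterogeneous inputs live in one embedding space, cross-modal retrieval reduces to nearest-neighbor search. A minimal sketch with random stand-in vectors follows; a real system would use trained encoders and an approximate index.

        import numpy as np

        rng = np.random.default_rng(1)
        # rows are items (GIS features or image chips) that upstream encoders
        # have mapped into the same 128-dimensional space
        emb = rng.normal(size=(1000, 128))
        emb /= np.linalg.norm(emb, axis=1, keepdims=True)     # unit vectors

        def topk(query, k=5):
            # cosine similarity search: identical no matter which modality
            # the query came from -- the point of a shared embedding space
            q = query / np.linalg.norm(query)
            sims = emb @ q
            idx = np.argsort(-sims)[:k]
            return list(zip(idx.tolist(), sims[idx].round(3).tolist()))

        print(topk(emb[42]))      # the item itself ranks first, similarity 1.0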

  8. Motor symptoms in Parkinson's disease: A unified framework.

    PubMed

    Moustafa, Ahmed A; Chakravarthy, Srinivasa; Phillips, Joseph R; Gupta, Ankur; Keri, Szabolcs; Polner, Bertalan; Frank, Michael J; Jahanshahi, Marjan

    2016-09-01

    Parkinson's disease (PD) is characterized by a range of motor symptoms. Besides the cardinal symptoms (akinesia and bradykinesia, tremor and rigidity), PD patients show additional motor deficits, including: gait disturbance, impaired handwriting, grip force and speech deficits, among others. Some of these motor symptoms (e.g., deficits of gait, speech, and handwriting) have similar clinical profiles, neural substrates, and respond similarly to dopaminergic medication and deep brain stimulation (DBS). Here, we provide an extensive review of the clinical characteristics and neural substrates of each of these motor symptoms, to highlight precisely how PD and its medical and surgical treatments impact motor symptoms. In conclusion, we offer a unified framework for understanding the range of motor symptoms in PD. We argue that various motor symptoms in PD reflect dysfunction of neural structures responsible for action selection, motor sequencing, and coordination and execution of movement. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling

    NASA Technical Reports Server (NTRS)

    Glaab, Patricia; Madden, Michael

    2014-01-01

    The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof of concept, the LaSRS++ NAS-wide simulation has been maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed, along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulation are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.

  10. Depth Reconstruction from Single Images Using a Convolutional Neural Network and a Condition Random Field Model.

    PubMed

    Liu, Dan; Liu, Xuejun; Wu, Yiguang

    2018-04-24

    This paper presents an effective approach for depth reconstruction from a single image through the incorporation of semantic information and local details from the image. A unified framework for depth acquisition is constructed by joining a deep Convolutional Neural Network (CNN) and a continuous pairwise Conditional Random Field (CRF) model. Semantic information and the relative depth trends of local regions inside the image are integrated into the framework. A deep CNN is first used to automatically learn a hierarchical feature representation of the image. To capture more local details, the relative depth trends of local regions are incorporated into the network. Combined with the semantic information of the image, a continuous pairwise CRF is then established and used as the loss function of the unified model. Experiments on real scenes demonstrate that the proposed approach is effective and obtains satisfactory results.

  11. Robust nonlinear control of vectored thrust aircraft

    NASA Technical Reports Server (NTRS)

    Doyle, John C.; Murray, Richard; Morris, John

    1993-01-01

    An interdisciplinary program in robust control for nonlinear systems with applications to a variety of engineering problems is outlined. Major emphasis will be placed on flight control, with both experimental and analytical studies. This program builds on recent new results in control theory for stability, stabilization, robust stability, robust performance, synthesis, and model reduction in a unified framework using Linear Fractional Transformations (LFTs), Linear Matrix Inequalities (LMIs), and the structured singular value μ. Most of these new advances have been accomplished by the Caltech controls group independently or in collaboration with researchers in other institutions. These recent results offer a new and remarkably unified framework for all aspects of robust control, but what is particularly important for this program is that they also have important implications for system identification and control of nonlinear systems. This combines well with Caltech's expertise in nonlinear control theory, both in geometric methods and methods for systems with constraints and saturations.

  12. Snoopy--a unifying Petri net framework to investigate biomolecular networks.

    PubMed

    Rohr, Christian; Marwan, Wolfgang; Heiner, Monika

    2010-04-01

    To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
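
    The qualitative core of a Petri net simulator fits in a few lines: a marking, pre/post arc weights, and a firing rule. The toy enzymatic model below is for illustration only and is not Snoopy code.

        pre  = {"bind":  {"E": 1, "S": 1},      # tokens consumed per firing
                "react": {"ES": 1}}
        post = {"bind":  {"ES": 1},             # tokens produced per firing
                "react": {"E": 1, "P": 1}}
        marking = {"E": 1, "S": 3, "ES": 0, "P": 0}

        def enabled(t):
            return all(marking[p] >= n for p, n in pre[t].items())

        def fire(t):
            for p, n in pre[t].items():
                marking[p] -= n
            for p, n in post[t].items():
                marking[p] = marking.get(p, 0) + n

        # fire greedily until deadlock: E + S -> ES -> E + P
        while any(enabled(t) for t in pre):
            fire(next(t for t in pre if enabled(t)))
        print(marking)   # all substrate converted: {'E': 1, 'S': 0, 'ES': 0, 'P': 3}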

  13. VisRseq: R-based visual framework for analysis of sequencing data

    PubMed Central

    2015-01-01

    Background: Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. Results: We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible environment for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto-generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. Conclusions: To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights. PMID:26328469

  14. VisRseq: R-based visual framework for analysis of sequencing data.

    PubMed

    Younesy, Hamid; Möller, Torsten; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M

    2015-01-01

    Several tools have been developed to enable biologists to perform initial browsing and exploration of sequencing data. However, the computational tool set for further analyses often requires significant computational expertise, and many of the biologists with the knowledge needed to interpret these data must rely on programming experts. We present VisRseq, a framework for analysis of sequencing datasets that provides a computationally rich and accessible environment for integrative and interactive analyses without requiring programming expertise. We achieve this aim by providing R apps, which offer a semi-auto-generated and unified graphical user interface for computational packages in R and repositories such as Bioconductor. To address the interactivity limitation inherent in R libraries, our framework includes several native apps that provide exploration and brushing operations as well as an integrated genome browser. The apps can be chained together to create more powerful analysis workflows. To validate the usability of VisRseq for analysis of sequencing data, we present two case studies performed by our collaborators and report their workflow and insights.

  15. Using Competency-Based Digital Open Learning Activities to Facilitate and Promote Health Professions Education (OLAmeD): A Proposal

    PubMed Central

    Barman, Linda

    2016-01-01

    Background: Traditional learning in medical education has been transformed with the advent of information technology. We have recently seen global initiatives to produce online activities in an effort to scale up learning opportunities through learning management systems and massive open online courses for both undergraduate and continued professional education. Despite the positive impact of such efforts, factors such as cost, time, resources, and the specificity of educational contexts restrict the design and exchange of online medical educational activities. Objective: The goal is to address the stated issues within the health professions education context while promoting learning by proposing the Online Learning Activities for Medical Education (OLAmeD) concept, which builds on unified competency frameworks and generic technical standards for education. Methods: We outline how frameworks used to describe a set of competencies for a specific topic in medical education across medical schools in the United States and Europe can be compared to identify commonalities that could result in a unified set of competencies representing both contexts adequately. Further, we examine how technical standards could be used to allow standardization, seamless sharing, and reusability of educational content. Results: The entire process of developing and sharing OLAmeD is structured and presented in a set of steps, using urology, a part of the clinical surgery specialization, as an example. Conclusions: Beyond supporting the development, sharing, and repurposing of educational content, we expect OLAmeD to work as a tool that promotes learning and sets a base for a community of medical educational content developers across different educational contexts. PMID:27390226

  16. Using Competency-Based Digital Open Learning Activities to Facilitate and Promote Health Professions Education (OLAmeD): A Proposal.

    PubMed

    Vaitsis, Christos; Stathakarou, Natalia; Barman, Linda; Zary, Nabil; McGrath, Cormac

    2016-07-07

    Traditional learning in medical education has been transformed with the advent of information technology. We have recently seen global initiatives to produce online activities in an effort to scale up learning opportunities through learning management systems and massive open online courses for both undergraduate and continued professional education. Despite the positive impact of such efforts, factors such as cost, time, resources, and the specificity of educational contexts restrict the design and exchange of online medical educational activities. The goal is to address the stated issues within the health professions education context while promoting learning by proposing the Online Learning Activities for Medical Education (OLAmeD) concept, which builds on unified competency frameworks and generic technical standards for education. We outline how frameworks used to describe a set of competencies for a specific topic in medical education across medical schools in the United States and Europe can be compared to identify commonalities that could result in a unified set of competencies representing both contexts adequately. Further, we examine how technical standards could be used to allow standardization, seamless sharing, and reusability of educational content. The entire process of developing and sharing OLAmeD is structured and presented in a set of steps, using urology, a part of the clinical surgery specialization, as an example. Beyond supporting the development, sharing, and repurposing of educational content, we expect OLAmeD to work as a tool that promotes learning and sets a base for a community of medical educational content developers across different educational contexts.

  17. Unifying Screening Processes Within the PROSPR Consortium: A Conceptual Model for Breast, Cervical, and Colorectal Cancer Screening

    PubMed Central

    Kim, Jane J.; Schapira, Marilyn M.; Tosteson, Anna N. A.; Zauber, Ann G.; Geiger, Ann M.; Kamineni, Aruna; Weaver, Donald L.; Tiro, Jasmin A.

    2015-01-01

    General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (eg, primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights across each organ site similarities and differences in steps, interfaces, and transitions in the screening process and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute–funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites. PMID:25957378

  18. Nonnegative definite EAP and ODF estimation via a unified multi-shell HARDI reconstruction.

    PubMed

    Cheng, Jian; Jiang, Tianzi; Deriche, Rachid

    2012-01-01

    In High Angular Resolution Diffusion Imaging (HARDI), the Orientation Distribution Function (ODF) and the Ensemble Average Propagator (EAP) are two important Probability Density Functions (PDFs) which reflect water diffusion and fiber orientations. Spherical Polar Fourier Imaging (SPFI) is a recent model-free multi-shell HARDI method which estimates both the EAP and the ODF from diffusion signals with multiple b values. As physical PDFs, ODFs and EAPs are nonnegative definite in their respective domains S^2 and R^3. However, existing ODF/EAP estimation methods like SPFI seldom consider this natural constraint. Although some works considered the nonnegativity constraint on given discrete samples of the ODF/EAP, the estimated ODF/EAP is not guaranteed to be nonnegative definite in the whole continuous domain. The Riemannian framework for ODFs and EAPs has been proposed via the square-root parameterization, based on ODFs and EAPs pre-estimated by other methods like SPFI. However, there is no work on how to estimate the square root of the ODF/EAP, called the wavefunction, directly from diffusion signals. In this paper, based on the Riemannian framework for ODFs/EAPs and the Spherical Polar Fourier (SPF) basis representation, we propose a unified model-free multi-shell HARDI method, named Square Root Parameterized Estimation (SRPE), to simultaneously estimate both the wavefunction of the EAP and the nonnegative definite ODF and EAP from diffusion signals. Experiments on synthetic and real data showed that SRPE is more robust to noise and has better EAP reconstruction than SPFI, especially for EAP profiles at large radius.
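
    The key trick, square-root parameterization, is easy to demonstrate in one dimension. With a cosine basis on [0, 1] (a stand-in for the SPF basis used in the paper), any unit-norm coefficient vector yields a density that is nonnegative everywhere and integrates to exactly one.

        import numpy as np

        x = np.linspace(0.0, 1.0, 2001)
        # orthonormal basis on [0, 1]: phi_0 = 1, phi_k = sqrt(2) cos(k pi x)
        phi = [np.ones_like(x)] + [np.sqrt(2.0) * np.cos(k * np.pi * x) for k in (1, 2, 3)]

        c = np.array([0.8, 0.5, -0.2, 0.1])
        c /= np.linalg.norm(c)                 # ||c|| = 1 enforces a unit integral

        sqrt_f = sum(ck * pk for ck, pk in zip(c, phi))
        f = sqrt_f ** 2                        # nonnegative definite by construction

        print(f.min() >= 0.0)                  # True
        print(f.mean())                        # ~1.0: the integral of f over [0, 1]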

  19. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.

    PubMed

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology.
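
    The waveform-relaxation idea can be sketched with two leaky integrators coupled by a toy gap-junction conductance: each sweep integrates every cell over the whole communication interval using the other cell's waveform from the previous sweep, then iterates to convergence. This shows the numerical principle only; the scheme used in NEST is considerably more refined.

        import numpy as np

        g, I1, dt, T = 0.5, 1.0, 0.01, 2.0     # coupling, drive, step, window
        t = np.arange(0.0, T + dt, dt)
        v1 = np.zeros_like(t)                  # initial waveform guesses
        v2 = np.zeros_like(t)

        for sweep in range(50):
            v1_old, v2_old = v1.copy(), v2.copy()
            v1, v2 = np.zeros_like(t), np.zeros_like(t)
            for n in range(len(t) - 1):        # explicit Euler inside the window
                v1[n + 1] = v1[n] + dt * (-v1[n] + g * (v2_old[n] - v1[n]) + I1)
                v2[n + 1] = v2[n] + dt * (-v2[n] + g * (v1_old[n] - v2[n]))
            if max(np.abs(v1 - v1_old).max(), np.abs(v2 - v2_old).max()) < 1e-10:
                break
        print(sweep, v1[-1], v2[-1])           # converged coupled waveforms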

  20. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    PubMed Central

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology. PMID:26441628

  1. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
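
    The core mechanism is compact enough to sketch: simulate a correlated standard-normal "parent" process, then push it through the Gaussian CDF and a target inverse CDF. The mixed-type marginal below (60% dry steps, gamma-distributed wet amounts) is an illustrative choice, not one of the paper's case studies.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n, rho = 10000, 0.8                    # AR(1) parent-Gaussian correlation
        z = np.empty(n)
        z[0] = rng.standard_normal()
        for i in range(1, n):
            z[i] = rho * z[i - 1] + np.sqrt(1.0 - rho**2) * rng.standard_normal()

        u = stats.norm.cdf(z)                  # probability integral transform
        p0 = 0.6                               # probability of a dry step
        q = np.maximum((u - p0) / (1.0 - p0), 0.0)
        x = np.where(u <= p0, 0.0,             # atom at zero: intermittency
                     stats.gamma.ppf(q, a=0.8, scale=2.0))

        print((x == 0).mean())                 # ~0.6
        print(np.corrcoef(x[:-1], x[1:])[0, 1])  # inherited, but damped, correlation

    Note that the lag-1 correlation of x comes out weaker than the parent's 0.8; choosing the parent correlation that yields a prescribed target correlation after the transformation is exactly what the paper's correlation transformation functions formalize.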

  2. Toxicology ontology perspectives.

    PubMed

    Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae

    2012-01-01

    The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.

  3. A unifying framework for rigid multibody dynamics and serial and parallel computational issues

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Jain, Abhinandan

    1989-01-01

    A unifying framework for various formulations of the dynamics of open-chain rigid multibody systems is discussed. Their suitability for serial and parallel processing is assessed. The framework is based on the derivation of intrinsic, i.e., coordinate-free, equations of the algorithms, which provides a suitable abstraction and permits a distinction to be made between the computational redundancy in the intrinsic and extrinsic equations. A spatial notation is used which allows the derivation of the various algorithms in a common setting and thus clarifies the relationships among them. The three classes of algorithms, viz. O(n), O(n^2) and O(n^3), for the solution of the dynamics problem are investigated. Researchers begin with the derivation of O(n^3) algorithms based on the explicit computation of the mass matrix, which provides insight into the underlying basis of the O(n) algorithms. From a computational perspective, the optimal choice of a coordinate frame for the projection of the intrinsic equations is discussed and the serial computational complexity of the different algorithms is evaluated. The three classes of algorithms are also analyzed for suitability for parallel processing. It is shown that the problem belongs to the class NC and that the time and processor bounds are O(log^2(n)) and O(n^4), respectively. However, the algorithm that achieves these bounds is not stable. Researchers show that the fastest stable parallel algorithm achieves a computational complexity of O(n) with O(n^2) processors, and results from the parallelization of the O(n^3) serial algorithm.

  4. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models.

    PubMed

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S; Wu, Xiaowei; Müller, Rolf

    2018-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design.

  5. A Unified Analysis of Structured Sonar-terrain Data using Bayesian Functional Mixed Models

    PubMed Central

    Zhu, Hongxiao; Caspers, Philip; Morris, Jeffrey S.; Wu, Xiaowei; Müller, Rolf

    2017-01-01

    Sonar emits pulses of sound and uses the reflected echoes to gain information about target objects. It offers a low cost, complementary sensing modality for small robotic platforms. While existing analytical approaches often assume independence across echoes, real sonar data can have more complicated structures due to device setup or experimental design. In this paper, we consider sonar echo data collected from multiple terrain substrates with a dual-channel sonar head. Our goals are to identify the differential sonar responses to terrains and study the effectiveness of this dual-channel design in discriminating targets. We describe a unified analytical framework that achieves these goals rigorously, simultaneously, and automatically. The analysis was done by treating the echo envelope signals as functional responses and the terrain/channel information as covariates in a functional regression setting. We adopt functional mixed models that facilitate the estimation of terrain and channel effects while capturing the complex hierarchical structure in data. This unified analytical framework incorporates both Gaussian models and robust models. We fit the models using a full Bayesian approach, which enables us to perform multiple inferential tasks under the same modeling framework, including selecting models, estimating the effects of interest, identifying significant local regions, discriminating terrain types, and describing the discriminatory power of local regions. Our analysis of the sonar-terrain data identifies time regions that reflect differential sonar responses to terrains. The discriminant analysis suggests that a multi- or dual-channel design achieves target identification performance comparable with or better than a single-channel design. PMID:29749977

  6. Rapid development of entity-based data models for bioinformatics with persistence object-oriented design and structured interfaces.

    PubMed

    Ezra Tsur, Elishai

    2017-01-01

    Databases are imperative for research in bioinformatics and computational biology. Current challenges in database design include data heterogeneity and context-dependent interconnections between data entities. These challenges drove the development of unified data interfaces and specialized databases. The curation of specialized databases is an ever-growing challenge due to the introduction of new data sources and the emergence of new relational connections between established datasets. Here, an open-source framework for the curation of specialized databases is proposed. The framework supports user-designed models of data encapsulation, object persistency and structured interfaces to local and external data sources such as MalaCards, Biomodels and the National Centre for Biotechnology Information (NCBI) databases. The proposed framework was implemented using Java as the development environment, EclipseLink as the data persistency agent and Apache Derby as the database manager. Syntactic analysis was based on the J3D, jsoup, Apache Commons and w3c.dom open libraries. Finally, the construction of a specialized database for aneurysm-associated vascular diseases is demonstrated. This database contains 3-dimensional geometries of aneurysms, patients' clinical information, articles, biological models, related diseases and our recently published model of aneurysms' risk of rupture. The framework is available at: http://nbel-lab.com.

  7. Statistical mechanics framework for static granular matter.

    PubMed

    Henkes, Silke; Chakraborty, Bulbul

    2009-06-01

    The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, for which we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.

  8. Architectural Framework for Addressing Legacy Waste from the Cold War - 13611

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Love, Gregory A.; Glazner, Christopher G.; Steckley, Sam

    We present an architectural framework for the use of a hybrid simulation model of enterprise-wide operations used to develop system-level insight into the U.S. Department of Energy's (DOE) environmental cleanup of legacy nuclear waste at the Savannah River Site. We use this framework for quickly exploring policy and architectural options, analyzing plans, addressing management challenges and developing mitigation strategies for the DOE Office of Environmental Management (EM). The socio-technical complexity of EM's mission compels the use of a qualitative approach to complement a more quantitative discrete event modeling effort. We use this model-based analysis to pinpoint pressure and leverage points and to develop a shared conceptual understanding of the problem space and a platform for communication among stakeholders across the enterprise in a timely manner. This approach affords the opportunity to discuss problems using a unified conceptual perspective and is also general enough that it applies to a broad range of capital investment/production operations problems. (authors)

  9. Brain-Mind Operational Architectonics Imaging: Technical and Methodological Aspects

    PubMed Central

    Fingelkurts, Andrew A; Fingelkurts, Alexander A

    2008-01-01

    This review paper deals with the methodological and technical foundations of the Operational Architectonics framework of brain and mind functioning. This theory provides a framework for mapping and understanding important aspects of the brain mechanisms that constitute perception, cognition, and eventually consciousness. The methods utilized within the Operational Architectonics framework make it possible to analyze in remarkable detail the operational behavior of local neuronal assemblies and their joint activity in the form of unified and metastable operational modules, which constitute the whole hierarchy of brain operations, operations of cognition and phenomenal consciousness. PMID:19526071

  10. Reframing Information Literacy as a Metaliteracy

    ERIC Educational Resources Information Center

    Mackey, Thomas P.; Jacobson, Trudi E.

    2011-01-01

    Social media environments and online communities are innovative collaborative technologies that challenge traditional definitions of information literacy. Metaliteracy is an overarching and self-referential framework that integrates emerging technologies and unifies multiple literacy types. This redefinition of information literacy expands the…

  11. Probabilistic framework for the estimation of the adult and child toxicokinetic intraspecies uncertainty factors.

    PubMed

    Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S

    2003-12-01

    Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with the toxicokinetic component of the UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than the applied dose. It allows for the replacement of the default adult and child intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates of their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
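
    A toy Monte Carlo version of this framework, assuming a one-compartment kinetic model and a hypothetical lognormal clearance distribution (neither comes from the article), shows how the data-derived UFH-TK falls out as a percentile ratio of the simulated tissue-dose distribution:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000                                     # simulated individuals
        dose = 1.0                                      # constant intake, mg/kg/day (hypothetical)
        # Hypothetical population PDF for clearance; the article derives such
        # PDFs from physiologically-based toxicokinetic parameters.
        clearance = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)   # L/h/kg

        c_ss = dose / clearance                         # steady-state target-tissue dose
        uf_tk = np.percentile(c_ss, 95) / np.percentile(c_ss, 50)
        print(f"data-derived intraspecies toxicokinetic UF ~ {uf_tk:.2f}")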

  12. A unified probabilistic framework for spontaneous facial action modeling and understanding.

    PubMed

    Tong, Yan; Chen, Jixu; Ji, Qiang

    2010-02-01

    Facial expression is a natural and powerful means of human communication. Recognizing spontaneous facial actions, however, is very challenging due to subtle facial deformation, frequent head movements, and ambiguous and uncertain facial motion measurements. Because of these challenges, current research in facial expression recognition is limited to posed expressions and often in frontal view. A spontaneous facial expression is characterized by rigid head movements and nonrigid facial muscular movements. More importantly, it is the coherent and consistent spatiotemporal interactions among rigid and nonrigid facial motions that produce a meaningful facial expression. Recognizing this fact, we introduce a unified probabilistic facial action model based on the Dynamic Bayesian network (DBN) to simultaneously and coherently represent rigid and nonrigid facial motions, their spatiotemporal dependencies, and their image measurements. Advanced machine learning methods are introduced to learn the model based on both training data and subjective prior knowledge. Given the model and the measurements of facial motions, facial action recognition is accomplished through probabilistic inference by systematically integrating visual measurements with the facial action model. Experiments show that compared to the state-of-the-art techniques, the proposed system yields significant improvements in recognizing both rigid and nonrigid facial motions, especially for spontaneous facial expressions.

  13. Geometric rectification of camera-captured document images.

    PubMed

    Liang, Jian; DeMenthon, Daniel; Doermann, David

    2008-04-01

    Compared to typical scanners, handheld cameras offer convenient, flexible, portable, and non-contact image capture, which enables many new applications and breathes new life into existing ones. However, camera-captured documents may suffer from distortions caused by non-planar document shape and perspective projection, which lead to failure of current OCR technologies. We present a geometric rectification framework for restoring the frontal-flat view of a document from a single camera-captured image. Our approach estimates 3D document shape from texture flow information obtained directly from the image without requiring additional 3D/metric data or prior camera calibration. Our framework provides a unified solution for both planar and curved documents and can be applied in many, especially mobile, camera-based document analysis applications. Experiments show that our method produces results that are significantly more OCR compatible than the original images.

  14. Structured sparse linear graph embedding.

    PubMed

    Wang, Haixian

    2012-03-01

    Subspace learning is a core issue in pattern recognition and machine learning. Linear graph embedding (LGE) is a general framework for subspace learning. In this paper, we propose a structured sparse extension to LGE (SSLGE) by introducing a structured sparsity-inducing norm into LGE. Specifically, SSLGE casts the projection-bases learning into a regression-type optimization problem, and the structured sparsity regularization is then applied to the regression coefficients. The regularization selects a subset of features and meanwhile encodes high-order information reflecting the a priori structure of the data. The SSLGE technique provides a unified framework for discovering structured sparse subspaces. Computationally, by using a variational equality and the Procrustes transformation, SSLGE is efficiently solved with closed-form updates. Experimental results on face images show the effectiveness of the proposed method. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Image-guided regularization level set evolution for MR image segmentation and bias field correction.

    PubMed

    Wang, Lingfeng; Pan, Chunhong

    2014-01-01

    Magnetic resonance (MR) image segmentation is a crucial step in surgical and treatment planning. In this paper, we propose a level-set-based segmentation method for MR images affected by intensity inhomogeneity. To tackle the initialization sensitivity problem, we propose a new image-guided regularization to restrict the level set function. Maximum a posteriori inference is adopted to unify segmentation and bias field correction within a single framework. Under this framework, both the contour prior and the bias field prior are fully used. As a result, the image intensity inhomogeneity can be handled well. Extensive experiments are provided to evaluate the proposed method, showing significant improvements in both segmentation and bias field correction accuracy as compared with other state-of-the-art approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Theory of the Origin, Evolution, and Nature of Life

    PubMed Central

    Andrulis, Erik D.

    2011-01-01

    Life is an inordinately complex unsolved puzzle. Despite significant theoretical progress, experimental anomalies, paradoxes, and enigmas have revealed paradigmatic limitations. Thus, the advancement of scientific understanding requires new models that resolve fundamental problems. Here, I present a theoretical framework that economically fits evidence accumulated from examinations of life. This theory is based upon a straightforward and non-mathematical core model and proposes unique yet empirically consistent explanations for major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations. The theoretical framework unifies the macrocosmic and microcosmic realms, validates predicted laws of nature, and solves the puzzle of the origin and evolution of cellular life in the universe. PMID:25382118

  17. From 16-bit to high-accuracy IDCT approximation: fruits of single architecture affiliation

    NASA Astrophysics Data System (ADS)

    Liu, Lijie; Tran, Trac D.; Topiwala, Pankaj

    2007-09-01

    In this paper, we demonstrate an effective unified framework for high-accuracy approximation of the irrational-coefficient floating-point IDCT by a single integer-coefficient fixed-point architecture. Our framework is based on a modified version of Loeffler's sparse DCT factorization, and the IDCT architecture is constructed via a cascade of dyadic lifting steps and butterflies. We illustrate that simply varying the accuracy of the approximating parameters yields a large family of standard-compliant IDCTs, from 16-bit approximations catering to portable computing to ultra-high-accuracy 32-bit versions that virtually eliminate any drifting effect when paired with the 64-bit floating-point IDCT at the encoder. The drifting performance of the proposed IDCTs, along with that of existing popular IDCT algorithms in H.263+, MPEG-2 and MPEG-4, is also demonstrated.
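
    As an illustration of the cascade-of-lifting-steps idea, the sketch below factors one plane rotation into three shears and quantizes the shear coefficients to dyadic fractions. The angle, test vector, and word lengths are hypothetical, not the paper's actual factorization, but varying the bit width reproduces the accuracy-family behavior described above.

        import math

        theta = math.pi / 8                                 # one Loeffler-style rotation angle
        alpha = (math.cos(theta) - 1) / math.sin(theta)     # shear coefficient, = -tan(theta/2)
        sine = math.sin(theta)

        def dyadic(v, bits):
            """Quantize v to k / 2**bits; raising `bits` trades speed for accuracy."""
            return round(v * (1 << bits)) / (1 << bits)

        def lifted_rotation(x, y, bits):
            a, s = dyadic(alpha, bits), dyadic(sine, bits)
            x += a * y      # shear 1: a multiply-shift-add in fixed-point hardware
            y += s * x      # shear 2
            x += a * y      # shear 3: together the three shears form the rotation
            return x, y

        x0, y0 = 181.0, -75.0
        exact = (x0 * math.cos(theta) - y0 * math.sin(theta),
                 x0 * math.sin(theta) + y0 * math.cos(theta))
        for bits in (4, 8, 16):
            approx = lifted_rotation(x0, y0, bits)
            err = max(abs(a - e) for a, e in zip(approx, exact))
            print(f"{bits:2d}-bit coefficients: max error = {err:.5f}")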

  18. Medical privacy protection based on granular computing.

    PubMed

    Wang, Da-Wei; Liau, Churn-Jung; Hsu, Tsan-Sheng

    2004-10-01

    Based on granular computing methodology, we propose two criteria to quantitatively measure privacy invasion. The total cost criterion measures the effort needed for a data recipient to find private information. The average benefit criterion measures the benefit a data recipient obtains upon receiving the released data. These two criteria remedy the inadequacy of the deterministic privacy formulation proposed in Proceedings of Asia Pacific Medical Informatics Conference, 2000; Int J Med Inform 2003;71:17-23. Granular computing methodology provides a unified framework for these quantitative measurements and for the previous bin-size and logical approaches. The two new criteria are implemented in a prototype system, Cellsecu 2.0. A preliminary system performance evaluation is conducted and reviewed.
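
    The following toy sketch gives one possible reading of the bin-size intuition behind the two criteria; the record grouping and the formulas are illustrative assumptions, not the paper's formal granular-computing definitions. Records sharing released quasi-identifiers form bins: smaller bins mean a higher expected benefit for the recipient and a lower search cost.

        from collections import Counter

        # Hypothetical released records: (sex, age) quasi-identifiers only.
        released = [("M", 40), ("M", 40), ("F", 35), ("M", 40), ("F", 35), ("F", 62)]
        bins = Counter(released)

        # Toy "average benefit": mean probability of pinpointing a record within its bin.
        avg_benefit = sum(1 / bins[r] for r in released) / len(released)
        # Toy "cost": average number of candidate records a recipient must sift through.
        avg_cost = sum(bins[r] for r in released) / len(released)

        print(f"average benefit = {avg_benefit:.3f}, average cost = {avg_cost:.2f}")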

  19. High-Contrast Gratings based Spoof Surface Plasmons

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Liangliang; Xu, Bingzheng; Ning, Pingping; Chen, Chen; Xu, Jia; Chen, Xinlei; Gu, Changqing; Qing, Quan

    2016-02-01

    In this work, we explore the existence of spoof surface plasmons (SSPs) supported by deep-subwavelength high-contrast gratings (HCGs) on a perfect electric conductor plane. The dispersion relation of the HCG-based SSPs is derived analytically by combining multimode network theory with a rigorous mode matching method; it has nearly the same form as, and can be degenerated into, that of the SSPs arising from deep-subwavelength metallic gratings (MGs). Numerical simulations validate the analytical dispersion relation, and an effective medium approximation is also presented that yields the same analytical dispersion formula. This work sets up a unified theoretical framework for SSPs and opens up new vistas in surface plasmon optics.

  20. Perspectives for elucidating the ethylenediurea (EDU) mode of action for protection against O3 phytotoxicity.

    PubMed

    Agathokleous, Evgenios

    2017-08-01

    Ethylenediurea (EDU) has been widely studied for its effectiveness in protecting plants against injuries caused by surface ozone (O3); however, its mode of action remains unclear. So far there is no unified methodological approach, and the methodology remains quite arbitrary, making it difficult to generalize findings and understand the EDU mode of action. This review examines the question of whether potential N addition to plants by EDU is a fundamental underlying mechanism in protecting against O3 phytotoxicity. It further proposes an evidence-based hypothesis that EDU may protect plants against deleterious O3 effects through EDU-induced hormesis, i.e. by activating plant defense at low doses. This hypothesis challenges future research directions. Revealing a hormesis-based EDU mode of action in protecting plants against O3 toxicity would have further implications for ecotoxicology and environmental safety. Furthermore, this review discusses the need for further studies on plant metabolism under EDU treatment through relevant experimental approaches, and attempts to set the bases for a unified methodology that will contribute to revealing the EDU mode of action. In this framework, focus is given to the main EDU application methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Unified Deep Learning Architecture for Modeling Biology Sequence.

    PubMed

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

    Prediction of the spatial structure or function of biological macromolecules based on their sequences remains an important challenge in bioinformatics. When modeling biological sequences with traditional sequence models, characteristics such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or gated recurrent units to capture long-range interactions, designing an optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm that supports sequence models capable of processing variable-length sequences. Additionally, merge and pooling operators enhance the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm may be capable of solving currently known biological sequence-modeling problems within a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating that the model obtains predictions of protein residue interactions that exceed the accuracy of currently popular approaches by 10% on multiple benchmarks.
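
    A minimal PyTorch sketch of the named ingredients, assuming toy sizes and a mean-pooling operator (this is not the authors' architecture): a bidirectional LSTM over packed variable-length sequences, followed by pooling over the valid time steps.

        import torch
        import torch.nn as nn
        from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

        lstm = nn.LSTM(input_size=20, hidden_size=64, batch_first=True, bidirectional=True)

        batch = torch.randn(3, 50, 20)            # 3 sequences padded to length 50
        lengths = torch.tensor([50, 33, 17])      # true lengths before padding

        packed = pack_padded_sequence(batch, lengths, batch_first=True, enforce_sorted=False)
        out, _ = lstm(packed)
        out, _ = pad_packed_sequence(out, batch_first=True)      # shape (3, 50, 128)

        # Mean-pool only over the valid (unpadded) time steps of each sequence
        mask = (torch.arange(50)[None, :] < lengths[:, None]).unsqueeze(-1).float()
        pooled = (out * mask).sum(dim=1) / lengths[:, None].float()
        print(pooled.shape)                       # torch.Size([3, 128])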

  2. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.; hide

    2008-01-01

    Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional scale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM enables global coverage, and the use of a CRM allows for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through the Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art weather research and forecasting model (WRF), and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.

  3. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background Combining different sources of knowledge to build improved structure activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework to interoperate between learning techniques. Most of the current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility to directly combine these sources at the knowledge level, with the aim to harvest potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge into a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems along with an illustrative application to the prediction of mutagenicity. Conclusion It is possible to represent knowledge in the unified form of a hypothesis network allowing interpretable predictions with performances comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206

  4. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining.

    PubMed

    Hero, Alfred O; Rajaratnam, Bala

    2016-01-01

    When can reliable inference be drawn in the "Big Data" context? This paper presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large scale inference. In large scale data applications like genomics, connectomics, and eco-informatics, the dataset is often variable-rich but sample-starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for "Big Data". Sample complexity however has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; 3) the purely high dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exascale data dimension. We illustrate this high dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that is of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high dimensional learning rates and sample complexity for different structured covariance models and different inference tasks.
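
    A small simulation makes the sample-starved regime tangible (synthetic independent data; the sizes are arbitrary): with n fixed, the largest spurious sample correlation grows with p, which is why naive correlation screening needs the kind of sample-complexity theory developed here.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 30                                   # fixed number of samples
        for p in (10, 100, 1000):
            x = rng.standard_normal((n, p))      # independent variables: true correlation = 0
            r = np.corrcoef(x, rowvar=False)
            np.fill_diagonal(r, 0.0)             # ignore the trivial self-correlations
            print(f"p={p:5d}  max |sample correlation| = {np.abs(r).max():.2f}")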

  5. A UNIFIED FRAMEWORK FOR VARIANCE COMPONENT ESTIMATION WITH SUMMARY STATISTICS IN GENOME-WIDE ASSOCIATION STUDIES.

    PubMed

    Zhou, Xiang

    2017-12-01

    Linear mixed models (LMMs) are among the most commonly used tools for genetic association studies. However, the standard method for estimating variance components in LMMs, the restricted maximum likelihood estimation method (REML), suffers from several important drawbacks: REML requires individual-level genotypes and phenotypes from all samples in the study, is computationally slow, and produces downward-biased estimates in case control studies. To remedy these drawbacks, we present an alternative framework for variance component estimation, which we refer to as MQS. MQS is based on the method of moments (MoM) and the minimal norm quadratic unbiased estimation (MINQUE) criterion, and brings two seemingly unrelated methods, the renowned Haseman-Elston (HE) regression and the recent LD score regression (LDSC), into the same unified statistical framework. With this new framework, we provide an alternative but mathematically equivalent form of HE that allows for the use of summary statistics. We provide an exact estimation form of LDSC to yield unbiased and statistically more efficient estimates. A key feature of our method is its ability to pair marginal z-scores computed using all samples with SNP correlation information computed using a small random subset of individuals (or individuals from a proper reference panel), while being capable of producing estimates that can be almost as accurate as if both quantities were computed using the full data. As a result, our method produces unbiased and statistically efficient estimates, and makes use of summary statistics, while it is computationally efficient for large data sets. Using simulations and applications to 37 phenotypes from 8 real data sets, we illustrate the benefits of our method for estimating and partitioning SNP heritability in population studies as well as for heritability estimation in family studies. Our method is implemented in the GEMMA software package, freely available at www.xzlab.org/software.html.
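
    In the spirit of the HE-regression component of MQS, here is a self-contained method-of-moments sketch on synthetic data (not the GEMMA implementation): regressing off-diagonal phenotype products on kinship entries recovers SNP heritability.

        import numpy as np

        rng = np.random.default_rng(3)
        n, p, h2 = 1_000, 2_000, 0.5
        g = rng.binomial(2, 0.3, size=(n, p)).astype(float)
        g = (g - g.mean(0)) / g.std(0)                   # standardized genotypes
        beta = rng.normal(0, np.sqrt(h2 / p), size=p)    # SNP effects under a polygenic model
        y = g @ beta + rng.normal(0, np.sqrt(1 - h2), size=n)
        y = (y - y.mean()) / y.std()                     # standardized phenotype

        k = g @ g.T / p                                  # genetic relatedness matrix
        iu = np.triu_indices(n, k=1)                     # off-diagonal pairs only
        # MoM slope: E[y_i * y_j] = h2 * K_ij for i != j
        h2_hat = (y[iu[0]] * y[iu[1]]) @ k[iu] / (k[iu] @ k[iu])
        print(f"true h2 = {h2}, HE-type estimate = {h2_hat:.3f}")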

  6. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    PubMed

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
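
    A toy forward-Euler version of the delayed rate dynamics described above, for a hypothetical two-unit network. NEST's actual implementation communicates through the event infrastructure and waveform relaxation; this only illustrates the kind of equation being integrated, tau * dx_i/dt = -x_i + sum_j w_ij * x_j(t - d).

        import numpy as np

        dt, tau, delay, t_end = 0.1, 10.0, 1.0, 100.0
        steps, d_steps = int(t_end / dt), int(delay / dt)
        w = np.array([[0.0, -0.8],
                      [0.9,  0.0]])             # excitatory/inhibitory coupling (made up)
        x = np.zeros((steps + 1, 2))
        x[0] = [1.0, 0.0]                       # initial rates

        for k in range(steps):
            x_delayed = x[max(k - d_steps, 0)]  # delayed interaction term
            x[k + 1] = x[k] + dt / tau * (-x[k] + w @ x_delayed)

        print("final rates:", np.round(x[-1], 4))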

  7. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator

    PubMed Central

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation. PMID:28596730

  8. Accuracy of the unified approach in maternally influenced traits - illustrated by a simulation study in the honey bee (Apis mellifera)

    PubMed Central

    2013-01-01

    Background The honey bee is an economically important species. With a rapid decline of the honey bee population, it is necessary to implement an improved genetic evaluation methodology. In this study, we investigated the applicability of the unified approach and its impact on the accuracy of estimation of breeding values for maternally influenced traits on a simulated dataset for the honey bee. Due to the limitation to the number of individuals that can be genotyped in a honey bee population, the unified approach can be an efficient strategy to increase the genetic gain and to provide a more accurate estimation of breeding values. We calculated the accuracy of estimated breeding values for two evaluation approaches, the unified approach and the traditional pedigree based approach. We analyzed the effects of different heritabilities as well as genetic correlation between direct and maternal effects on the accuracy of estimation of direct, maternal and overall breeding values (sum of maternal and direct breeding values). The genetic and reproductive biology of the honey bee was accounted for by taking into consideration characteristics such as colony structure, uncertain paternity, overlapping generations and polyandry. In addition, we used a modified numerator relationship matrix and a realistic genome for the honey bee. Results For all values of heritability and correlation, the accuracy of overall estimated breeding values increased significantly with the unified approach. The increase in accuracy was always higher for the case when there was no correlation as compared to the case where a negative correlation existed between maternal and direct effects. Conclusions Our study shows that the unified approach is a useful methodology for genetic evaluation in honey bees, and can contribute immensely to the improvement of traits of apicultural interest such as resistance to Varroa or production and behavioural traits. In particular, the study is of great interest for cases where negative correlation between maternal and direct effects and uncertain paternity exist, thus, is of relevance for other species as well. The study also provides an important framework for simulating genomic and pedigree datasets that will prove to be helpful for future studies. PMID:23647776

  9. Accuracy of the unified approach in maternally influenced traits--illustrated by a simulation study in the honey bee (Apis mellifera).

    PubMed

    Gupta, Pooja; Reinsch, Norbert; Spötter, Andreas; Conrad, Tim; Bienefeld, Kaspar

    2013-05-06

    The honey bee is an economically important species. With a rapid decline of the honey bee population, it is necessary to implement an improved genetic evaluation methodology. In this study, we investigated the applicability of the unified approach and its impact on the accuracy of estimation of breeding values for maternally influenced traits on a simulated dataset for the honey bee. Due to the limitation to the number of individuals that can be genotyped in a honey bee population, the unified approach can be an efficient strategy to increase the genetic gain and to provide a more accurate estimation of breeding values. We calculated the accuracy of estimated breeding values for two evaluation approaches, the unified approach and the traditional pedigree based approach. We analyzed the effects of different heritabilities as well as genetic correlation between direct and maternal effects on the accuracy of estimation of direct, maternal and overall breeding values (sum of maternal and direct breeding values). The genetic and reproductive biology of the honey bee was accounted for by taking into consideration characteristics such as colony structure, uncertain paternity, overlapping generations and polyandry. In addition, we used a modified numerator relationship matrix and a realistic genome for the honey bee. For all values of heritability and correlation, the accuracy of overall estimated breeding values increased significantly with the unified approach. The increase in accuracy was always higher for the case when there was no correlation as compared to the case where a negative correlation existed between maternal and direct effects. Our study shows that the unified approach is a useful methodology for genetic evaluation in honey bees, and can contribute immensely to the improvement of traits of apicultural interest such as resistance to Varroa or production and behavioural traits. In particular, the study is of great interest for cases where negative correlation between maternal and direct effects and uncertain paternity exist, thus, is of relevance for other species as well. The study also provides an important framework for simulating genomic and pedigree datasets that will prove to be helpful for future studies.

  10. A Web-Based Common Framework to Support the Test and Evaluation Process Any Time, Anywhere, and Anyhow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schur, Anne; Brown, James C.; Eaton, Sharon L.

    Test and evaluation (T and E) is an enterprise. For any product, large or small, performance data is desired on many aspects to evaluate the product's effectiveness for the intended users. Representing the many T and E facets without bewildering the user is challenging when there is a range of people, from the system developers to the manager of the organization, who want specific feedback. A web-based One-Stop Evaluation Center was created to meet these needs for a particular project. The evaluation center is usable at any time in the systems development lifecycle and streamlines the T and E enterprise. This paper discusses a common framework that unifies the T and E process with many stakeholders involved and is flexible enough to accommodate each stakeholder's specific evaluative processes and content. Our success has translated to many cost savings by enabling quick responses to change and a better line of communication between the users, developers, and managers.

  11. Determination of water environment standards based on water quality criteria in China: Limitations and feasibilities.

    PubMed

    Wang, Tieyu; Zhou, Yunqiao; Bi, Cencen; Lu, Yonglong; He, Guizhen; Giesy, John P

    2017-07-01

    There is a need to formulate water environment standards (WESs) from the current water quality criteria (WQC) in China. To this end, we briefly summarize typical mechanisms applied in several countries with longer histories of developing WESs, and three limitations to formulating WESs in China were identified. After analyzing the feasibility factors including economic development, scientific support capability and environmental policies, we realized that China is still not ready for a complete change from its current nation-wide unified WES system to a local-standard-based system. Thus, we proposed a framework for transformation from WQC to WESs in China. The framework consists of three parts, including responsibilities, processes and policies. The responsibilities include research authorization, development of guidelines, and collection of information, at both national and local levels; the processes include four steps and an impact factor system to establish water quality standards; and the policies include seven specific proposals. Copyright © 2016. Published by Elsevier B.V.

  12. Morphological effects on sensitivity of heterogeneous energetic materials

    NASA Astrophysics Data System (ADS)

    Roy, Sidhartha; Rai, Nirmal; Sen, Oishik; Udaykumar, H. S.

    2017-06-01

    The mesoscale physical response of heterogeneous energetics under shock loading is inherently linked to their microstructural characteristics. The current work demonstrates the connection between the microstructural features of a porous energetic material and its sensitivity. A unified level-set-based framework is developed to characterize the microstructures of a given sample. Several morphological metrics describing the mesoscale geometry of the materials are extracted using this tool, including anisotropy, tortuosity, surface-to-volume ratio, nearest neighbors, and size and curvature distributions. The relevant metrics among those extracted are identified and correlated to the mesoscale response of the energetic materials under shock loading. Two classes of problems are considered here: (a) fields of idealized voids embedded in HMX material and (b) real samples of pressed HMX. The effects of the stochasticity associated with void arrangements on the sensitivity of the energetic material samples are shown. In summary, this work demonstrates the relationship between mesoscale morphology and the shock response of heterogeneous energetic materials using a level-set-based framework.
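
    As a schematic of the level-set characterization step, the sketch below builds a signed-distance field for a synthetic field of circular voids (a crude stand-in for a pressed-HMX micrograph; all sizes are made up) and reads off porosity and an approximate surface-to-volume ratio:

        import numpy as np

        n = 256
        yy, xx = np.mgrid[0:n, 0:n] / n                 # unit-square grid
        rng = np.random.default_rng(5)
        phi = np.full((n, n), np.inf)
        for cx, cy, r in rng.random((20, 3)) * [1, 1, 0.04]:
            # Signed distance to the union of circular voids (negative inside)
            phi = np.minimum(phi, np.hypot(xx - cx, yy - cy) - r)

        porosity = (phi < 0).mean()                     # void volume fraction
        # Interface length estimated from the narrow band where |phi| < one grid spacing
        surface = (np.abs(phi) < 1.0 / n).sum() * (1.0 / n) / 2.0
        print(f"porosity = {porosity:.3f}  surface/volume ~ {surface / max(porosity, 1e-9):.1f}")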

  13. The 4D hyperspherical diffusion wavelet: A new method for the detection of localized anatomical variation.

    PubMed

    Hosseinbor, Ameer Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K; Chung, Moo K

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links HyperSPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM.
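
    The equivalence of heat-diffusion smoothing and a diffusion-wavelet-style multiscale filter can be sketched on any Laplacian; the toy below uses a random graph in place of the hypersphere (an illustrative substitution, not the HyperSPHARM basis). Smoothing is a spectral filter exp(-t*lambda) on the Laplacian eigenvalues, with the scale t playing the wavelet-scale role.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 40
        a = (rng.random((n, n)) < 0.15).astype(float)
        a = np.triu(a, 1); a = a + a.T                  # random undirected graph
        lap = np.diag(a.sum(1)) - a                     # graph Laplacian
        w, v = np.linalg.eigh(lap)                      # eigenvalues / eigenvectors

        signal = rng.standard_normal(n)
        for t in (0.1, 1.0, 10.0):                      # coarser scales smooth more
            smoothed = v @ (np.exp(-t * w) * (v.T @ signal))
            print(f"t = {t:5.1f}  signal variance -> {smoothed.var():.3f}")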

  14. The 4D Hyperspherical Diffusion Wavelet: A New Method for the Detection of Localized Anatomical Variation

    PubMed Central

    Hosseinbor, A. Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K.; Chung, Moo K.

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links HyperSPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM. PMID:25320783

  15. RHIC and LHC Phenomena with a Unified Parton Transport

    NASA Astrophysics Data System (ADS)

    Bouras, Ioannis; El, Andrej; Fochler, Oliver; Reining, Felix; Senzel, Florian; Uphoff, Jan; Wesp, Christian; Xu, Zhe; Greiner, Carsten

    We discuss recent applications of the partonic pQCD based cascade model BAMPS, with a focus on heavy-ion phenomenology in the hard and soft momentum range. The nuclear modification factor as well as the elliptic flow are calculated in BAMPS for RHIC and LHC energies. These observables are also discussed within the same framework for charm and bottom quarks. Contributing to the recent jet-quenching investigations, we present first preliminary results on the application of jet reconstruction algorithms in BAMPS. Finally, collective effects induced by jets are investigated: we demonstrate the development of Mach cones in ideal matter as well as in the highly viscous regime.

  16. RHIC and LHC phenomena with a unified parton transport

    NASA Astrophysics Data System (ADS)

    Bouras, Ioannis; El, Andrej; Fochler, Oliver; Reining, Felix; Senzel, Florian; Uphoff, Jan; Wesp, Christian; Xu, Zhe; Greiner, Carsten

    2012-11-01

    We discuss recent applications of the partonic pQCD based cascade model BAMPS, with a focus on heavy-ion phenomenology in the hard and soft momentum range. The nuclear modification factor as well as the elliptic flow are calculated in BAMPS for RHIC and LHC energies. These observables are also discussed within the same framework for charm and bottom quarks. Contributing to the recent jet-quenching investigations, we present first preliminary results on the application of jet reconstruction algorithms in BAMPS. Finally, collective effects induced by jets are investigated: we demonstrate the development of Mach cones in ideal matter as well as in the highly viscous regime.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, J.V.

    The published work on exact penalization is indeed vast. Recently this work has indicated an intimate relationship between exact penalization, Lagrange multipliers, and problem stability or calmness. In the present work we chronicle this development within a simple idealized problem framework, wherein we unify, extend, and refine much of the known theory. In particular, most of the foundations for constrained optimization are developed with the aid of exact penalization techniques. Our approach is highly geometric and is based upon the elementary subdifferential theory for distance functions. It is assumed that the reader is familiar with the theory of convex sets and functions. 54 refs.

  18. PRMS Data Warehousing Prototype

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2001-01-01

    Project and Resource Management System (PRMS) is a web-based, mid-level management tool developed at KSC to provide a unified enterprise framework for Project and Mission management. The addition of a data warehouse as a strategic component to the PRMS is investigated through the analysis, design and implementation processes of a data warehouse prototype. As a proof of concept, a demonstration of the prototype with its OLAP technology for multidimensional data analysis is made. The results of the data analysis and the design constraints are discussed. The prototype can be used to motivate interest and support for an operational data warehouse.

  19. PRMS Data Warehousing Prototype

    NASA Technical Reports Server (NTRS)

    Guruvadoo, Eranna K.

    2002-01-01

    Project and Resource Management System (PRMS) is a web-based, mid-level management tool developed at KSC to provide a unified enterprise framework for Project and Mission management. The addition of a data warehouse as a strategic component to the PRMS is investigated through the analysis, design and implementation processes of a data warehouse prototype. As a proof of concept, a demonstration of the prototype with its OLAP technology for multidimensional data analysis is made. The results of the data analysis and the design constraints are discussed. The prototype can be used to motivate interest and support for an operational data warehouse.

  20. Deciphering the enigma of undetected species, phylogenetic, and functional diversity based on Good-Turing theory.

    PubMed

    Chao, Anne; Chiu, Chun-Huo; Colwell, Robert K; Magnago, Luiz Fernando S; Chazdon, Robin L; Gotelli, Nicholas J

    2017-11-01

    Estimating the species, phylogenetic, and functional diversity of a community is challenging because rare species are often undetected, even with intensive sampling. The Good-Turing frequency formula, originally developed for cryptography, estimates in an ecological context the true frequencies of rare species in a single assemblage based on an incomplete sample of individuals. Until now, this formula has never been used to estimate undetected species, phylogenetic, and functional diversity. Here, we first generalize the Good-Turing formula to incomplete sampling of two assemblages. The original formula and its two-assemblage generalization provide a novel and unified approach to notation, terminology, and estimation of undetected biological diversity. For species richness, the Good-Turing framework offers an intuitive way to derive the non-parametric estimators of the undetected species richness in a single assemblage, and of the undetected species shared between two assemblages. For phylogenetic diversity, the unified approach leads to an estimator of the undetected Faith's phylogenetic diversity (PD, the total length of undetected branches of a phylogenetic tree connecting all species), as well as a new estimator of undetected PD shared between two phylogenetic trees. For functional diversity based on species traits, the unified approach yields a new estimator of undetected Walker et al.'s functional attribute diversity (FAD, the total species-pairwise functional distance) in a single assemblage, as well as a new estimator of undetected FAD shared between two assemblages. Although some of the resulting estimators have been previously published (but derived with traditional mathematical inequalities), all taxonomic, phylogenetic, and functional diversity estimators are now derived under the same framework. All the derived estimators are theoretically lower bounds of the corresponding undetected diversities; our approach reveals the sufficient conditions under which the estimators are nearly unbiased, thus offering new insights. Simulation results are reported to numerically verify the performance of the derived estimators. We illustrate all estimators and assess their sampling uncertainty with an empirical dataset for Brazilian rain forest trees. These estimators should be widely applicable to many current problems in ecology, such as the effects of climate change on spatial and temporal beta diversity and the contribution of trait diversity to ecosystem multi-functionality. © 2017 by the Ecological Society of America.
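
    For species richness the framework reduces to familiar Good-Turing quantities. The toy computation below (hypothetical abundance data) uses the singleton count f1 and doubleton count f2 to produce the classic Chao1-type lower bound S_obs + f1^2 / (2*f2), the kind of non-parametric undetected-richness estimator the paper rederives:

        from collections import Counter

        abundances = [14, 9, 6, 4, 3, 2, 2, 2, 1, 1, 1, 1, 1]   # hypothetical sample counts
        freq = Counter(abundances)                               # frequency of frequencies
        s_obs, f1, f2 = len(abundances), freq[1], freq[2]
        s_est = s_obs + f1**2 / (2 * f2)                         # Chao1 lower bound
        print(f"observed {s_obs} species; estimated >= {s_est:.1f} "
              f"(>= {s_est - s_obs:.1f} undetected)")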

  1. Self-Efficacy: Toward a Unifying Theory of Behavioral Change

    ERIC Educational Resources Information Center

    Bandura, Albert

    1977-01-01

    This research presents an integrative theoretical framework to explain and to predict psychological changes achieved by different modes of treatment. This theory states that psychological procedures, whatever their form, alter the level and strength of "self-efficacy". (Editor/RK)

  2. COMPLEMENTARITY OF ECOLOGICAL GOAL FUNCTIONS

    EPA Science Inventory

    This paper summarizes, in the framework of network environ analysis, a set of analyses of energy-matter flow and storage in steady state systems. The network perspective is used to codify and unify ten ecological orientors or extremal principles: maximum power (Lotka), maximum st...

  3. Chimaera simulation of complex states of flowing matter

    PubMed Central

    2016-01-01

    We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro–meso–micro levels through suitable ‘mutations’ of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698031

  4. Modeling and control of operator functional state in a unified framework of fuzzy inference petri nets.

    PubMed

    Zhang, Jian-Hua; Xia, Jia-Jun; Garibaldi, Jonathan M; Groumpos, Petros P; Wang, Ru-Bin

    2017-06-01

    In human-machine (HM) hybrid control systems, human operator and machine cooperate to achieve the control objectives. To enhance the overall HM system performance, the discrete manual control task-load of the operator must be dynamically allocated in accordance with the continuous-time fluctuation of the psychophysiological functional status of the operator, the so-called operator functional state (OFS). The behavior of the HM system is hybrid in nature due to the co-existence of the discrete task-load (control) variable and the continuous operator performance (system output) variable. The Petri net is an effective tool for modeling discrete event systems, but for hybrid systems that also involve continuous dynamics the Petri net model generally has to be extended. Instead of using different tools to represent the continuous and discrete components of a hybrid system, this paper proposes a fuzzy inference Petri net (FIPN) method to represent the HM hybrid system, comprising a Mamdani-type fuzzy model of OFS and a logical switching controller in a unified framework, in which the task-load level is dynamically reallocated between the operator and machine based on the model-predicted OFS. Furthermore, this paper uses a multi-model approach to predict the operator performance based on three electroencephalographic (EEG) input variables (features) via the Wang-Mendel (WM) fuzzy modeling method. The membership function parameters of the fuzzy OFS model for each experimental participant were optimized using the artificial bee colony (ABC) evolutionary algorithm. Three performance indices, RMSE, MRE, and EPR, were computed to evaluate the overall modeling accuracy. Experimental data from six participants were analyzed. The results show that the proposed method (FIPN with adaptive task allocation) lowers the breakdown rate (from 14.8% to 3.27%) and raises human performance (from 90.30% to 91.99%). The simulation results of the FIPN-based adaptive HM (AHM) system on six experimental participants demonstrate that the FIPN framework provides an effective way to model and regulate/optimize the OFS in HM hybrid systems composed of a continuous-time OFS model and a discrete-event switching controller. Copyright © 2017 Elsevier B.V. All rights reserved.
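
    Since the OFS model above is Mamdani-type, a minimal self-contained sketch of Mamdani fuzzy inference may help. This is not the authors' FIPN or their WM-derived rule base: the single EEG feature, the two rules, and the triangular membership functions are purely illustrative.

    ```python
    import numpy as np

    def trimf(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    # Discretized universe for the (hypothetical) output: task-load level in [0, 1]
    y = np.linspace(0.0, 1.0, 501)

    def mamdani(eeg_feature):
        """Two illustrative rules mapping one EEG feature to a task-load level:
        IF feature is LOW THEN load is HIGH; IF feature is HIGH THEN load is LOW."""
        mu_low = trimf(eeg_feature, -0.5, 0.0, 0.5)           # antecedent memberships
        mu_high = trimf(eeg_feature, 0.5, 1.0, 1.5)
        out1 = np.minimum(mu_low, trimf(y, 0.5, 1.0, 1.5))    # clip HIGH-load set
        out2 = np.minimum(mu_high, trimf(y, -0.5, 0.0, 0.5))  # clip LOW-load set
        agg = np.maximum(out1, out2)                          # aggregate rule outputs
        if agg.sum() == 0.0:
            return 0.5                                        # no rule fired: neutral load
        return float((y * agg).sum() / agg.sum())             # centroid defuzzification

    print(mamdani(0.2))  # a low EEG feature value pushes task-load toward HIGH
    ```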

  5. Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation

    PubMed Central

    Froyen, Vicky; Feldman, Jacob; Singh, Manish

    2015-01-01

    We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548
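
    As a generic illustration of the "grouping as mixture estimation" idea (not the BHG algorithm itself, which uses Bayesian hierarchical clustering rather than EM), the following sketch fits a two-component Gaussian mixture to a synthetic dot-clustering stimulus; the responsibilities play the role of the "ownership" probabilities described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic "dot clustering" stimulus: two clusters of dots in the plane
    dots = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
                      rng.normal([3, 1], 0.5, (50, 2))])

    # EM for a 2-component isotropic Gaussian mixture: the responsibilities
    # say which "object" each dot is owned by.
    mu = dots[rng.choice(len(dots), 2, replace=False)]  # initial means
    var = np.ones(2)
    pi = np.array([0.5, 0.5])
    for _ in range(50):
        # E-step: responsibility of component k for each dot
        d2 = ((dots[:, None, :] - mu[None, :, :]) ** 2).sum(-1)      # (N, 2)
        log_p = np.log(pi) - d2 / (2 * var) - np.log(2 * np.pi * var)
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances from soft assignments
        nk = resp.sum(axis=0)
        pi = nk / len(dots)
        mu = (resp.T @ dots) / nk[:, None]
        var = np.array([(resp[:, k] * ((dots - mu[k]) ** 2).sum(-1)).sum()
                        / (2 * nk[k]) for k in range(2)])

    print(mu.round(2))  # recovered cluster centers near (0, 0) and (3, 1)
    ```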

  6. Semantic Image Segmentation with Contextual Hierarchical Models.

    PubMed

    Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2016-05-01

    Semantic segmentation is the problem of assigning an object label to each pixel. It unifies the image segmentation and object recognition problems. The importance of using contextual information in semantic segmentation frameworks has been widely realized in the field. We propose a contextual framework, called the contextual hierarchical model (CHM), which learns contextual information in a hierarchical framework for semantic segmentation. At each level of the hierarchy, a classifier is trained based on downsampled input images and the outputs of previous levels. Our model then incorporates the resulting multi-resolution contextual information into a classifier to segment the input image at the original resolution. This training strategy allows for optimization of a joint posterior probability at multiple resolutions through the hierarchy. The contextual hierarchical model is purely based on input image patches and does not make use of any fragments or shape examples. Hence, it is applicable to a variety of problems such as object segmentation and edge detection. We demonstrate that CHM performs on par with the state of the art on the Stanford background and Weizmann horse datasets. It also outperforms state-of-the-art edge detection methods on the NYU depth dataset and achieves state-of-the-art results on the Berkeley segmentation dataset (BSDS 500).
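
    The training strategy, classifiers stacked over a multi-resolution pyramid with coarse outputs fed to finer levels as context, can be caricatured in a few lines. The two-level sketch below illustrates that idea only; it is not the authors' CHM, and it assumes scikit-learn's LogisticRegression as a stand-in pixel classifier.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def pool(img, f=2):
        """Average-pool an image by factor f (assumes divisible shape)."""
        h, w = img.shape
        return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

    def unpool(img, f=2):
        return np.repeat(np.repeat(img, f, axis=0), f, axis=1)

    # Toy data: noisy image whose bright square is the "object" to segment
    rng = np.random.default_rng(1)
    label = np.zeros((32, 32)); label[8:24, 8:24] = 1
    image = label + rng.normal(0, 0.6, label.shape)

    # Coarse level: classify pooled pixels from intensity alone
    img1, lab1 = pool(image), (pool(label) > 0.5).astype(int)
    clf1 = LogisticRegression().fit(img1.reshape(-1, 1), lab1.ravel())
    ctx = clf1.predict_proba(img1.reshape(-1, 1))[:, 1].reshape(img1.shape)

    # Fine level: intensity plus upsampled coarse context as features
    X = np.stack([image.ravel(), unpool(ctx).ravel()], axis=1)
    clf0 = LogisticRegression().fit(X, label.ravel().astype(int))
    seg = clf0.predict(X).reshape(image.shape)
    print((seg == label).mean())  # pixel accuracy with multi-resolution context
    ```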

  7. Information spreading in Delay Tolerant Networks based on nodes' behaviors

    NASA Astrophysics Data System (ADS)

    Wu, Yahui; Deng, Su; Huang, Hongbin

    2014-07-01

    Information spreading in DTNs (Delay Tolerant Networks) adopts a store-carry-forward method, in which nodes receive messages directly from others. However, it is hard to judge whether information is safe in this communication mode, so a node may observe other nodes' behaviors. At present, there is no theoretical model describing how a node's trust level evolves. In addition, due to the uncertain connectivity of a DTN, a node can hardly obtain the global state of the network. Therefore, a rational model of a node's trust level should be a function of the node's own observations. For example, if a node finds k nodes carrying a message, it may trust the information with probability p(k). This paper does not explore the real distribution of p(k), but instead presents a unifying theoretical framework to evaluate the performance of information spreading in the above case. This framework is an extension of the traditional SI (susceptible-infected) model, and is applicable when p(k) follows any distribution. Simulations based on both synthetic and real motion traces show the accuracy of the framework. Finally, we explore the impact of nodes' behaviors under certain special distributions through numerical results.
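
    A minimal Monte Carlo rendering of this extended SI dynamics may clarify the role of p(k). The trust curve p(k) = 1 - 0.7^k, the contact model, and all parameters below are arbitrary illustrations, since the paper deliberately leaves the distribution of p(k) unspecified.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, steps = 200, 20000
    p = lambda k: 1 - 0.7 ** k    # illustrative trust curve, not from the paper

    infected = np.zeros(N, bool); infected[0] = True
    seen = np.zeros(N, int)       # carriers each node has observed so far
    counts = []
    for _ in range(steps):
        a, b = rng.choice(N, 2, replace=False)   # one opportunistic contact
        for u, v in ((a, b), (b, a)):
            if infected[u] and not infected[v]:
                seen[v] += 1                     # v observes one more carrier
                if rng.random() < p(seen[v]):    # accept with probability p(k)
                    infected[v] = True
        counts.append(infected.sum())
    print(counts[-1], "of", N, "nodes eventually trust and store the message")
    ```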

  8. Unified phonon-based approach to the thermodynamics of solid, liquid and gas states

    NASA Astrophysics Data System (ADS)

    Bolmatov, Dima; Zav'yalov, Dmitry; Zhernenkov, Mikhail; Musaev, Edvard T.; Cai, Yong Q.

    2015-12-01

    We introduce a unified approach to states of matter (solid, liquid and gas) and describe the thermodynamics of the pressure-temperature phase diagram in terms of phonon excitations. We derive the effective Hamiltonian with low-energy cutoff in two transverse phonon polarizations (phononic band gaps) by breaking the symmetry in phonon interactions. Further, we construct the statistical mechanics of states of aggregation employing the Debye approximation. The introduced formalism covers the Debye theory of solids, the phonon theory of liquids, and thermodynamic limits such as the Dulong-Petit thermodynamic limit (cV = 3kB), the ideal gas limit (cV =3/2 kB) and the new thermodynamic limit (cV = 2kB), dubbed here the Frenkel line thermodynamic limit. We discuss the phonon propagation and localization effects in liquids above and below the Frenkel line, and explain the "fast sound" phenomenon. As a test for our theory we calculate velocity-velocity autocorrelation and pair distribution functions within the Green-Kubo formalism. We show the consistency between dynamics of phonons and pair correlations in the framework of the unified approach. New directions towards advancements in phononic band gaps engineering, hypersound manipulation technologies and exploration of exotic behaviour of fluids relevant to geo- and planetary sciences are discussed. The presented results are equally important both for practical implications and for fundamental research.
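
    For reference, the three heat-capacity limits quoted above can be collected in one line (standard notation, per-particle units, with kB the Boltzmann constant):

    ```latex
    c_V^{\mathrm{Dulong\text{-}Petit}} = 3k_B, \qquad
    c_V^{\mathrm{Frenkel\ line}} = 2k_B, \qquad
    c_V^{\mathrm{ideal\ gas}} = \tfrac{3}{2}\,k_B .
    ```

    The intermediate Frenkel-line value follows from the low-energy cutoff in the two transverse phonon polarizations described above.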

  9. Tensor scale-based fuzzy connectedness image segmentation

    NASA Astrophysics Data System (ADS)

    Saha, Punam K.; Udupa, Jayaram K.

    2003-05-01

    Tangible solutions to image segmentation are vital in many medical imaging applications. Toward this goal, a framework based on fuzzy connectedness was developed in our laboratory. A fundamental notion called "affinity" - a local fuzzy hanging togetherness relation on voxels - determines the effectiveness of this segmentation framework in real applications. In this paper, we introduce the notion of "tensor scale" - a recently developed local morphometric parameter - in affinity definition and study its effectiveness. Although our previous notion of "local scale" using the spherical model successfully incorporated local structure size into affinity and resulted in measurable improvements in segmentation results, a major limitation of the previous approach was that it ignored local structural orientation and anisotropy. The current approach of using tensor scale in affinity computation allows an effective utilization of local size, orientation, and anisotropy in a unified manner. Tensor scale is used for computing both the homogeneity- and object-feature-based components of affinity. Preliminary results of the proposed method on several medical images and computer generated phantoms of realistic shapes are presented. Further extensions of this work are discussed.
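
    The fuzzy connectedness value between a seed and a voxel is the strength of the strongest path, where a path is only as strong as its weakest affinity link; it can be computed with a Dijkstra-like max-min propagation. The sketch below uses a plain intensity-homogeneity affinity in place of the paper's tensor-scale affinity, and the image and sigma are toy choices.

    ```python
    import heapq
    import numpy as np

    def fuzzy_connectedness(img, seed, sigma=0.1):
        """Max-min fuzzy connectedness from a seed on a 2D image.

        Affinity here is a simple homogeneity term exp(-(dI/sigma)^2); the
        paper instead builds it from tensor-scale homogeneity and
        object-feature components.
        """
        h, w = img.shape
        conn = np.zeros((h, w))
        conn[seed] = 1.0
        heap = [(-1.0, seed)]                 # max-heap via negated strengths
        while heap:
            neg_c, (i, j) = heapq.heappop(heap)
            c = -neg_c
            if c < conn[i, j]:
                continue                      # stale heap entry
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w:
                    aff = np.exp(-((img[i, j] - img[ni, nj]) / sigma) ** 2)
                    cand = min(c, aff)        # path strength = weakest link
                    if cand > conn[ni, nj]:
                        conn[ni, nj] = cand
                        heapq.heappush(heap, (-cand, (ni, nj)))
        return conn

    img = np.zeros((20, 20)); img[5:15, 5:15] = 1.0
    conn = fuzzy_connectedness(img, (10, 10))
    print((conn > 0.5).sum())  # voxels strongly connected to the seed
    ```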

  10. Neuromorphic log-domain silicon synapse circuits obey bernoulli dynamics: a unifying tutorial analysis

    PubMed Central

    Papadimitriou, Konstantinos I.; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M.

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology. PMID:25653579

  11. Neuromorphic log-domain silicon synapse circuits obey bernoulli dynamics: a unifying tutorial analysis.

    PubMed

    Papadimitriou, Konstantinos I; Liu, Shih-Chii; Indiveri, Giacomo; Drakakis, Emmanuel M

    2014-01-01

    The field of neuromorphic silicon synapse circuits is revisited and a parsimonious mathematical framework able to describe the dynamics of this class of log-domain circuits in the aggregate and in a systematic manner is proposed. Starting from the Bernoulli Cell Formalism (BCF), originally formulated for the modular synthesis and analysis of externally linear, time-invariant logarithmic filters, and by means of the identification of new types of Bernoulli Cell (BC) operators presented here, a generalized formalism (GBCF) is established. The expanded formalism covers two new possible and practical combinations of a MOS transistor (MOST) and a linear capacitor. The corresponding mathematical relations codifying each case are presented and discussed through the tutorial treatment of three well-known transistor-level examples of log-domain neuromorphic silicon synapses. The proposed mathematical tool unifies past analysis approaches of the same circuits under a common theoretical framework. The speed advantage of the proposed mathematical framework as an analysis tool is also demonstrated by a compelling comparative circuit analysis example of high order, where the GBCF and another well-known log-domain circuit analysis method are used for the determination of the input-output transfer function of the high (4th) order topology.

  12. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.

    PubMed

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-07-02

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and generate a unified dataset by a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach that can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rule-creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineers in creating unified datasets by 94.1%.

  13. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare

    PubMed Central

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-01-01

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and generate a unified dataset by a “data modeler” tool. The proposed tool implements a user-centric, priority-based approach that can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rule-creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineers in creating unified datasets by 94.1%. PMID:26147731

  14. Classical Markov Chains: A Unifying Framework for Understanding Avian Reproductive Success

    EPA Science Inventory

    Traditional methods for monitoring and analysis of avian nesting success have several important shortcomings, including 1) inability to handle multiple classes of nest failure, and 2) inability to provide estimates of annual reproductive success (because birds can, and typically ...

  15. Do changes in connectivity explain desertification?

    USDA-ARS?s Scientific Manuscript database

    Desertification, broad-scale land degradation in drylands, is a major environmental hazard facing inhabitants of the world’s deserts as well as an important component of global change. There is no unifying framework that simply and effectively explains different forms of desertification. Here we arg...

  16. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
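
    To make the OFC ingredient concrete, the sketch below computes an infinite-horizon LQR feedback gain by Riccati iteration, a minimal stand-in for the optimal feedback-control model of intention; the kinematic model, cost weights, and dimensions are illustrative, not the paper's, and the point process decoding side is omitted.

    ```python
    import numpy as np

    # Infinite-horizon LQR: the brain is modeled as driving the cursor state x
    # (position and velocity error here) toward the target with u = -L x.
    A = np.array([[1, 0.1], [0, 1.0]])   # discrete-time kinematics (dt = 0.1 s)
    B = np.array([[0.0], [0.1]])
    Q = np.diag([1.0, 0.1])              # penalize position/velocity error
    R = np.array([[0.01]])               # penalize control effort

    P = Q.copy()
    for _ in range(500):                 # fixed-point iteration of the Riccati equation
        L = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ L)
    print(L)  # steady-state feedback gain; OFC "intention" = -L @ state
    ```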

  17. Design and implementation of a unified certification management system based on seismic business

    NASA Astrophysics Data System (ADS)

    Tang, Hongliang

    2018-04-01

    Much seismic business software is web-based: users simply open a browser and enter an IP address. To achieve unified management and security management of the many resulting IP addresses, this paper introduces a design concept based on seismic business needs and builds a unified authentication management system using ASP technology.

  18. ATP3 Unified Field Study Data

    DOE Data Explorer

    Wolfrum, Ed (ORCID:0000000273618931); Knoshug, Eric (ORCID:000000025709914X); Laurens, Lieve (ORCID:0000000349303267); Harmon, Valerie; Dempster, Thomas (ORCID:000000029550488X); McGowan, John (ORCID:0000000266920518); Rosov, Theresa; Cardello, David; Arrowsmith, Sarah; Kempkes, Sarah; Bautista, Maria; Lundquist, Tryg; Crowe, Brandon; Murawsky, Garrett; Nicolai, Eric; Rowe, Egan; Knurek, Emily; Javar, Reyna; Saracco Alvarez, Marcela; Schlosser, Steve; Riddle, Mary; Withstandley, Chris; Chen, Yongsheng; Van Ginkel, Steven; Igou, Thomas; Xu, Chunyan; Hu, Zixuan

    2017-10-20

    The Algae Testbed Public-Private Partnership (ATP3) was established with the goal of investigating open pond algae cultivation across different geographic, climatic, seasonal, and operational conditions while setting the benchmark for quality data collection, analysis, and dissemination. Identical algae cultivation systems and data analysis methodologies were established at testbed sites across the continental United States and Hawaii. Within this framework, the Unified Field Studies (UFS) were designed to characterize the cultivation of different algal strains during all 4 seasons across this testbed network. The dataset presented here comprises the complete, curated climatic, cultivation, harvest, and biomass composition data for each season at each site. These data enable others to do in-depth cultivation, harvest, techno-economic, life cycle, resource, and predictive growth modeling analysis, as well as develop crop protection strategies for the nascent algae industry. NREL Sub award Number: DE-AC36-08-GO28308

  19. A unified universe

    NASA Astrophysics Data System (ADS)

    Codello, Alessandro; Jain, Rajeev Kumar

    2018-05-01

    We present a unified evolution of the universe from very early times until the present epoch by including both the leading local correction R^2 and the leading non-local term R(1/□^2)R in the classical gravitational action. We find that the inflationary phase driven by the R^2 term gracefully exits into a transitory regime characterized by coherent oscillations of the Hubble parameter. The universe then naturally enters a radiation dominated epoch followed by a matter dominated era. At sufficiently late times after radiation-matter equality, the non-local term starts to dominate, inducing the accelerated expansion of the universe at the present epoch. We further show that both the leading local and non-local terms can be obtained within the covariant effective field theory of gravity. This scenario thus provides a unified picture of inflation and dark energy in a single framework, by means of a purely gravitational action without the usual need of a scalar field.

  20. Multilayer network of language: A unified framework for structural analysis of linguistic subsystems

    NASA Astrophysics Data System (ADS)

    Martinčić-Ipšić, Sanda; Margan, Domagoj; Meštrović, Ana

    2016-09-01

    Recently, the focus of complex network research has shifted from the analysis of isolated properties of a system toward more realistic modeling of multiple phenomena - multilayer networks. Motivated by the success of the multilayer approach in social, transport, and trade systems, we introduce multilayer networks for language. The multilayer network of language is a unified framework for modeling linguistic subsystems and their structural properties, enabling the exploration of their mutual interactions. Various aspects of natural language systems can be represented as complex networks whose vertices depict linguistic units and whose links model their relations. The multilayer network of language is defined by three aspects: the network construction principle, the linguistic subsystem, and the language of interest. More precisely, we construct word-level (syntax and co-occurrence) and subword-level (syllable and grapheme) network layers from four variations of an original text in the modeled language. The analysis and comparison of layers at the word and subword levels are employed to determine the mechanism of structural influence between linguistic units and subsystems. The obtained results suggest that there are substantial differences between the network structures of different language subsystems, which remain hidden when an isolated layer is explored. The word-level layers share structural properties regardless of the language (e.g. Croatian or English), while the syllabic subword level expresses more language-dependent structural properties. The preserved weighted overlap quantifies the similarity of word-level layers in weighted and directed networks. Moreover, the analysis of motifs reveals a close topological structure of the syntactic and syllabic layers for both languages. The findings corroborate that the multilayer network framework is a powerful, consistent and systematic approach for modeling several linguistic subsystems simultaneously, and hence for providing a more unified view of language.

  1. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Sufficient network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found the following phenotype robustness criterion for synthetic gene networks: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through the corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances is also proposed, together with a simulation example. PMID:23515190

  2. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Trade-offs on Phenotype Robustness in Biological Networks. Part III: Synthetic Gene Networks in Synthetic Biology.

    PubMed

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    Robust stabilization and environmental disturbance attenuation are ubiquitous systematic properties that are observed in biological systems at many different levels. The underlying principles for robust stabilization and environmental disturbance attenuation are universal to both complex biological systems and sophisticated engineering systems. In many biological networks, network robustness should be large enough to confer: intrinsic robustness for tolerating intrinsic parameter fluctuations; genetic robustness for buffering genetic variations; and environmental robustness for resisting environmental disturbances. Sufficient network robustness is needed so that the phenotype stability of a biological network can be maintained, guaranteeing phenotype robustness. Synthetic biology is foreseen to have important applications in biotechnology and medicine; it is expected to contribute significantly to a better understanding of the functioning of complex biological systems. This paper presents a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance attenuation for synthetic gene networks in synthetic biology. Further, from the unifying mathematical framework, we found the following phenotype robustness criterion for synthetic gene networks: if intrinsic robustness + genetic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations, genetic variations, and environmental disturbances. Therefore, the trade-offs between intrinsic robustness, genetic robustness, environmental robustness, and network robustness in synthetic biology can also be investigated through the corresponding phenotype robustness criteria from the systematic point of view. Finally, a robust synthetic design that involves network evolution algorithms with desired behavior under intrinsic parameter fluctuations, genetic variations, and environmental disturbances is also proposed, together with a simulation example.

  3. Phase 1: Definition of intercity transportation comparison framework. Volume 1: Summary. [operations research of passenger and freight transportation systems

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A unified framework for comparing intercity passenger and freight transportation systems is presented. Composite measures for cost, service/demand, energy, and environmental impact were determined. A set of 14 basic measures was articulated to form the foundation for computing the composite measures. A parameter dependency diagram, constructed to explicitly interrelate the composite and basic measures, is discussed. Ground rules and methodology for developing the values of the basic measures are provided, and the use of the framework with existing cost and service data is illustrated for various freight systems.

  4. Representations, approximations, and limitations within a computational framework for cognitive science. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Perfors, Amy

    2014-09-01

    There is much to approve of in this provocative and interesting paper. I strongly agree with many of its points, especially that dichotomies like nature/nurture are actively detrimental to the field. I also appreciate the idea that cognitive scientists should take the "biological wetware" of the cell (rather than the network) more seriously.

  5. Chimaera simulation of complex states of flowing matter.

    PubMed

    Succi, S

    2016-11-13

    We discuss a unified mesoscale framework (chimaera) for the simulation of complex states of flowing matter across scales of motion. The chimaera framework can deal with each of the three macro-meso-micro levels through suitable 'mutations' of the basic mesoscale formulation. The idea is illustrated through selected simulations of complex micro- and nanoscale flows. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).

  6. Unifying screening processes within the PROSPR consortium: a conceptual model for breast, cervical, and colorectal cancer screening.

    PubMed

    Beaber, Elisabeth F; Kim, Jane J; Schapira, Marilyn M; Tosteson, Anna N A; Zauber, Ann G; Geiger, Ann M; Kamineni, Aruna; Weaver, Donald L; Tiro, Jasmin A

    2015-06-01

    General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (eg, primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights, across organ sites, similarities and differences in the steps, interfaces, and transitions of the screening process, and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute-funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. The HARNESS Workbench: Unified and Adaptive Access to Diverse HPC Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sunderam, Vaidy S.

    2012-03-20

    The primary goal of the Harness WorkBench (HWB) project is to investigate innovative software environments that will help enhance the overall productivity of applications science on diverse HPC platforms. Two complementary frameworks were designed: one, a virtualized command toolkit for application building, deployment, and execution, that provides a common view across diverse HPC systems, in particular the DOE leadership computing platforms (Cray, IBM, SGI, and clusters); and two, a unified runtime environment that consolidates access to runtime services via an adaptive framework for execution-time and post processing activities. A prototype of the first was developed based on the concept of a 'system-call virtual machine' (SCVM), to enhance portability of the HPC application deployment process across heterogeneous high-end machines. The SCVM approach to portable builds is based on the insertion of toolkit-interpretable directives into original application build scripts. Modifications resulting from these directives preserve the semantics of the original build instruction flow. The execution of the build script is controlled by our toolkit that intercepts build script commands in a manner transparent to the end-user. We have applied this approach to a scientific production code (Gamess-US) on the Cray-XT5 machine. The second facet, termed Unibus, aims to facilitate provisioning and aggregation of multifaceted resources from the resource providers' and end-users' perspectives. To achieve that, Unibus proposes a Capability Model and mediators (resource drivers) to virtualize access to diverse resources, and soft and successive conditioning to enable automatic and user-transparent resource provisioning. A proof of concept implementation has demonstrated the viability of this approach on high end machines, grid systems and computing clouds.

  8. The semiotics of medical image Segmentation.

    PubMed

    Baxter, John S H; Gibson, Eli; Eagleson, Roy; Peters, Terry M

    2018-02-01

    As the interaction between clinicians and computational processes increases in complexity, more nuanced mechanisms are required to describe how their communication is mediated. Medical image segmentation in particular affords a large number of distinct loci for interaction which can act on a deep, knowledge-driven level which complicates the naive interpretation of the computer as a symbol processing machine. Using the perspective of the computer as dialogue partner, we can motivate the semiotic understanding of medical image segmentation. Taking advantage of Peircean semiotic traditions and new philosophical inquiry into the structure and quality of metaphors, we can construct a unified framework for the interpretation of medical image segmentation as a sign exchange in which each sign acts as an interface metaphor. This allows for a notion of finite semiosis, described through a schematic medium, that can rigorously describe how clinicians and computers interpret the signs mediating their interaction. Altogether, this framework provides a unified approach to the understanding and development of medical image segmentation interfaces. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. A Unifying Mathematical Framework for Genetic Robustness, Environmental Robustness, Network Robustness and their Tradeoff on Phenotype Robustness in Biological Networks Part II: Ecological Networks

    PubMed Central

    Chen, Bor-Sen; Lin, Ying-Po

    2013-01-01

    In ecological networks, network robustness should be large enough to confer intrinsic robustness for tolerating intrinsic parameter fluctuations, as well as environmental robustness for resisting environmental disturbances, so that the phenotype stability of ecological networks can be maintained, thus guaranteeing phenotype robustness. However, it is difficult to analyze the network robustness of ecological systems because they are complex nonlinear stochastic partial differential systems. This paper develops a unifying mathematical framework for investigating the principles of both robust stabilization and environmental disturbance sensitivity in ecological networks. We found the following phenotype robustness criterion for ecological networks: if intrinsic robustness + environmental robustness ≦ network robustness, then the phenotype robustness can be maintained in spite of intrinsic parameter fluctuations and environmental disturbances. These results for robust ecological networks are similar to those for robust gene regulatory networks and evolutionary networks, even though they have different spatio-temporal scales. PMID:23515112

  10. Unified Computational Methods for Regression Analysis of Zero-Inflated and Bound-Inflated Data

    PubMed Central

    Yang, Yan; Simpson, Douglas

    2010-01-01

    Bounded data with excess observations at the boundary are common in many areas of application. Various individual cases of inflated mixture models have been studied in the literature for bound-inflated data, yet the computational methods have been developed separately for each type of model. In this article we use a common framework for computing these models, and expand the range of models for both discrete and semi-continuous data with point inflation at the lower boundary. The quasi-Newton and EM algorithms are adapted and compared for estimation of model parameters. The numerical Hessian and generalized Louis method are investigated as means for computing standard errors after optimization. Correlated data are included in this framework via generalized estimating equations. The estimation of parameters and effectiveness of standard errors are demonstrated through simulation and in the analysis of data from an ultrasound bioeffect study. The unified approach enables reliable computation for a wide class of inflated mixture models and comparison of competing models. PMID:20228950
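
    As a concrete instance of the model class discussed, the sketch below fits a zero-inflated Poisson (ZIP) model by EM. The mixture structure, a point mass at zero plus a count distribution, and the closed-form M-step are standard; the simulated data, starting values, and iteration count are ours.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Simulate ZIP data: structural zeros w.p. pi0, otherwise Poisson(lam0)
    pi0, lam0, n = 0.3, 2.5, 5000
    y = np.where(rng.random(n) < pi0, 0, rng.poisson(lam0, n))

    # ZIP model: P(0) = pi + (1 - pi) e^{-lam};  P(k > 0) = (1 - pi) Pois(k; lam)
    pi, lam = 0.5, y.mean() + 0.1
    for _ in range(200):
        # E-step: posterior probability that each observed zero is structural
        z = np.where(y == 0, pi / (pi + (1 - pi) * np.exp(-lam)), 0.0)
        # M-step: closed-form updates
        pi = z.mean()
        lam = ((1 - z) * y).sum() / (1 - z).sum()

    print(round(pi, 3), round(lam, 3))  # recovers roughly (0.3, 2.5)
    ```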

  11. Predicting excitonic gaps of semiconducting single-walled carbon nanotubes from a field theoretic analysis

    DOE PAGES

    Konik, Robert M.; Sfeir, Matthew Y.; Misewich, James A.

    2015-02-17

    We demonstrate that a non-perturbative framework for the treatment of the excitations of single-walled carbon nanotubes, based upon a field theoretic reduction, is able to accurately describe experimental observations of the absolute values of excitonic energies. This theoretical framework yields a simple scaling function from which the excitonic energies can be read off. This scaling function is primarily determined by a single parameter, the charge Luttinger parameter of the tube, which is in turn a function of the tube chirality, dielectric environment, and the tube's dimensions, thus expressing disparate influences on the excitonic energies in a unified fashion. As a result, we test this theory explicitly on the data reported in [NanoLetters 5, 2314 (2005)] and [Phys. Rev. B 82, 195424 (2010)] and so demonstrate that the method works over a wide range of reported excitonic spectra.

  12. Unraveling dynamics of human physical activity patterns in chronic pain conditions

    NASA Astrophysics Data System (ADS)

    Paraschiv-Ionescu, Anisoara; Buchser, Eric; Aminian, Kamiar

    2013-06-01

    Chronic pain is a complex, disabling experience that negatively affects cognitive, affective and physical functions as well as behavior. Although the interaction between chronic pain and physical functioning is a well-accepted paradigm in clinical research, understanding how pain affects individuals' daily-life behavior remains a challenging task. Here we develop a methodological framework for objectively documenting disruptive, pain-related interference with real-life physical activity. The results reveal that meaningful information is contained in the temporal dynamics of activity patterns, and that an analytical model based on the theory of bivariate point processes can be used to describe physical activity behavior. The model parameters capture the dynamic interdependence between periods and events and determine a 'signature' of the activity pattern. The study is likely to contribute to the clinical understanding of complex pain/disease-related behaviors and to establish a unified mathematical framework for quantifying the complex dynamics of various human activities.

  13. A unified framework of image latent feature learning on Sina microblog

    NASA Astrophysics Data System (ADS)

    Wei, Jinjin; Jin, Zhigang; Zhou, Yuan; Zhang, Rui

    2015-10-01

    Large-scale collections of user-contributed images with accompanying text are growing rapidly on social media websites such as Sina microblog. However, noise and incomplete correspondence between the images and the texts make precise image retrieval and ranking difficult. In this paper, a hypergraph-based learning framework is proposed for image ranking that simultaneously utilizes visual features, textual content and social link information to estimate the relevance between images. By representing each image as a vertex in the hypergraph, complex relationships between images can be captured exactly. By updating the hyperedge weights throughout the hypergraph learning process, the effect of different edges can be adaptively modulated in the constructed hypergraph. Furthermore, the popularity degree of each image is employed to re-rank the retrieval results. Comparative experiments on a large-scale Sina microblog dataset demonstrate the effectiveness of the proposed approach.
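
    A minimal sketch of hypergraph-based ranking in the style of Zhou et al.'s hypergraph learning may make the mechanics concrete. The toy incidence matrix, uniform weights, and query vector below are ours, and the paper's adaptive hyperedge re-weighting and popularity re-ranking are omitted.

    ```python
    import numpy as np

    # Toy incidence matrix: 5 images x 3 hyperedges (a hyperedge groups images
    # that share a visual word, a text tag, or a social link)
    H = np.array([[1, 0, 1],
                  [1, 0, 0],
                  [0, 1, 1],
                  [0, 1, 0],
                  [1, 1, 0]], dtype=float)
    w = np.ones(3)                          # hyperedge weights (uniform here)
    Dv_is = np.diag(1.0 / np.sqrt(H @ w))   # inverse sqrt of vertex degrees
    De_i = np.diag(1.0 / H.sum(axis=0))     # inverse hyperedge degrees
    A = Dv_is @ H @ np.diag(w) @ De_i @ H.T @ Dv_is  # normalized adjacency

    y = np.array([1.0, 0, 0, 0, 0])         # query: image 0 is the relevant seed
    f = y.copy()
    for _ in range(100):                    # manifold-ranking style propagation
        f = 0.9 * (A @ f) + 0.1 * y
    print(np.argsort(-f))                   # images ranked by estimated relevance
    ```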

  14. NanoParticle Ontology for Cancer Nanotechnology Research

    PubMed Central

    Thomas, Dennis G.; Pappu, Rohit V.; Baker, Nathan A.

    2010-01-01

    Data generated from cancer nanotechnology research are so diverse and large in volume that it is difficult to share and efficiently use them without informatics tools. In particular, ontologies that provide a unifying knowledge framework for annotating the data are required to facilitate the semantic integration, knowledge-based searching, unambiguous interpretation, mining and inferencing of the data using informatics methods. In this paper, we discuss the design and development of the NanoParticle Ontology (NPO), which is developed within the framework of the Basic Formal Ontology (BFO), and implemented in the Web Ontology Language (OWL) using well-defined ontology design principles. The NPO was developed to represent knowledge underlying the preparation, chemical composition, and characterization of nanomaterials involved in cancer research. Public releases of the NPO are available through the BioPortal website, maintained by the National Center for Biomedical Ontology. Mechanisms for editorial and governance processes are being developed for the maintenance, review, and growth of the NPO. PMID:20211274

  15. Unified Heat Kernel Regression for Diffusion, Kernel Smoothing and Wavelets on Manifolds and Its Application to Mandible Growth Modeling in CT Images

    PubMed Central

    Chung, Moo K.; Qiu, Anqi; Seo, Seongho; Vorperian, Houri K.

    2014-01-01

    We present a novel kernel regression framework for smoothing scalar surface data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel constructed from the eigenfunctions, we formulate a new bivariate kernel regression framework as a weighted eigenfunction expansion with the heat kernel as the weights. The new kernel regression is mathematically equivalent to isotropic heat diffusion, kernel smoothing and recently popular diffusion wavelets. Unlike many previous partial differential equation based approaches involving diffusion, our approach represents the solution of diffusion analytically, reducing numerical inaccuracy and slow convergence. The numerical implementation is validated on a unit sphere using spherical harmonics. As an illustration, we have applied the method in characterizing the localized growth pattern of mandible surfaces obtained in CT images from subjects between ages 0 and 20 years by regressing the length of displacement vectors with respect to the template surface. PMID:25791435
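
    In the standard notation the abstract implies (our symbols: Laplace-Beltrami eigenpairs with Δψj = -λj ψj and 0 = λ0 ≤ λ1 ≤ ...), the construction reads:

    ```latex
    K_t(p,q) \;=\; \sum_{j \ge 0} e^{-\lambda_j t}\,\psi_j(p)\,\psi_j(q),
    \qquad
    \widehat{f}(p) \;=\; \sum_{j \ge 0} e^{-\lambda_j t}\,\beta_j\,\psi_j(p),
    \quad \beta_j = \langle f, \psi_j \rangle ,
    ```

    so the bandwidth t plays the role of diffusion time: shrinking the coefficients by the weights e^{-λj t} is exactly running isotropic heat diffusion on the surface signal f for time t, which is why the expansion can replace numerical PDE solvers.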

  16. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    NASA Astrophysics Data System (ADS)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as the UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  17. Unifying a fragmented effort: a qualitative framework for improving international surgical teaching collaborations.

    PubMed

    Fallah, Parisa Nicole; Bernstein, Mark

    2017-09-07

    Access to adequate surgical care is limited globally, particularly in low- and middle-income countries (LMICs). To address this issue, surgeons are becoming increasingly involved in international surgical teaching collaborations (ISTCs), which include educational partnerships between surgical teams in high-income countries and those in LMICs. The purpose of this study is to determine a framework for unifying, systematizing, and improving the quality of ISTCs so that they can better address the global surgical need. A convenience sample of 68 surgeons, anesthesiologists, physicians, residents, nurses, academics, and administrators from the U.S., Canada, and Norway was used for the study. Participants all had some involvement in ISTCs and came from multiple specialties and institutions. Qualitative methodology was used, and participants were interviewed using a pre-determined set of open-ended questions. Data was gathered over two months either in-person, over the phone, or on Skype. Data was evaluated using thematic content analysis. To organize and systematize ISTCs, participants reported a need for a centralized/systematized process with designated leaders, a universal data bank of current efforts/progress, communication amongst involved parties, full-time administrative staff, dedicated funds, a scholarly approach, increased use of technology, and more research on needs and outcomes. By taking steps towards unifying and systematizing ISTCs, the quality of ISTCs can be improved. This could lead to an advancement in efforts to increase access to surgical care worldwide.

  18. Statistical model based iterative reconstruction in clinical CT systems. Part III. Task-based kV/mAs optimization for radiation dose reduction

    PubMed Central

    Li, Ke; Gomez-Cardona, Daniel; Hsieh, Jiang; Lubner, Meghan G.; Pickhardt, Perry J.; Chen, Guang-Hong

    2015-01-01

    Purpose: For a given imaging task and patient size, the optimal selection of x-ray tube potential (kV) and tube current-rotation time product (mAs) is pivotal in achieving the maximal radiation dose reduction while maintaining the needed diagnostic performance. Although contrast-to-noise (CNR)-based strategies can be used to optimize kV/mAs for computed tomography (CT) imaging systems employing the linear filtered backprojection (FBP) reconstruction method, a more general framework needs to be developed for systems using the nonlinear statistical model-based iterative reconstruction (MBIR) method. The purpose of this paper is to present such a unified framework for the optimization of kV/mAs selection for both FBP- and MBIR-based CT systems. Methods: The optimal selection of kV and mAs was formulated as a constrained optimization problem to minimize the objective function, Dose(kV,mAs), under the constraint that the achievable detectability index d′(kV,mAs) is not lower than the prescribed value of d′℞ for a given imaging task. Since it is difficult to analytically model the dependence of d′ on kV and mAs for the highly nonlinear MBIR method, this constrained optimization problem is solved with comprehensive measurements of Dose(kV,mAs) and d′(kV,mAs) at a variety of kV–mAs combinations, after which the overlay of the dose contours and d′ contours is used to graphically determine the optimal kV–mAs combination to achieve the lowest dose while maintaining the needed detectability for the given imaging task. As an example, d′ for a 17 mm hypoattenuating liver lesion detection task was experimentally measured with an anthropomorphic abdominal phantom at four tube potentials (80, 100, 120, and 140 kV) and fifteen mA levels (25 and 50–700) with a sampling interval of 50 mA at a fixed rotation time of 0.5 s, which corresponded to a dose (CTDIvol) range of [0.6, 70] mGy. Using the proposed method, the optimal kV and mA that minimized dose for the prescribed detectability level of d′℞ = 16 were determined. As another example, the optimal kV and mA for an 8 mm hyperattenuating liver lesion detection task were also measured using the developed framework. Both an in vivo animal and human subject study were used as demonstrations of how the developed framework can be applied to the clinical work flow. Results: For the first task, the optimal kV and mAs were measured to be 100 and 500, respectively, for FBP, which corresponded to a dose level of 24 mGy. In comparison, the optimal kV and mAs for MBIR were 80 and 150, respectively, which corresponded to a dose level of 4 mGy. The topographies of the iso-d′ map and the iso-CNR map were the same for FBP; thus, the use of d′- and CNR-based optimization methods generated the same results for FBP. However, the topographies of the iso-d′ and iso-CNR map were significantly different in MBIR; the CNR-based method overestimated the performance of MBIR, predicting an overly aggressive dose reduction factor. For the second task, the developed framework generated the following optimization results: for FBP, kV = 140, mA = 350, dose = 37.5 mGy; for MBIR, kV = 120, mA = 250, dose = 18.8 mGy. Again, the CNR-based method overestimated the performance of MBIR. Results of the preliminary in vivo studies were consistent with those of the phantom experiments. Conclusions: A unified and task-driven kV/mAs optimization framework has been developed in this work. The framework is applicable to both linear and nonlinear CT systems such as those using the MBIR method. As expected, the developed framework can be reduced to the conventional CNR-based kV/mAs optimization frameworks if the system is linear. For MBIR-based nonlinear CT systems, however, the developed task-based kV/mAs optimization framework is needed to achieve the maximal dose reduction while maintaining the desired diagnostic performance. PMID:26328971
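
    Operationally, the optimization reduces to a constrained search over the measured maps: minimize Dose(kV,mAs) subject to d′(kV,mAs) ≥ d′℞. The sketch below shows that selection step on synthetic placeholder maps; the paper's maps are measured experimentally, so the array values here carry no physical meaning.

    ```python
    import numpy as np

    # Hypothetical dose and detectability maps over a kV x mAs grid
    kvs = np.array([80, 100, 120, 140])
    mas = np.arange(50, 701, 50)
    dose = np.outer((kvs / 100.0) ** 2.5, mas / 100.0)   # dose rises with kV and mAs
    dprime = 16.0 * np.sqrt(dose / dose.mean())          # toy detectability surface

    d_rx = 16.0                                          # prescribed detectability
    feasible = dprime >= d_rx                            # constraint d' >= d'_Rx
    masked = np.where(feasible, dose, np.inf)            # exclude infeasible points
    i, j = np.unravel_index(np.argmin(masked), dose.shape)
    print(f"optimal setting: {kvs[i]} kV, {mas[j]} mAs, dose {dose[i, j]:.2f} (arb. units)")
    ```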

  19. An early illness recognition framework using a temporal Smith Waterman algorithm and NLP.

    PubMed

    Hajihashemi, Zahra; Popescu, Mihail

    2013-01-01

    In this paper we propose a framework for detecting health patterns based on non-wearable sensor sequence similarity and natural language processing (NLP). In TigerPlace, an aging-in-place facility in Columbia, MO, we deployed 47 sensor networks together with a nursing electronic health record (EHR) system to provide early illness recognition. The proposed framework utilizes sensor sequence similarity and NLP on EHR nursing comments to automatically notify the physician when health problems are detected. The reported methodology is inspired by genomic sequence annotation using similarity algorithms such as Smith-Waterman (SW). Similarly, for each sensor sequence, we associate health concepts extracted from the nursing notes using MetaMap, an NLP tool provided by the Unified Medical Language System (UMLS). Since sensor sequences, unlike genomic ones, have an associated time dimension, we propose a temporal variant of SW (TSW) to account for time. The main challenges presented by our framework are finding the most suitable temporal sequence similarity measure and aggregating the retrieved UMLS concepts. On a pilot dataset from three TigerPlace residents, with a total of 1685 sensor days and 626 nursing records, we obtained an average precision of 0.64 and a recall of 0.37.
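
    For orientation, the standard Smith-Waterman recurrence that TSW extends is sketched below on symbolic sensor-event strings; the scoring parameters and room codes are illustrative, and the temporal weighting that defines TSW is not reproduced here.

    ```python
    import numpy as np

    def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
        """Standard Smith-Waterman local alignment score (the paper's TSW
        adds a time dimension on top of this recurrence)."""
        H = np.zeros((len(a) + 1, len(b) + 1))
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                s = match if a[i - 1] == b[j - 1] else mismatch
                H[i, j] = max(0,
                              H[i - 1, j - 1] + s,   # match/mismatch
                              H[i - 1, j] + gap,     # gap in b
                              H[i, j - 1] + gap)     # gap in a
        return H.max()

    # Symbolic day-long sensor sequences (e.g., B=bedroom, K=kitchen, L=living room)
    day1, day2 = "BKLLKB", "BKLKB"
    print(smith_waterman(day1, day2))  # a high score flags similar activity patterns
    ```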

  20. Critical analysis of e-health readiness assessment frameworks: suitability for application in developing countries.

    PubMed

    Mauco, Kabelo Leonard; Scott, Richard E; Mars, Maurice

    2018-02-01

    Introduction: e-Health is an innovative way to make health services more effective and efficient, and its application is increasing worldwide. e-Health represents a substantial ICT investment, and its failure usually results in substantial losses in time, money (including opportunity costs) and effort. Therefore it is important to assess e-health readiness prior to implementation. Several frameworks have been published on e-health readiness assessment, developed under various circumstances and in various geographical regions of the world. However, their utility for the developing world is unknown. Methods: A literature review and analysis of published e-health readiness assessment frameworks or models was performed to determine if any are appropriate for broad assessment of e-health readiness in the developing world. A total of 13 papers described e-health readiness in different settings. Results and Discussion: Eight types of e-health readiness were identified, and no paper directly addressed all of these. The frameworks were based upon varying assumptions and perspectives. There was no unifying theory underpinning the frameworks. Few assessed government and societal readiness, and none cultural readiness; all are important in the developing world. While the shortcomings of existing frameworks have been highlighted, most contain aspects that are relevant and can be drawn on when developing a framework and assessment tools for the developing world. What emerged is the need to develop different assessment tools for the various stakeholder sectors. This is an area that needs further research before attempting to develop a more generic framework for the developing world.

  1. Information object definition-based unified modeling language representation of DICOM structured reporting: a case study of transcoding DICOM to XML.

    PubMed

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language (UML), can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
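    As a toy illustration of the transcoding idea, the sketch below serializes a small tree of SR-style content items to XML with the standard library. The element names and example content are hypothetical; the authors' model is derived from the DICOM SR information object definitions and W3C specifications rather than this ad hoc schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical SR-style content-item tree (names and codes are illustrative only).
    report = {
        "valueType": "CONTAINER", "concept": "Imaging Report",
        "children": [
            {"valueType": "TEXT", "concept": "Finding",
             "value": "Nodular opacity in the right upper lobe."},
            {"valueType": "CODE", "concept": "Conclusion", "value": "C0034079"},
        ],
    }

    def to_xml(item):
        """Recursively map a content item and its children to XML elements."""
        el = ET.Element("ContentItem", valueType=item["valueType"])
        ET.SubElement(el, "ConceptName").text = item["concept"]
        if "value" in item:
            ET.SubElement(el, "Value").text = item["value"]
        for child in item.get("children", []):
            el.append(to_xml(child))
        return el

    print(ET.tostring(to_xml(report), encoding="unicode"))
    ```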

  2. Heating and flooding: A unified approach for rapid generation of free energy surfaces

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Cuendet, Michel A.; Tuckerman, Mark E.

    2012-07-01

    We propose a general framework for the efficient sampling of conformational equilibria in complex systems and the generation of associated free energy hypersurfaces in terms of a set of collective variables. The method is a strategic synthesis of the adiabatic free energy dynamics approach, previously introduced by us and others, and existing schemes using Gaussian-based adaptive bias potentials to disfavor previously visited regions. In addition, we suggest sampling the thermodynamic force instead of the probability density to reconstruct the free energy hypersurface. All these elements are combined into a robust extended phase-space formalism that can be easily incorporated into existing molecular dynamics packages. The unified scheme is shown to outperform both metadynamics and adiabatic free energy dynamics in generating two-dimensional free energy surfaces for several example cases including the alanine dipeptide in the gas and aqueous phases and the met-enkephalin oligopeptide. In addition, the method can efficiently generate higher dimensional free energy landscapes, which we demonstrate by calculating a four-dimensional surface in the Ramachandran angles of the gas-phase alanine tripeptide.
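    As a rough illustration of the "flooding" ingredient alone, the sketch below runs overdamped Langevin dynamics on a one-dimensional double-well collective variable while periodically depositing repulsive Gaussians at visited positions; the negative of the accumulated bias then approximates the free energy profile. This is a generic metadynamics-style toy with invented parameters, not the paper's combined heating/flooding scheme with thermodynamic-force sampling.

    ```python
    import numpy as np

    def run_flooding(steps=5000, dt=0.01, height=0.1, width=0.3,
                     stride=100, kT=1.0, seed=0):
        """Langevin dynamics on V(x) = x**4 - 2*x**2 with Gaussian bias deposition."""
        rng = np.random.default_rng(seed)
        centers, x = [], -1.0

        def force(x):
            f = -(4 * x**3 - 4 * x)                       # -dV/dx of the double well
            for c in centers:                             # repulsive bias gradient
                f += height * (x - c) / width**2 * np.exp(-(x - c)**2 / (2 * width**2))
            return f

        for step in range(steps):
            x += force(x) * dt + np.sqrt(2 * kT * dt) * rng.normal()
            if step % stride == 0:
                centers.append(x)                         # "flood" the visited region

        grid = np.linspace(-2, 2, 9)
        bias = sum(height * np.exp(-(grid - c)**2 / (2 * width**2)) for c in centers)
        return grid, -bias                                # -bias estimates F up to a constant

    grid, fes = run_flooding()
    print(np.round(fes - fes.min(), 2))
    ```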

  3. Unifying time evolution and optimization with matrix product states

    NASA Astrophysics Data System (ADS)

    Haegeman, Jutho; Lubich, Christian; Oseledets, Ivan; Vandereycken, Bart; Verstraete, Frank

    2016-10-01

    We show that the time-dependent variational principle provides a unifying framework for time-evolution methods and optimization methods in the context of matrix product states. In particular, we introduce a new integration scheme for studying time evolution, which can cope with arbitrary Hamiltonians, including those with long-range interactions. Rather than a Suzuki-Trotter splitting of the Hamiltonian, which is the idea behind the adaptive time-dependent density matrix renormalization group method or time-evolving block decimation, our method is based on splitting the projector onto the matrix product state tangent space as it appears in the Dirac-Frenkel time-dependent variational principle. We discuss how the resulting algorithm resembles the density matrix renormalization group (DMRG) algorithm for finding ground states so closely that it can be implemented by changing just a few lines of code and it inherits the same stability and efficiency. In particular, our method is compatible with any Hamiltonian for which ground-state DMRG can be implemented efficiently. In fact, DMRG is obtained as a special case of our scheme for imaginary time evolution with infinite time step.
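    The abstract's central object can be stated compactly: the Dirac-Frenkel time-dependent variational principle projects the exact evolution onto the tangent space of the MPS manifold, and the integration scheme splits that projector.

    ```latex
    % TDVP flow for an MPS |\psi(A)\rangle with Hamiltonian \hat{H}:
    \begin{equation}
      \frac{d}{dt}\,\lvert\psi(A)\rangle
        \;=\; -\,i\,\hat{P}_{T_{\lvert\psi(A)\rangle}}\,\hat{H}\,\lvert\psi(A)\rangle ,
    \end{equation}
    % where \hat{P}_{T} is the orthogonal projector onto the MPS tangent space.
    % Substituting t -> -i*tau (imaginary time) turns the same projector-splitting
    % integrator into a ground-state search, with one-site DMRG recovered in the
    % limit of infinite time step.
    ```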

  4. NIA-AA Research Framework: Toward a biological definition of Alzheimer's disease.

    PubMed

    Jack, Clifford R; Bennett, David A; Blennow, Kaj; Carrillo, Maria C; Dunn, Billy; Haeberlein, Samantha Budd; Holtzman, David M; Jagust, William; Jessen, Frank; Karlawish, Jason; Liu, Enchi; Molinuevo, Jose Luis; Montine, Thomas; Phelps, Creighton; Rankin, Katherine P; Rowe, Christopher C; Scheltens, Philip; Siemers, Eric; Snyder, Heather M; Sperling, Reisa

    2018-04-01

    In 2011, the National Institute on Aging and Alzheimer's Association created separate diagnostic recommendations for the preclinical, mild cognitive impairment, and dementia stages of Alzheimer's disease. Scientific progress in the interim led to an initiative by the National Institute on Aging and Alzheimer's Association to update and unify the 2011 guidelines. This unifying update is labeled a "research framework" because its intended use is for observational and interventional research, not routine clinical care. In the National Institute on Aging and Alzheimer's Association Research Framework, Alzheimer's disease (AD) is defined by its underlying pathologic processes that can be documented by postmortem examination or in vivo by biomarkers. The diagnosis is not based on the clinical consequences of the disease (i.e., symptoms/signs) in this research framework, which shifts the definition of AD in living people from a syndromal to a biological construct. The research framework focuses on the diagnosis of AD with biomarkers in living persons. Biomarkers are grouped into those of β-amyloid deposition, pathologic tau, and neurodegeneration [AT(N)]. This AT(N) classification system groups different biomarkers (imaging and biofluids) by the pathologic process each measures. The AT(N) system is flexible in that new biomarkers can be added to the three existing AT(N) groups, and new biomarker groups beyond AT(N) can be added when they become available. We focus on AD as a continuum, and cognitive staging may be accomplished using continuous measures. However, we also outline two different categorical cognitive schemes for staging the severity of cognitive impairment: a scheme using three traditional syndromal categories and a six-stage numeric scheme. It is important to stress that this framework seeks to create a common language with which investigators can generate and test hypotheses about the interactions among different pathologic processes (denoted by biomarkers) and cognitive symptoms. We appreciate the concern that this biomarker-based research framework has the potential to be misused. Therefore, we emphasize, first, that it is premature and inappropriate to use this research framework in general medical practice. Second, this research framework should not be used to restrict alternative approaches to hypothesis testing that do not use biomarkers. There will be situations where biomarkers are not available or where requiring them would be counterproductive to the specific research goals (discussed in more detail later in the document). Thus, biomarker-based research should not be considered a template for all research into age-related cognitive impairment and dementia; rather, it should be applied when it is fit for the purpose of the specific research goals of a study. Importantly, this framework should be examined in diverse populations. Although it is possible that β-amyloid plaques and neurofibrillary tau deposits are not causal in AD pathogenesis, it is these abnormal protein deposits that define AD as a unique neurodegenerative disease among different disorders that can lead to dementia. We envision that defining AD as a biological construct will enable a more accurate characterization and understanding of the sequence of events that lead to cognitive impairment that is associated with AD, as well as the multifactorial etiology of dementia. This approach also will enable a more precise approach to interventional trials, where specific pathways can be targeted in the disease process and in the appropriate people.

  5. Optimal phase estimation with arbitrary a priori knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demkowicz-Dobrzanski, Rafal

    2011-06-15

    The optimal phase-estimation strategy is derived when partial a priori knowledge of the estimated phase is available. The solution is found with the help of the most famous result from entanglement theory: the positive partial transpose criterion. The structure of the optimal measurements, estimators, and the optimal probe states is analyzed. This Rapid Communication provides a unified framework bridging the gap in the literature on the subject, which until now dealt almost exclusively with two extreme cases: almost perfect knowledge (the local approach based on Fisher information) and no a priori knowledge (the global approach based on covariant measurements). Special attention is paid to a natural a priori probability distribution arising from a diffusion process.
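    For orientation, Bayesian phase estimation with a prior p(φ) is typically scored by averaging a cost over the prior and the conditional distribution of estimates; a common choice in the covariant-measurement literature (the paper's exact figure of merit may differ) is sketched below.

    ```latex
    % Average cost of an estimator \tilde{\phi} under prior p(\phi):
    \begin{equation}
      \langle C \rangle \;=\; \int d\phi\, p(\phi) \int d\tilde{\phi}\,
        p(\tilde{\phi}\mid\phi)\; 4\sin^{2}\!\left(\frac{\tilde{\phi}-\phi}{2}\right),
    \end{equation}
    % which reduces to a variance-like (Fisher-information) criterion for a narrow
    % prior and to the covariant-measurement setting for a flat prior.
    ```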

  6. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    PubMed

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM)-based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented within the FEM-based reconstruction algorithm using a graphics-processing-unit (GPU) parallel framework named the compute unified device architecture (CUDA). A series of simulation experiments is carried out to test the accuracy and the acceleration effect of the improved method. The results obtained indicate that the parallel computation does not change the accuracy of the reconstruction algorithm, while its computational cost is reduced by a factor of 38.9 on a GTX 580 graphics card.

  7. Kernel-imbedded Gaussian processes for disease classification using microarray gene expression data

    PubMed Central

    Zhao, Xin; Cheung, Leo Wang-Kit

    2007-01-01

    Background: Designing appropriate machine learning methods for identifying genes that have significant discriminating power for disease outcomes has become more and more important for our understanding of diseases at the genomic level. Although many machine learning methods have been developed and applied to the area of microarray gene expression data analysis, the majority of them are based on linear models, which, however, are not necessarily appropriate for the underlying connection between the target disease and its associated explanatory genes. Linear-model-based methods also tend to admit false-positive significant features more easily. Furthermore, linear-model-based algorithms often involve calculating the inverse of a matrix that is possibly singular when the number of potentially important genes is relatively large, which leads to numerical instability. To overcome these limitations, a few non-linear methods have recently been introduced to the area. Many of the existing non-linear methods have two critical problems, the model selection problem and the model parameter tuning problem, that remain unsolved or even untouched. In general, a unified framework that allows model parameters of both linear and non-linear models to be easily tuned is always preferred in real-world applications. Kernel-induced learning methods form a class of approaches that show promising potential for achieving this goal. Results: A hierarchical statistical model named kernel-imbedded Gaussian process (KIGP) is developed under a unified Bayesian framework for binary disease classification problems using microarray gene expression data. In particular, based on a probit regression setting, an adaptive algorithm with a cascading structure is designed to find the appropriate kernel, to discover the potentially significant genes, and to make the optimal class prediction accordingly. A Gibbs sampler is built as the core of the algorithm to make Bayesian inferences. Simulation studies showed that, even without any knowledge of the underlying generative model, the KIGP performed very close to the theoretical Bayesian bound not only in the case with a linear Bayesian classifier but also in the case with a very non-linear Bayesian classifier. This suggests its broader applicability to microarray data analysis problems, especially those for which linear methods perform poorly. The KIGP was also applied to four published microarray datasets, and the results showed that the KIGP performed better than, or at least as well as, the referenced state-of-the-art methods in all of these cases. Conclusion: Mathematically built on the kernel-induced feature space concept under a Bayesian framework, the KIGP method presented in this paper provides a unified machine learning approach to explore both the linear and the possibly non-linear underlying relationship between the target features of a given binary disease classification problem and the related explanatory gene expression data. More importantly, it incorporates model parameter tuning into the framework. The model selection problem is addressed in the form of selecting a proper kernel type. The KIGP method also gives Bayesian probabilistic predictions for disease classification. These properties and features are beneficial to most real-world applications. The algorithm is naturally robust in numerical computation. The simulation studies and the published data studies demonstrated that the proposed KIGP performs satisfactorily and consistently.
PMID:17328811
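    The authors' probit-Gibbs KIGP sampler is not packaged publicly; as a loose analogue, the sketch below fits Gaussian process classifiers with a linear and a nonlinear kernel to synthetic expression-like data, mirroring the kernel-selection step on a problem where only a few "genes" are informative. Note that sklearn's classifier uses a Laplace approximation with a logistic link rather than probit Gibbs sampling.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF, DotProduct

    # Synthetic "expression matrix": 60 samples x 200 genes, 5 informative genes.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 200))
    y = (X[:, :5].sum(axis=1) + 0.5 * rng.normal(size=60) > 0).astype(int)

    # Comparing a linear and a nonlinear kernel loosely mirrors KIGP's kernel selection.
    for name, kernel in [("linear", DotProduct()), ("RBF", 1.0 * RBF(length_scale=10.0))]:
        gpc = GaussianProcessClassifier(kernel=kernel, random_state=0).fit(X[:40], y[:40])
        print(name, "held-out accuracy:", gpc.score(X[40:], y[40:]))
    ```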

  8. Unified Research on Network-Based Hard/Soft Information Fusion

    DTIC Science & Technology

    2016-02-02

    Final Report: Unified Research on Network-Based Hard/Soft Information Fusion, prepared by the University at Buffalo (UB) Center for Multisource Information Fusion. There are a number of search-tree run parameters which must be set depending on the experimental setting; a pilot study was run to identify…

  9. Interprofessional Care and Collaborative Practice.

    ERIC Educational Resources Information Center

    Casto, R. Michael; And Others

    This book provides materials for those learning about the dynamics, techniques, and potential of interprofessional collaboration in health care and human services professions. Eight case studies thread their way through most chapters to unify and illustrate the text. Part 1 addresses the theoretical framework that forms the basis for…

  10. Mean Comparison: Manifest Variable versus Latent Variable

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Bentler, Peter M.

    2006-01-01

    An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…

  11. Unified Framework for Deriving Simultaneous Equation Algorithms for Water Distribution Networks

    EPA Science Inventory

    The known formulations for steady state hydraulics within looped water distribution networks are re-derived in terms of linear and non-linear transformations of the original set of partly linear and partly non-linear equations that express conservation of mass and energy. All of ...

  12. Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension

    ERIC Educational Resources Information Center

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-01-01

    We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…

  13. The extraction and integration framework: a two-process account of statistical learning.

    PubMed

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other.
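    The conditional statistic at the heart of the extraction process is concrete: the transitional probability TP(B|A) = freq(AB)/freq(A). A minimal sketch over a syllable stream (the two-syllable "words" below are invented for illustration):

    ```python
    from collections import Counter

    def transitional_probabilities(stream):
        """TP(B | A) = freq(AB) / freq(A) over consecutive items in the stream."""
        pairs = Counter(zip(stream, stream[1:]))
        firsts = Counter(stream[:-1])
        return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

    # Stream built from the invented words "bi-da" and "ku-pa".
    stream = ["bi", "da", "ku", "pa", "bi", "da", "bi", "da", "ku", "pa"]
    tps = transitional_probabilities(stream)
    print(tps[("bi", "da")])   # within-word transition: 1.0
    print(tps[("da", "ku")])   # across a word boundary: 2/3
    ```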

  14. Decomposing dendrophilia. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Honing, Henkjan; Zuidema, Willem

    2014-09-01

    The future of cognitive science will be about bridging neuroscience and behavioral studies, with essential roles played by comparative biology, formal modeling, and the theory of computation. Nowhere will this integration be more strongly needed than in understanding the biological basis of language and music. We thus strongly sympathize with the general framework that Fitch [1] proposes, and welcome the remarkably broad and readable review he presents to support it.

  15. RosettaRemodel: A Generalized Framework for Flexible Backbone Protein Design

    PubMed Central

    Huang, Po-Ssu; Ban, Yih-En Andrew; Richter, Florian; Andre, Ingemar; Vernon, Robert; Schief, William R.; Baker, David

    2011-01-01

    We describe RosettaRemodel, a generalized framework for flexible protein design that provides a versatile and convenient interface to the Rosetta modeling suite. RosettaRemodel employs a unified interface, called a blueprint, which allows detailed control over many aspects of flexible backbone protein design calculations. RosettaRemodel allows the construction and elaboration of customized protocols for a wide range of design problems ranging from loop insertion and deletion, disulfide engineering, domain assembly, loop remodeling, motif grafting, symmetrical units, to de novo structure modeling. PMID:21909381

  16. Pricing foreign equity option with stochastic volatility

    NASA Astrophysics Data System (ADS)

    Sun, Qi; Xu, Weidong

    2015-11-01

    In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of a foreign equity option, and a closed-form pricing formula is obtained through the characteristic function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on foreign equity option prices.
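    The characteristic-function methodology is model-agnostic: the pricing routine touches the model only through φ(u), the characteristic function of log S_T, so a time-changed Lévy specification enters simply by swapping in its φ. The sketch below prices a European call by Gil-Pelaez Fourier inversion and sanity-checks it against the Black-Scholes case (a pure Brownian Lévy exponent); it is a generic illustration, not the paper's closed-form formula for foreign equity options.

    ```python
    import numpy as np
    from scipy.integrate import quad

    def call_via_cf(cf, S0, K, r, T, umax=60.0):
        """European call via Gil-Pelaez inversion of cf(u) = E[exp(i*u*log S_T)]."""
        k = np.log(K)
        def integrand(u, share_measure):
            phi = cf(u - 1j) / cf(-1j) if share_measure else cf(u)
            return (np.exp(-1j * u * k) * phi / (1j * u)).real
        P1 = 0.5 + quad(integrand, 1e-8, umax, args=(True,), limit=200)[0] / np.pi
        P2 = 0.5 + quad(integrand, 1e-8, umax, args=(False,), limit=200)[0] / np.pi
        return S0 * P1 - K * np.exp(-r * T) * P2

    # Sanity check with the simplest characteristic function (Black-Scholes).
    S0, K, r, T, sigma = 100.0, 100.0, 0.05, 1.0, 0.2
    bs_cf = lambda u: np.exp(1j * u * (np.log(S0) + (r - 0.5 * sigma**2) * T)
                             - 0.5 * sigma**2 * u**2 * T)
    print(round(call_via_cf(bs_cf, S0, K, r, T), 2))   # ~10.45, the Black-Scholes value
    ```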

  17. Unified criteria for multipartite quantum nonlocality

    NASA Astrophysics Data System (ADS)

    Cavalcanti, E. G.; He, Q. Y.; Reid, M. D.; Wiseman, H. M.

    2011-09-01

    Wiseman and co-workers [H. M. Wiseman, S. J. Jones, and A. C. Doherty, Phys. Rev. Lett. 98, 140402 (2007)] proposed a distinction among the nonlocality classes of Bell's nonlocality, Einstein-Podolsky-Rosen (EPR) paradox or steering, and entanglement based on whether or not an overseer trusts each party in a bipartite scenario where they are asked to demonstrate entanglement. Here we extend that concept to the multipartite case and derive inequalities that progressively test for those classes of nonlocality, with different thresholds for each level. This framework includes the three classes of nonlocality above in special cases and introduces a family of others.

  18. A joint analysis of the Drake equation and the Fermi paradox

    NASA Astrophysics Data System (ADS)

    Prantzos, Nikos

    2013-07-01

    I propose a unified framework for a joint analysis of the Drake equation and the Fermi paradox, which enables a simultaneous, quantitative study of both of them. The analysis is based on a simplified form of the Drake equation and on a fairly simple scheme for the colonization of the Milky Way. It appears that for sufficiently long-lived civilizations, colonization of the Galaxy is the only reasonable option to gain knowledge about other life forms. This argument allows one to define a region in the parameter space of the Drake equation, where the Fermi paradox definitely holds ('Strong Fermi paradox').
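    The simplified Drake equation underlying such an analysis is a product of factors, N = R* · f_p · n_e · f_l · f_i · f_c · L. The one-liner below evaluates it with avowedly invented parameter values; the paper explores the parameter space rather than committing to particular numbers.

    ```python
    def drake(R_star=1.5, f_p=1.0, n_e=0.2, f_l=0.5, f_i=0.1, f_c=0.1, L=1e4):
        """N = R* . f_p . n_e . f_l . f_i . f_c . L (communicating civilizations)."""
        return R_star * f_p * n_e * f_l * f_i * f_c * L

    print(drake())   # 15.0 with these illustrative guesses
    ```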

  19. Teaching Tree-Thinking to Undergraduate Biology Students.

    PubMed

    Meisel, Richard P

    2010-07-27

    Evolution is the unifying principle of all biology, and understanding how evolutionary relationships are represented is critical for a complete understanding of evolution. Phylogenetic trees are the most conventional tool for displaying evolutionary relationships, and "tree-thinking" has been coined as a term to describe the ability to conceptualize evolutionary relationships. Students often lack tree-thinking skills, and developing those skills should be a priority of biology curricula. Many common student misconceptions have been described, and a successful instructor needs a suite of tools for correcting those misconceptions. I review the literature on teaching tree-thinking to undergraduate students and suggest how this material can be presented within an inquiry-based framework.

  20. Modelling biological behaviours with the unified modelling language: an immunological case study and critique.

    PubMed

    Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin

    2014-10-06

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.

  1. Modelling biological behaviours with the unified modelling language: an immunological case study and critique

    PubMed Central

    Read, Mark; Andrews, Paul S.; Timmis, Jon; Kumar, Vipin

    2014-01-01

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology. PMID:25142524

  2. A stochastically fully connected conditional random field framework for super resolution OCT

    NASA Astrophysics Data System (ADS)

    Boroomand, A.; Tan, B.; Wong, A.; Bizheva, K.

    2017-02-01

    A number of factors can degrade the resolution and contrast of OCT images, such as: (1) changes of the OCT point-spread function (PSF) resulting from wavelength-dependent scattering and absorption of light along the imaging depth, (2) speckle noise, and (3) motion artifacts. We propose a new Super Resolution OCT (SR OCT) imaging framework that takes advantage of a Stochastically Fully Connected Conditional Random Field (SF-CRF) model to generate a super-resolved OCT image of higher quality from a set of Low-Resolution OCT (LR OCT) images. The proposed SF-CRF SR OCT imaging is able to compensate simultaneously for all of the image-degrading factors mentioned above using a unified computational framework. The proposed SF-CRF SR OCT imaging framework was tested on a set of simulated LR human retinal OCT images generated from a high-resolution, high-contrast retinal image, and on a set of in vivo, high-resolution, high-contrast rat retinal OCT images. The reconstructed SR OCT images show considerably higher spatial resolution, less speckle noise and higher contrast compared to other tested methods. Visual assessment of the results demonstrated the usefulness of the proposed approach in better preserving fine details and structures of the imaged sample, retaining biological tissue boundaries while reducing speckle noise within a unified computational framework. Quantitative evaluation using both the Contrast-to-Noise Ratio (CNR) and the Edge Preservation (EP) parameter also showed superior performance of the proposed SF-CRF SR OCT approach compared to other image processing approaches.

  3. Qualitative insights into practice time management: does 'patient-centred time' in practice management offer a portal to improved access?

    PubMed

    Buetow, S; Adair, V; Coster, G; Hight, M; Gribben, B; Mitchell, E

    2002-12-01

    Different sets of literature suggest how aspects of practice time management can limit access to general practitioner (GP) care. Researchers have not organised this knowledge into a unified framework that can enhance understanding of barriers to, and opportunities for, improved access. To suggest a framework conceptualising how differences in professional and cultural understanding of practice time management in Auckland, New Zealand, influence access to GP care for children with chronic asthma. A qualitative study involving selective sampling, semi-structured interviews on barriers to access, and a general inductive approach. Twenty-nine key informants and ten mothers of children with chronic, moderate to severe asthma and poor access to GP care in Auckland. Development of a framework from themes describing barriers associated with, and needs for, practice time management. The themes were independently identified by two authors from transcribed interviews and confirmed through informant checking. Themes from key informant and patient interviews were triangulated with each other and with published literature. The framework distinguishes 'practice-centred time' from 'patient-centred time.' A predominance of 'practice-centred time' and an unmet opportunity for 'patient-centred time' are suggested by the persistence of five barriers to accessing GP care: limited hours of opening; traditional appointment systems; practice intolerance of missed appointments; long waiting times in the practice; and inadequate consultation lengths. None of the barriers is specific to asthmatic children. A unified framework was suggested for understanding how the organisation of practice work time can influence access to GP care by groups including asthmatic children.

  4. Exploiting Vector and Multicore Parallelism for Recursive, Data- and Task-Parallel Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Bin; Krishnamoorthy, Sriram; Agrawal, Kunal

    Modern hardware contains parallel execution resources that are well suited for data parallelism (vector units) and task parallelism (multicores). However, most work on parallel scheduling focuses on one type of hardware or the other. In this work, we present a scheduling framework that allows for a unified treatment of task and data parallelism. Our key insight is an abstraction, task blocks, that uniformly handles data-parallel iterations and task-parallel tasks, allowing them to be scheduled on vector units or executed independently on multiple cores. Our framework allows us to define schedulers that can dynamically select between executing task blocks on vector units or multicores. We show that these schedulers are asymptotically optimal and deliver the maximum amount of parallelism available in computation trees. To evaluate our schedulers, we develop program transformations that can convert mixed data- and task-parallel programs into task-block-based programs. Using a prototype instantiation of our scheduling framework, we show that, on an 8-core system, we can simultaneously exploit vector and multicore parallelism to achieve 14×–108× speedup over sequential baselines.

  5. Diffany: an ontology-driven framework to infer, visualise and analyse differential molecular networks.

    PubMed

    Van Landeghem, Sofie; Van Parys, Thomas; Dubois, Marieke; Inzé, Dirk; Van de Peer, Yves

    2016-01-05

    Differential networks have recently been introduced as a powerful way to study the dynamic rewiring capabilities of an interactome in response to changing environmental conditions or stimuli. Currently, such differential networks are generated and visualised using ad hoc methods, and are often limited to the analysis of only one condition-specific response or one interaction type at a time. In this work, we present a generic, ontology-driven framework to infer, visualise and analyse an arbitrary set of condition-specific responses against one reference network. To this end, we have implemented novel ontology-based algorithms that can process highly heterogeneous networks, accounting for both physical interactions and regulatory associations, symmetric and directed edges, edge weights and negation. We propose this integrative framework as a standardised methodology that allows a unified view on differential networks and promotes comparability between differential network studies. As an illustrative application, we demonstrate its usefulness on a plant abiotic stress study, and we experimentally confirmed a predicted regulator. Diffany is freely available as an open-source Java library and Cytoscape plugin from http://bioinformatics.psb.ugent.be/supplementary_data/solan/diffany/.
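    Stripped of the ontology machinery, the core differential-network operation is an edge-wise comparison of a condition-specific network against the reference. The sketch below shows that bare idea with hypothetical names; Diffany's actual algorithms additionally handle edge ontologies, directedness, weights and negation.

    ```python
    def differential_network(reference, condition, threshold=0.3):
        """Each network maps (source, target) -> weight; return edges whose
        weight changes by more than `threshold` between the two networks."""
        edges = set(reference) | set(condition)
        diff = {}
        for e in edges:
            delta = condition.get(e, 0.0) - reference.get(e, 0.0)
            if abs(delta) > threshold:
                diff[e] = delta      # >0: gained/strengthened, <0: lost/weakened
        return diff

    reference = {("A", "B"): 1.0, ("B", "C"): 0.8}
    stress = {("A", "B"): 0.2, ("B", "C"): 0.9, ("C", "D"): 1.0}
    print(differential_network(reference, stress))
    ```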

  6. Large ensemble and large-domain hydrologic modeling: Insights from SUMMA applications in the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ou, G.; Nijssen, B.; Nearing, G. S.; Newman, A. J.; Mizukami, N.; Clark, M. P.

    2016-12-01

    The Structure for Unifying Multiple Modeling Alternatives (SUMMA) provides a unifying modeling framework for process-based hydrologic modeling by defining a general set of conservation equations for mass and energy, with the capability to incorporate multiple choices for spatial discretizations and flux parameterizations. In this study, we provide a first demonstration of large-scale hydrologic simulations using SUMMA through an application to the Columbia River Basin (CRB) in the northwestern United States and Canada for a multi-decadal simulation period. The CRB is discretized into 11,723 hydrologic response units (HRUs) according to the United States Geological Survey Geospatial Fabric. The soil parameters are derived from the Natural Resources Conservation Service Soil Survey Geographic (SSURGO) Database. The land cover parameters are based on the National Land Cover Database from the year 2001 created by the Multi-Resolution Land Characteristics (MRLC) Consortium. The forcing data, including hourly air pressure, temperature, specific humidity, wind speed, precipitation, and shortwave and longwave radiation, are based on Phase 2 of the North American Land Data Assimilation System (NLDAS-2) and averaged for each HRU. The simulation results are compared to simulations with the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS). We are particularly interested in SUMMA's capability to mimic the behaviors of the other two models through the selection of appropriate model parameterizations in SUMMA.

  7. Intuitive and deliberate judgments are based on common principles.

    PubMed

    Kruglanski, Arie W; Gigerenzer, Gerd

    2011-01-01

    A popular distinction in cognitive and social psychology has been between intuitive and deliberate judgments. This juxtaposition has aligned in dual-process theories of reasoning associative, unconscious, effortless, heuristic, and suboptimal processes (assumed to foster intuitive judgments) versus rule-based, conscious, effortful, analytic, and rational processes (assumed to characterize deliberate judgments). In contrast, we provide convergent arguments and evidence for a unified theoretical approach to both intuitive and deliberative judgments. Both are rule-based, and in fact, the very same rules can underlie both intuitive and deliberate judgments. The important open question is that of rule selection, and we propose a 2-step process in which the task itself and the individual's memory constrain the set of applicable rules, whereas the individual's processing potential and the (perceived) ecological rationality of the rule for the task guide the final selection from that set. Deliberate judgments are not generally more accurate than intuitive judgments; in both cases, accuracy depends on the match between rule and environment: the rules' ecological rationality. Heuristics that are less effortful and in which parts of the information are ignored can be more accurate than cognitive strategies that have more information and computation. The proposed framework adumbrates a unified approach that specifies the critical dimensions on which judgmental situations may vary and the environmental conditions under which rules can be expected to be successful.

  8. Will the Meikirch Model, a New Framework for Health, Induce a Paradigm Shift in Healthcare?

    PubMed

    Bircher, Johannes; Hahn, Eckhart G

    2017-03-06

    Over the past decades, scientific medicine has realized tremendous advances. Yet, it is felt that the quality, costs, and equity of medicine and public health have not improved correspondingly and, both inside and outside the USA, may even have changed for the worse. An initiative for improving this situation is value-based healthcare, in which value is defined as health outcomes relative to the cost of achieving them. Value-based healthcare was advocated in order to stimulate competition among healthcare providers and thereby reduce costs. The approach may be well grounded economically, but in the care of patients, "value" has ethical and philosophical connotations. The restriction of value to an economic meaning ignores the importance of health and, thus, leads to misunderstandings. We postulate that a new understanding of the nature of health is necessary. We present the Meikirch model, a conceptual framework for health and disease that views health as a complex adaptive system. We describe this model and analyze some important consequences of its application to healthcare. The resources each person needs to meet the demands of life are both biological and personal, and both function together. While scientific advances in healthcare are hailed, these advances focus mainly on the biologically given potential (BGP) and tend to neglect the personally acquired potential (PAP) of an individual person. Personal growth to improve the PAP strongly contributes to meeting the demands of life. Therefore, in individual and public health care, personal growth deserves as much attention as the BGP. The conceptual framework of the Meikirch model supports a unified understanding of healthcare and serves to develop common goals, thereby rendering interprofessional and intersectoral cooperation more successful. The Meikirch model can be used as an effective tool to stimulate health literacy and improve health-supporting behavior. If individuals and groups of people involved in healthcare interact based on the model, mutual understanding of and adherence to treatments and preventive measures will improve. In healthcare, the Meikirch model also makes it plain that neither pay-for-performance nor value-based payment is an adequate response to improve person-centered healthcare. The Meikirch model is not only a unifying theoretical framework for health and disease but also a scaffold for the practice of medicine and public health. It is fully in line with the theory and practice of evidence-based medicine, person-centered healthcare, and integrative medicine. The model offers opportunities to self-motivate people to improve their health-supporting behavior, thereby making preventive approaches and overall healthcare more effective. We believe that the Meikirch model could induce a paradigm shift in healthcare. The healthcare community is hereby invited to acquaint themselves with this model and to consider its potential ramifications.

  9. Combined node and link partitions method for finding overlapping communities in complex networks

    PubMed Central

    Jin, Di; Gabrys, Bogdan; Dang, Jianwu

    2015-01-01

    Community detection in complex networks is a fundamental data analysis task in various domains, and how to effectively find overlapping communities in real applications is still a challenge. In this work, we propose a new unified model and method for finding the best overlapping communities on the basis of the associated node and link partitions derived from the same framework. Specifically, we first describe a unified model that accommodates node and link communities (partitions) together, and then present a nonnegative matrix factorization method to learn the parameters of the model. Thereafter, we infer the overlapping communities from the derived node and link communities, i.e., we determine each overlapping community between the corresponding node and link community by greedily optimizing a local community quality function, the conductance. Finally, we introduce a model selection method based on consensus clustering to determine the number of communities. We have evaluated our method on both synthetic and real-world networks with ground-truths, and compared it with seven state-of-the-art methods. The experimental results demonstrate the superior performance of our method over the competing ones in detecting overlapping communities for all analysed data sets. Improved performance is particularly pronounced in cases of more complicated networked community structures. PMID:25715829
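    As a loose illustration of the factorization step, the sketch below applies off-the-shelf nonnegative matrix factorization to a toy adjacency matrix and normalizes the factor rows into soft (overlapping) memberships; the paper's unified node/link model, conductance-based overlap refinement and consensus-clustering model selection are all omitted.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Two triangles bridged by the edge (2, 3).
    A = np.zeros((6, 6))
    for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
        A[i, j] = A[j, i] = 1.0

    W = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0).fit_transform(A)
    memberships = W / (W.sum(axis=1, keepdims=True) + 1e-12)   # soft memberships
    print(np.round(memberships, 2))   # bridge nodes 2 and 3 show mixed membership
    ```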

  10. The Development of Web-based Graphical User Interface for Unified Modeling Data with Multi (Correlated) Responses

    NASA Astrophysics Data System (ADS)

    Made Tirta, I.; Anggraeni, Dian

    2018-04-01

    Statistical models have been developed rapidly in various directions to accommodate various types of data. Data collected from longitudinal, repeated-measures, or clustered designs (whether continuous, binary, count, or ordinal) are likely to be correlated. Therefore statistical models for independent responses, such as the Generalized Linear Model (GLM) and the Generalized Additive Model (GAM), are not appropriate. Several models are available for correlated responses, including GEEs (Generalized Estimating Equations) for marginal models and various mixed-effect models such as GLMMs (Generalized Linear Mixed Models) and HGLMs (Hierarchical Generalized Linear Models) for subject-specific models. These models are available in the free open-source software R, but they can only be accessed through a command-line interface (using scripts). On the other hand, most practical researchers rely heavily on menu-based or Graphical User Interfaces (GUIs). We develop, using the Shiny framework, a standard pull-down-menu Web-GUI that unifies most models for correlated responses. The Web-GUI accommodates almost all needed features. It enables users to run and compare various models for repeated-measures data (GEE, GLMM, HGLM, and GEE for nominal responses) much more easily through online menus. This paper discusses the features of the Web-GUI and illustrates their use. In general, we find that GEE, GLMM, and HGLM gave very similar results.
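    For readers working outside R, an analogous marginal model can be fitted in Python with statsmodels; the sketch below fits a binomial GEE with an exchangeable working correlation to simulated repeated binary outcomes. It mirrors only one of the model families the Web-GUI exposes, and the GUI itself wraps the corresponding R implementations.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Simulated longitudinal data: 50 subjects, 4 repeated binary outcomes each.
    rng = np.random.default_rng(0)
    n, t = 50, 4
    df = pd.DataFrame({"subject": np.repeat(np.arange(n), t),
                       "time": np.tile(np.arange(t), n)})
    u = np.repeat(rng.normal(scale=0.8, size=n), t)       # within-subject correlation
    p = 1.0 / (1.0 + np.exp(-(-0.5 + 0.4 * df["time"] + u)))
    df["y"] = rng.binomial(1, p)

    gee = smf.gee("y ~ time", groups="subject", data=df,
                  family=sm.families.Binomial(),
                  cov_struct=sm.cov_struct.Exchangeable())
    print(gee.fit().summary().tables[1])
    ```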

  11. Sex Ratio Elasticity Influences the Selection of Sex Ratio Strategy.

    PubMed

    Wang, Yaqiang; Wang, Ruiwu; Li, Yaotang; Sam Ma, Zhanshan

    2016-12-23

    There are three sex ratio strategies (SRS) in nature: male-biased sex ratio, female-biased sex ratio, and equal sex ratio. It was R. A. Fisher who first explained why most species in nature display a sex ratio of ½. Consequent SRS theories such as Hamilton's local mate competition (LMC) and Clark's local resource competition (LRC) separately explained the observed deviations from the seemingly universal 1:1 ratio. However, to the best of our knowledge, there is not yet a unified theory that accounts for the mechanisms of the three SRS. Here, we introduce the price elasticity theory from economics to define sex ratio elasticity (SRE), and present an analytical model that derives the three SRSs based on the following assumption: simultaneously existing competitions for both resources A and resources B influence the level of SRE in the two sexes differently. Consequently, it is the difference (between the two sexes) in the level of their sex ratio elasticity that leads to the three different SRS. Our analytical results demonstrate that the elasticity-based model not only reveals a highly plausible mechanism that explains the evolution of SRS in nature, but also offers a novel framework for unifying two major classical theories (i.e., LMC & LRC) in the field of SRS research.

  12. Sex Ratio Elasticity Influences the Selection of Sex Ratio Strategy

    NASA Astrophysics Data System (ADS)

    Wang, Yaqiang; Wang, Ruiwu; Li, Yaotang; (Sam) Ma, Zhanshan

    2016-12-01

    There are three sex ratio strategies (SRS) in nature: male-biased sex ratio, female-biased sex ratio, and equal sex ratio. It was R. A. Fisher who first explained why most species in nature display a sex ratio of ½. Consequent SRS theories such as Hamilton’s local mate competition (LMC) and Clark’s local resource competition (LRC) separately explained the observed deviations from the seemingly universal 1:1 ratio. However, to the best of our knowledge, there is not yet a unified theory that accounts for the mechanisms of the three SRS. Here, we introduce the price elasticity theory from economics to define sex ratio elasticity (SRE), and present an analytical model that derives the three SRSs based on the following assumption: simultaneously existing competitions for both resources A and resources B influence the level of SRE in the two sexes differently. Consequently, it is the difference (between the two sexes) in the level of their sex ratio elasticity that leads to the three different SRS. Our analytical results demonstrate that the elasticity-based model not only reveals a highly plausible mechanism that explains the evolution of SRS in nature, but also offers a novel framework for unifying two major classical theories (i.e., LMC & LRC) in the field of SRS research.

  13. Rater cognition: review and integration of research findings.

    PubMed

    Gauthier, Geneviève; St-Onge, Christina; Tavares, Walter

    2016-05-01

    Given the complexity of competency frameworks, associated skills and abilities, and contexts in which they are to be assessed in competency-based education (CBE), there is an increased reliance on rater judgements when considering trainee performance. This increased dependence on rater-based assessment has led to the emergence of rater cognition as a field of research in health professions education. The topic, however, is often conceptualised and ultimately investigated using many different perspectives and theoretical frameworks. Critically analysing how researchers think about, study and discuss rater cognition or the judgement processes in assessment frameworks may provide meaningful and efficient directions for how the field continues to explore the topic. We conducted a critical and integrative review of the literature to explore common conceptualisations and unified terminology associated with rater cognition research. We identified 1045 articles on rater-based assessment in health professions education using Scopus, Medline and ERIC, and 78 articles were included in our review. We propose a three-phase framework of observation, processing and integration. We situate nine specific mechanisms and sub-mechanisms described across the literature within these phases: (i) generating automatic impressions about the person; (ii) formulating high-level inferences; (iii) focusing on different dimensions of competencies; (iv) categorising through well-developed schemata based on (a) a personal concept of competence, (b) comparison with various exemplars and (c) task and context specificity; (v) weighting and synthesising information differently; (vi) producing narrative judgements; and (vii) translating narrative judgements into scales. Our review has allowed us to identify common underlying conceptualisations of observed rater mechanisms and subsequently propose a comprehensive, although complex, framework for the dynamic and contextual nature of the rating process. This framework could help bridge the gap between researchers adopting different perspectives when studying rater cognition and enable the interpretation of contradictory findings of raters' performance by determining which mechanism is enabled or disabled in any given context.

  14. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications for unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models for interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively, via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  15. Teacher Preparation for Vocational Education and Training in Germany: A Potential Model for Canada?

    ERIC Educational Resources Information Center

    Barabasch, Antje; Watt-Malcolm, Bonnie

    2013-01-01

    Germany's vocational education and training (VET) and corresponding teacher-education programmes are known worldwide for their integrated framework. Government legislation unifies companies, unions and vocational schools, and specifies the education and training required for students as well as vocational teachers. Changing from the Diplom…

  16. The Unified Plant Growth Model (UPGM): software framework overview and model application

    USDA-ARS?s Scientific Manuscript database

    Since the Environmental Policy Integrated Climate (EPIC) model was developed in 1989, the EPIC plant growth component has been incorporated into other erosion and crop management models (e.g., WEPS, WEPP, SWAT, ALMANAC, and APEX) and modified to meet model developer research objectives. This has re...

  17. Software Hardware Asset Reuse Enterprise (SHARE) Repository Framework Final Report: Component Specification and Ontology

    DTIC Science & Technology

    2009-08-19

    Includes an acronym glossary (SSDS: Ship Self Defense System; TSTS: Total Ship Training System; UDDI: Universal Description, Discovery, and Integration; UML: Unified Modeling Language) and XML schema excerpts, e.g., a ContractorOrganization element of type ContractorOrganizationType whose annotation documentation reads "Identifies a contractor organization responsible for the…"

  18. An Extension of Multiple Correspondence Analysis for Identifying Heterogeneous Subgroups of Respondents

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Montreal, Hec; Dillon, William R.; Takane, Yoshio

    2006-01-01

    An extension of multiple correspondence analysis is proposed that takes into account cluster-level heterogeneity in respondents' preferences/choices. The method involves combining multiple correspondence analysis and k-means in a unified framework. The former is used for uncovering a low-dimensional space of multivariate categorical variables…

  19. Diversity from genes to ecosystems: A unifying framework to study variation across biological metrics and scales

    USDA-ARS?s Scientific Manuscript database

    Biological diversity is a key concept in the life sciences and plays a fundamental role in many ecological and evolutionary processes. Although biodiversity is inherently a hierarchical concept covering different levels of organization (genes, population, species, ecological communities and ecosyst...

  20. The Theory behind the Theory in DCT and SCDT: A Response to Rigazio-DiGilio.

    ERIC Educational Resources Information Center

    Terry, Linda L.

    1994-01-01

    Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Discusses hidden complexities in cognitive-developmental ecosystemic integration and…

  1. Potential of DCT/SCDT in Addressing Two Elusive Themes of Mental Health Counseling.

    ERIC Educational Resources Information Center

    Borders, L. DiAnne

    1994-01-01

    Responds to previous article by Rigazio-DiGilio on Developmental Counseling and Therapy and Systemic Cognitive-Developmental Therapy as two integrative models that unify individual, family, and network treatment within coconstructive-developmental framework. Considers extent to which model breaks impasse in integrating development into counseling…

  2. Converging Instructional Technology and Critical Intercultural Pedagogy in Teacher Education

    ERIC Educational Resources Information Center

    Pittman, Joyce

    2007-01-01

    Purpose: This paper aims to postulate an emerging unified cultural-convergence framework to converge the delivery of instructional technology and intercultural education (ICE) that extends beyond web-learning technologies to inculcate inclusive pedagogy in teacher education. Design/methodology/approach: The paper explores the literature and a…

  3. Spending on School Infrastructure: Does Money Matter?

    ERIC Educational Resources Information Center

    Crampton, Faith E.

    2009-01-01

    Purpose: The purpose of this study is to further develop an emerging thread of quantitative research that grounds investment in school infrastructure in a unified theoretical framework of investment in human, social, and physical capital. Design/methodology/approach: To answer the research question, what is the impact of investment in human,…

  4. Simultaneous Two-Way Clustering of Multiple Correspondence Analysis

    ERIC Educational Resources Information Center

    Hwang, Heungsun; Dillon, William R.

    2010-01-01

    A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…

  5. Darwinian Liberal Education

    ERIC Educational Resources Information Center

    Arnhart, Larry

    2006-01-01

    Be it metaphysics, theology, or some other unifying framework, humans have long sought to determine "first principles" underlying knowledge. Larry Arnhart continues in this vein, positing a Darwinian web of genetic, cultural, and cognitive evolution to explain our social behavior in terms of human nature as governed by biology. He leaves it to us…

  6. Unified, Insular, Firmly Policed, or Fractured, Porous, Contested, Gifted Education?

    ERIC Educational Resources Information Center

    Ambrose, Don; VanTassel-Baska, Joyce; Coleman, Laurence J.; Cross, Tracy L.

    2010-01-01

    Much like medieval, feudal nations, professional fields such as gifted education can take shape as centralized kingdoms with strong armies controlling their compliant populations and protecting closed borders, or as loose collections of conflict-prone principalities with borders open to invaders. Using an investigative framework borrowed from an…

  7. A framework for understanding waste management studies in construction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu Weisheng, E-mail: wilsonlu@hku.hk; Yuan Hongping, E-mail: hp.yuan@polyu.edu.hk

    2011-06-15

    During the past decades, construction and demolition (C and D) waste issues have received increasing attention from both practitioners and researchers around the world. A plethora of research relating to C and D waste management (WM) has been published in scholarly journals. However, a comprehensive understanding of the C and D WM research is somehow absent in spite of its proliferation. The aim of this paper is to develop a framework that helps readers understand the C and D WM research as archived in selected journals. Papers under the topic of C and D WM are retrieved based on a set of rigorous procedures. The information of these papers is then analyzed with the assistance of the Qualitative Social Research (QSR) software package NVivo. A framework for understanding C and D WM research is created based on the analytic results. By following the framework, a bibliometric analysis of research in C and D WM is presented, followed by an in-depth literature analysis. It is found that C and D generation, reduction, and recycling are the three major topics in the discipline of C and D WM. Future research is recommended to (a) investigate C and D waste issues in wider scopes including design, maintenance and demolition, (b) develop a unified measurement for waste generation so that WM performance can be compared across various economies, and (c) enhance effectiveness of WM approaches (e.g. waste charging scheme) based on new WM concepts (e.g. Extended Producer Responsibility). In addition to the above research findings, the approach for producing the research framework can be useful references for other studies which attempt to understand the research of a given discipline.

  8. A framework for understanding waste management studies in construction.

    PubMed

    Lu, Weisheng; Yuan, Hongping

    2011-06-01

    During the past decades, construction and demolition (C&D) waste issues have received increasing attention from both practitioners and researchers around the world. A plethora of research relating to C&D waste management (WM) has been published in scholarly journals. However, a comprehensive understanding of the C&D WM research is somehow absent in spite of its proliferation. The aim of this paper is to develop a framework that helps readers understand the C&D WM research as archived in selected journals. Papers under the topic of C&D WM are retrieved based on a set of rigorous procedures. The information of these papers is then analyzed with the assistance of the Qualitative Social Research (QSR) software package NVivo. A framework for understanding C&D WM research is created based on the analytic results. By following the framework, a bibliometric analysis of research in C&D WM is presented, followed by an in-depth literature analysis. It is found that C&D generation, reduction, and recycling are the three major topics in the discipline of C&D WM. Future research is recommended to (a) investigate C&D waste issues in wider scopes including design, maintenance and demolition, (b) develop a unified measurement for waste generation so that WM performance can be compared across various economies, and (c) enhance effectiveness of WM approaches (e.g. waste charging scheme) based on new WM concepts (e.g. Extended Producer Responsibility). In addition to the above research findings, the approach for producing the research framework can be useful references for other studies which attempt to understand the research of a given discipline. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Software for Data Analysis with Graphical Models

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.; Roy, H. Scott

    1994-01-01

    Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
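
    As a flavor of the exact Bayes factor computations such a framework automates, here is a minimal, self-contained sketch for a model small enough to solve in closed form (a Beta-Bernoulli model against a fixed-parameter null); the numbers are invented and the example is not from the paper:

        # Sketch: an exact Bayes factor for a coin-flip model, comparing
        # H1: theta ~ Beta(a, b) against the point null H0: theta = 0.5.
        from math import exp, lgamma, log

        def log_beta(a, b):
            return lgamma(a) + lgamma(b) - lgamma(a + b)

        def log_bf10(k, n, a=1.0, b=1.0):
            """log Bayes factor for k successes in n trials."""
            log_m1 = log_beta(a + k, b + n - k) - log_beta(a, b)  # marginal likelihood, H1
            log_m0 = n * log(0.5)                                 # likelihood, H0
            return log_m1 - log_m0

        print(exp(log_bf10(k=16, n=20)))   # about 10: substantial evidence for a biased coin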

  10. PERCH: A Unified Framework for Disease Gene Prioritization.

    PubMed

    Feng, Bing-Jian

    2017-03-01

    To interpret genetic variants discovered from next-generation sequencing, integration of heterogeneous information is vital for success. This article describes a framework named PERCH (Polymorphism Evaluation, Ranking, and Classification for a Heritable trait), available at http://BJFengLab.org/. It can prioritize disease genes by quantitatively unifying a new deleteriousness measure called BayesDel, an improved assessment of the biological relevance of genes to the disease, a modified linkage analysis, a novel rare-variant association test, and a converted variant call quality score. It supports data that contain various combinations of extended pedigrees, trios, and case-controls, and allows for a reduced penetrance, an elevated phenocopy rate, liability classes, and covariates. BayesDel is more accurate than PolyPhen2, SIFT, FATHMM, LRT, Mutation Taster, Mutation Assessor, PhyloP, GERP++, SiPhy, CADD, MetaLR, and MetaSVM. The overall approach is faster and more powerful than the existing quantitative method pVAAST, as shown by the simulations of challenging situations in finding the missing heritability of a complex disease. This framework can also classify variants of unknown significance by quantitatively integrating allele frequencies, deleteriousness, association, and co-segregation. PERCH is a versatile tool for gene prioritization in gene discovery research and variant classification in clinical genetic testing. © 2016 The Authors. Human Mutation published by Wiley Periodicals, Inc.

  11. NIA-AA Research Framework: Toward a biological definition of Alzheimer’s disease

    PubMed Central

    Jack, Clifford R.; Bennett, David A.; Blennow, Kaj; Carrillo, Maria C.; Dunn, Billy; Haeberlein, Samantha Budd; Holtzman, David M.; Jagust, William; Jessen, Frank; Karlawish, Jason; Liu, Enchi; Molinuevo, Jose Luis; Montine, Thomas; Phelps, Creighton; Rankin, Katherine P.; Rowe, Christopher C.; Scheltens, Philip; Siemers, Eric; Snyder, Heather M.; Sperling, Reisa

    2018-01-01

    In 2011, the National Institute on Aging and Alzheimer’s Association created separate diagnostic recommendations for the preclinical, mild cognitive impairment, and dementia stages of Alzheimer’s disease. Scientific progress in the interim led to an initiative by the National Institute on Aging and Alzheimer’s Association to update and unify the 2011 guidelines. This unifying update is labeled a “research framework” because its intended use is for observational and interventional research, not routine clinical care. In the National Institute on Aging and Alzheimer’s Association Research Framework, Alzheimer’s disease (AD) is defined by its underlying pathologic processes that can be documented by postmortem examination or in vivo by biomarkers. The diagnosis is not based on the clinical consequences of the disease (i.e., symptoms/signs) in this research framework, which shifts the definition of AD in living people from a syndromal to a biological construct. The research framework focuses on the diagnosis of AD with biomarkers in living persons. Biomarkers are grouped into those of β amyloid deposition, pathologic tau, and neurodegeneration [AT(N)]. This ATN classification system groups different biomarkers (imaging and biofluids) by the pathologic process each measures. The AT(N) system is flexible in that new biomarkers can be added to the three existing AT(N) groups, and new biomarker groups beyond AT(N) can be added when they become available. We focus on AD as a continuum, and cognitive staging may be accomplished using continuous measures. However, we also outline two different categorical cognitive schemes for staging the severity of cognitive impairment: a scheme using three traditional syndromal categories and a six-stage numeric scheme. It is important to stress that this framework seeks to create a common language with which investigators can generate and test hypotheses about the interactions among different pathologic processes (denoted by biomarkers) and cognitive symptoms. We appreciate the concern that this biomarker-based research framework has the potential to be misused. Therefore, we emphasize, first, it is premature and inappropriate to use this research framework in general medical practice. Second, this research framework should not be used to restrict alternative approaches to hypothesis testing that do not use biomarkers. There will be situations where biomarkers are not available or requiring them would be counterproductive to the specific research goals (discussed in more detail later in the document). Thus, biomarker-based research should not be considered a template for all research into age-related cognitive impairment and dementia; rather, it should be applied when it is fit for the purpose of the specific research goals of a study. Importantly, this framework should be examined in diverse populations. Although it is possible that β-amyloid plaques and neurofibrillary tau deposits are not causal in AD pathogenesis, it is these abnormal protein deposits that define AD as a unique neurodegenerative disease among different disorders that can lead to dementia. We envision that defining AD as a biological construct will enable a more accurate characterization and understanding of the sequence of events that lead to cognitive impairment that is associated with AD, as well as the multifactorial etiology of dementia. 
This approach also will enable a more precise approach to interventional trials where specific pathways can be targeted in the disease process and in the appropriate people. PMID:29653606

  12. A Framework for Mining Actionable Navigation Patterns from In-Store RFID Datasets via Indoor Mapping

    PubMed Central

    Shen, Bin; Zheng, Qiuhua; Li, Xingsen; Xu, Libo

    2015-01-01

    With the quick development of RFID technology and the decreasing prices of RFID devices, RFID is becoming widely used in various intelligent services. Especially in the retail application domain, RFID is increasingly adopted to capture the shopping tracks and behavior of in-store customers. To further enhance the potential of this promising application, in this paper, we propose a unified framework for RFID-based path analytics, which uses both in-store shopping paths and RFID-based purchasing data to mine actionable navigation patterns. Four modules of this framework are discussed, which are: (1) mapping from the physical space to the cyber space, (2) data preprocessing, (3) pattern mining and (4) knowledge understanding and utilization. In the data preprocessing module, the critical problem of how to capture the mainstream shopping path sequences while wiping out unnecessary redundant and repeated details is addressed in detail. To solve this problem, two types of redundant patterns, i.e., loop repeat pattern and palindrome-contained pattern are recognized and the corresponding processing algorithms are proposed. The experimental results show that the redundant pattern filtering functions are effective and scalable. Overall, this work builds a bridge between indoor positioning and advanced data mining technologies, and provides a feasible way to study customers’ shopping behaviors via multi-source RFID data. PMID:25751076
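
    A minimal sketch of the two cleanup steps in simplified form is shown below; the paper's actual algorithms for loop-repeat and palindrome-contained patterns are more elaborate, and the zone labels here are invented:

        # Sketch: two simplified cleanups in the spirit of the preprocessing
        # module: collapsing an immediately repeated loop (A B C B C D -> A B C D)
        # and dropping back-and-forth palindromic detours (A B A -> A).
        def collapse_loops(path):
            """Remove an immediately repeated subsequence of any length."""
            changed = True
            while changed:
                changed = False
                n = len(path)
                for size in range(n // 2, 0, -1):
                    for i in range(n - 2 * size + 1):
                        if path[i:i + size] == path[i + size:i + 2 * size]:
                            path = path[:i + size] + path[i + 2 * size:]
                            changed = True
                            break
                    if changed:
                        break
            return path

        def drop_backtracks(path):
            """Collapse X Y X detours to X."""
            out = []
            for zone in path:
                if len(out) >= 2 and out[-2] == zone:
                    out.pop()          # discard the detour zone
                else:
                    out.append(zone)
            return out

        path = list("ABCBCDADA")
        print(collapse_loops(path), drop_backtracks(path))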

  13. A Unified Fisher's Ratio Learning Method for Spatial Filter Optimization.

    PubMed

    Li, Xinyang; Guan, Cuntai; Zhang, Haihong; Ang, Kai Keng

    To detect the mental task of interest, spatial filtering has been widely used to enhance the spatial resolution of electroencephalography (EEG). However, the effectiveness of spatial filtering is undermined due to the significant nonstationarity of EEG. Based on regularization, most of the conventional stationary spatial filter design methods address the nonstationarity at the cost of the interclass discrimination. Moreover, spatial filter optimization is inconsistent with feature extraction when EEG covariance matrices could not be jointly diagonalized due to the regularization. In this paper, we propose a novel framework for a spatial filter design. With Fisher's ratio in feature space directly used as the objective function, the spatial filter optimization is unified with feature extraction. Given its ratio form, the selection of the regularization parameter could be avoided. We evaluate the proposed method on a binary motor imagery data set of 16 subjects, who performed the calibration and test sessions on different days. The experimental results show that the proposed method yields improvement in classification performance for both single broadband and filter bank settings compared with conventional nonunified methods. We also provide a systematic attempt to compare different objective functions in modeling data nonstationarity with simulation studies.
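
    To make the objective concrete, the sketch below scores candidate spatial filters by Fisher's ratio of the log-variance feature; the candidates come from a CSP-style generalized eigendecomposition rather than the paper's direct optimization, and the EEG data are simulated:

        # Sketch: Fisher's ratio of the log-variance feature as a scoring
        # function for spatial filters (illustration of the objective only).
        import numpy as np
        from scipy.linalg import eigh

        def fisher_ratio(w, trials1, trials2):
            f1 = [np.log(w @ np.cov(t) @ w) for t in trials1]
            f2 = [np.log(w @ np.cov(t) @ w) for t in trials2]
            return (np.mean(f1) - np.mean(f2)) ** 2 / (np.var(f1) + np.var(f2))

        rng = np.random.default_rng(1)
        # Hypothetical EEG: 30 trials per class, 8 channels, 200 samples.
        trials1 = [rng.standard_normal((8, 200)) * 1.0 for _ in range(30)]
        trials2 = [rng.standard_normal((8, 200)) * 1.5 for _ in range(30)]

        C1 = np.mean([np.cov(t) for t in trials1], axis=0)
        C2 = np.mean([np.cov(t) for t in trials2], axis=0)
        _, W = eigh(C1, C1 + C2)            # CSP-style candidate filters

        best = max(W.T, key=lambda w: fisher_ratio(w, trials1, trials2))
        print(fisher_ratio(best, trials1, trials2))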

  14. A movement ecology paradigm for unifying organismal movement research

    PubMed Central

    Nathan, Ran; Getz, Wayne M.; Revilla, Eloy; Holyoak, Marcel; Kadmon, Ronen; Saltz, David; Smouse, Peter E.

    2008-01-01

    Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. "Now we must consider in general the common reason for moving with any movement whatever." (Aristotle, De Motu Animalium, 4th century B.C.) PMID:19060196

  15. Automated speech understanding: the next generation

    NASA Astrophysics Data System (ADS)

    Picone, J.; Ebel, W. J.; Deshmukh, N.

    1995-04-01

    Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, DSP now centers on systems built from sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and providing completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.

  16. Unifying Gate Synthesis and Magic State Distillation.

    PubMed

    Campbell, Earl T; Howard, Mark

    2017-02-10

    The leading paradigm for performing a computation on quantum memories can be encapsulated as distill-then-synthesize. Initially, one performs several rounds of distillation to create high-fidelity magic states that provide one good T gate, an essential quantum logic gate. Subsequently, gate synthesis intersperses many T gates with Clifford gates to realize a desired circuit. We introduce a unified framework that implements one round of distillation and multiqubit gate synthesis in a single step. Typically, our method uses the same number of T gates as conventional synthesis but with the added benefit of quadratic error suppression. Because of this, one less round of magic state distillation needs to be performed, leading to significant resource savings.

  17. The free-energy principle: a unified brain theory?

    PubMed

    Friston, Karl

    2010-02-01

    A free-energy principle has been proposed recently that accounts for action, perception and learning. This Review looks at some key brain theories in the biological (for example, neural Darwinism) and physical (for example, information theory and optimal control theory) sciences from the free-energy perspective. Crucially, one key theme runs through each of these theories - optimization. Furthermore, if we look closely at what is optimized, the same quantity keeps emerging, namely value (expected reward, expected utility) or its complement, surprise (prediction error, expected cost). This is the quantity that is optimized under the free-energy principle, which suggests that several global brain theories might be unified within a free-energy framework.

  18. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    To capture GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes Customer-Experienced Rapid Prototyping (CE-RP), describing its process and framework in detail from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by two roles: customer and developer. The main purpose of the CE-RP is to produce unified, authorized requirements data models shared between customer and software developer.

  19. Unified reduction principle for the evolution of mutation, migration, and recombination

    PubMed Central

    Altenberg, Lee; Liberman, Uri; Feldman, Marcus W.

    2017-01-01

    Modifier-gene models for the evolution of genetic information transmission between generations of organisms exhibit the reduction principle: Selection favors reduction in the rate of variation production in populations near equilibrium under a balance of constant viability selection and variation production. Whereas this outcome has been proven for a variety of genetic models, it has not been proven in general for multiallelic genetic models of mutation, migration, and recombination modification with arbitrary linkage between the modifier and major genes under viability selection. We show that the reduction principle holds for all of these cases by developing a unifying mathematical framework that characterizes all of these evolutionary models. PMID:28265103

  20. A Unified Approach to Intra-Domain Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shue, Craig A; Kalafut, Andrew J.; Gupta, Prof. Minaxi

    2009-01-01

    While a variety of mechanisms have been developed for securing individual intra-domain protocols, none address the issue in a holistic manner. We develop a unified framework to secure prominent networking protocols within a single domain. We begin with a secure version of the DHCP protocol, which has the additional feature of providing each host with a certificate. We then leverage these certificates to secure ARP, prevent spoofing within the domain, and secure SSH and VPN connections between the domain and hosts which have previously interacted with it locally. In doing so, we also develop an incrementally deployable public key infrastructure which can later be leveraged to support inter-domain authentication.

  1. The Frontier Framework (and its eight Frontier Archetypes): A new conceptual approach to representing staff and patient well-being in health systems.

    PubMed

    Baines, Darrin L

    2018-05-04

    This paper proposes a new conceptual framework for jointly analysing the production of staff and patient welfare in health systems. Research to date has identified a direct link between staff and patient well-being. However, until now, no one has produced a unified framework for analysing them concurrently. In response, this paper introduces the "Frontier Framework". The new conceptual framework is applicable to all health systems regardless of their structure or financing. To demonstrate the benefits of its use, an empirical example of the Frontier Framework is constructed using data from the UK's National Health Service. This paper also introduces eight "Frontier Archetypes", which represent common patterns of welfare generation observable in health organisations involved in programmes of change. These archetypes may be used in planning, monitoring or creating narratives about organisational journeys. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.

  2. Feature-based component model for design of embedded systems

    NASA Astrophysics Data System (ADS)

    Zha, Xuan Fang; Sriram, Ram D.

    2004-11-01

    An embedded system is a hybrid of hardware and software, combining the flexibility of software with the real-time performance of hardware. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype through assembling virtual components. The framework not only provides a formal precise model of the embedded system prototype but also offers the possibility of designing variations of prototypes whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.

  3. Robustness Metrics: How Are They Calculated, When Should They Be Used and Why Do They Give Different Results?

    NASA Astrophysics Data System (ADS)

    McPhail, C.; Maier, H. R.; Kwakkel, J. H.; Giuliani, M.; Castelletti, A.; Westra, S.

    2018-02-01

    Robustness is being used increasingly for decision analysis in relation to deep uncertainty and many metrics have been proposed for its quantification. Recent studies have shown that the application of different robustness metrics can result in different rankings of decision alternatives, but there has been little discussion of what potential causes for this might be. To shed some light on this issue, we present a unifying framework for the calculation of robustness metrics, which assists with understanding how robustness metrics work, when they should be used, and why they sometimes disagree. The framework categorizes the suitability of metrics to a decision-maker based on (1) the decision-context (i.e., the suitability of using absolute performance or regret), (2) the decision-maker's preferred level of risk aversion, and (3) the decision-maker's preference toward maximizing performance, minimizing variance, or some higher-order moment. This article also introduces a conceptual framework describing when relative robustness values of decision alternatives obtained using different metrics are likely to agree and disagree. This is used as a measure of how "stable" the ranking of decision alternatives is when determined using different robustness metrics. The framework is tested on three case studies, including water supply augmentation in Adelaide, Australia, the operation of a multipurpose regulated lake in Italy, and flood protection for a hypothetical river based on a reach of the river Rhine in the Netherlands. The proposed conceptual framework is confirmed by the case study results, providing insight into the reasons for disagreements between rankings obtained using different robustness metrics.
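
    A minimal sketch of why metric choice matters: two standard robustness metrics computed from the same (invented) performance table produce different rankings of the alternatives:

        # Sketch: maximin vs minimax regret on one performance table.
        import numpy as np

        # Rows: decision alternatives; columns: plausible future scenarios.
        perf = np.array([[8.0, 7.5, 2.0],    # A: strong usually, bad in one future
                         [6.0, 6.0, 5.5],    # B: steady everywhere
                         [9.0, 3.0, 4.0]])   # C: strong in one future only

        maximin = perf.min(axis=1)                 # worst-case performance
        regret = perf.max(axis=0) - perf           # shortfall vs best per scenario
        minimax_regret = regret.max(axis=1)        # worst-case regret

        print("maximin ranking       :", np.argsort(-maximin))        # higher is better
        print("minimax-regret ranking:", np.argsort(minimax_regret))  # lower is better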

  4. Workload capacity spaces: a unified methodology for response time measures of efficiency as workload is varied.

    PubMed

    Townsend, James T; Eidels, Ami

    2011-08-01

    Increasing the number of available sources of information may impair or facilitate performance, depending on the capacity of the processing system. Tests performed on response time distributions are proving to be useful tools in determining the workload capacity (as well as other properties) of cognitive systems. In this article, we develop a framework and relevant mathematical formulae that represent different capacity assays (Miller's race model bound, Grice's bound, and Townsend's capacity coefficient) in the same space. The new space allows a direct comparison between the distinct bounds and the capacity coefficient values and helps explicate the relationships among the different measures. An analogous common space is proposed for the AND paradigm, relating the capacity index to the Colonius-Vorberg bounds. We illustrate the effectiveness of the unified spaces by presenting data from two simulated models (standard parallel, coactive) and a prototypical visual detection experiment. A conversion table for the unified spaces is provided.
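
    A minimal sketch of the OR-task capacity coefficient, estimated from simulated response times via the empirical integrated hazard H(t) = -log S(t); the experimental details are invented:

        # Sketch: empirical OR capacity coefficient
        # C(t) = H_AB(t) / (H_A(t) + H_B(t)), from response-time samples.
        import numpy as np

        rng = np.random.default_rng(2)
        rt_a = rng.exponential(0.40, 500) + 0.2   # single-source condition A
        rt_b = rng.exponential(0.45, 500) + 0.2   # single-source condition B
        rt_ab = np.minimum(rng.exponential(0.40, 500),
                           rng.exponential(0.45, 500)) + 0.2  # both sources (race)

        def integrated_hazard(rts, t):
            surv = np.mean(rts[None, :] > t[:, None], axis=1)  # empirical survival
            return -np.log(np.clip(surv, 1e-12, 1.0))

        t = np.linspace(0.3, 1.0, 8)
        C = integrated_hazard(rt_ab, t) / (integrated_hazard(rt_a, t) +
                                           integrated_hazard(rt_b, t))
        print(np.round(C, 2))   # values near 1: unlimited-capacity parallel processing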

  5. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes and the explicit cloud-radiation, and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance for the multi-scale modeling system will be presented.

  6. A unified data representation theory for network visualization, ordering and coarse-graining

    PubMed Central

    Kovács, István A.; Mizsei, Réka; Csermely, Péter

    2015-01-01

    Representation of large data sets became a key question of many scientific disciplines in the last decade. Several approaches for network visualization, data ordering and coarse-graining accomplished this goal. However, there was no underlying theoretical framework linking these problems. Here we show an elegant, information theoretic data representation approach as a unified solution of network visualization, data ordering and coarse-graining. The optimal representation is the hardest to distinguish from the original data matrix, measured by the relative entropy. The representation of network nodes as probability distributions provides an efficient visualization method and, in one dimension, an ordering of network nodes and edges. Coarse-grained representations of the input network enable both efficient data compression and hierarchical visualization to achieve high quality representations of larger data sets. Our unified data representation theory will help the analysis of extensive data sets, by revealing the large-scale structure of complex networks in a comprehensible form. PMID:26348923
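
    A minimal sketch of the selection criterion, under the simplifying assumption that both the data matrix and a coarse-grained representation can be normalized into probability distributions and compared by relative entropy; the block-averaging representation is a stand-in, not the paper's construction:

        # Sketch: score a coarse-grained representation by relative entropy,
        # echoing the criterion that the best representation is the hardest
        # to distinguish from the original data matrix.
        import numpy as np

        def rel_entropy(P, Q):
            P, Q = P / P.sum(), Q / Q.sum()
            mask = P > 0
            return np.sum(P[mask] * np.log(P[mask] / Q[mask]))

        rng = np.random.default_rng(3)
        A = rng.random((8, 8))

        # Coarse-grain into 4x4 blocks by averaging, then expand back.
        blocks = A.reshape(4, 2, 4, 2).mean(axis=(1, 3))
        Q = np.kron(blocks, np.ones((2, 2)))

        print(rel_entropy(A, Q))   # lower = representation closer to the data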

  7. Requirements Modeling with the Aspect-oriented User Requirements Notation (AoURN): A Case Study

    NASA Astrophysics Data System (ADS)

    Mussbacher, Gunter; Amyot, Daniel; Araújo, João; Moreira, Ana

    The User Requirements Notation (URN) is a recent ITU-T standard that supports requirements engineering activities. The Aspect-oriented URN (AoURN) adds aspect-oriented concepts to URN, creating a unified framework that allows for scenario-based, goal-oriented, and aspect-oriented modeling. AoURN is applied to the car crash crisis management system (CCCMS), modeling its functional and non-functional requirements (NFRs). AoURN generally models all use cases, NFRs, and stakeholders as individual concerns and provides general guidelines for concern identification. AoURN handles interactions between concerns, capturing their dependencies and conflicts as well as the resolutions. We present a qualitative comparison of aspect-oriented techniques for scenario-based and goal-oriented requirements engineering. An evaluation carried out based on the metrics adapted from literature and a task-based evaluation suggest that AoURN models are more scalable than URN models and exhibit better modularity, reusability, and maintainability.

  8. Color Sparse Representations for Image Processing: Review, Models, and Prospects.

    PubMed

    Barthélemy, Quentin; Larue, Anthony; Mars, Jérôme I

    2015-11-01

    Sparse representations have been extended to deal with color images composed of three channels. A review of dictionary-learning-based sparse representations for color images is made here, detailing the differences between the models, and comparing their results on the real and simulated data. These models are considered in a unifying framework that is based on the degrees of freedom of the linear filtering/transformation of the color channels. Moreover, this allows it to be shown that the scalar quaternionic linear model is equivalent to constrained matrix-based color filtering, which highlights the filtering implicitly applied through this model. Based on this reformulation, the new color filtering model is introduced, using unconstrained filters. In this model, spatial morphologies of color images are encoded by atoms, and colors are encoded by color filters. Color variability is no longer captured in increasing the dictionary size, but with color filters, this gives an efficient color representation.

  9. Model-based object classification using unification grammars and abstract representations

    NASA Astrophysics Data System (ADS)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.

  10. Topology Optimization using the Level Set and eXtended Finite Element Methods: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Villanueva Perez, Carlos Hernan

    Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. A comparison of the framework against density-based topology optimization approaches is studied with regards to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The design optimization problems converge to intuitive designs that resemble well the results of previous 2D and density-based studies.

  11. The jABC Approach to Rigorous Collaborative Development of SCM Applications

    NASA Astrophysics Data System (ADS)

    Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong

    Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.

  12. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR and D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using a commercial-off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  13. A unifying model of concurrent spatial and temporal modularity in muscle activity.

    PubMed

    Delis, Ioannis; Panzeri, Stefano; Pozzo, Thierry; Berret, Bastien

    2014-02-01

    Modularity in the central nervous system (CNS), i.e., the brain capability to generate a wide repertoire of movements by combining a small number of building blocks ("modules"), is thought to underlie the control of movement. Numerous studies reported evidence for such a modular organization by identifying invariant muscle activation patterns across various tasks. However, previous studies relied on decompositions differing in both the nature and dimensionality of the identified modules. Here, we derive a single framework that encompasses all influential models of muscle activation modularity. We introduce a new model (named space-by-time decomposition) that factorizes muscle activations into concurrent spatial and temporal modules. To infer these modules, we develop an algorithm, referred to as sample-based nonnegative matrix trifactorization (sNM3F). We test the space-by-time decomposition on a comprehensive electromyographic dataset recorded during execution of arm pointing movements and show that it provides a low-dimensional yet accurate, highly flexible and task-relevant representation of muscle patterns. The extracted modules have a well characterized functional meaning and implement an efficient trade-off between replication of the original muscle patterns and task discriminability. Furthermore, they are compatible with the modules extracted from existing models, such as synchronous synergies and temporal primitives, and generalize time-varying synergies. Our results indicate the effectiveness of a simultaneous but separate condensation of spatial and temporal dimensions of muscle patterns. The space-by-time decomposition accommodates a unified view of the hierarchical mapping from task parameters to coordinated muscle activations, which could be employed as a reference framework for studying compositional motor control.
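
    A minimal sketch of the space-by-time idea as a plain nonnegative matrix trifactorization X ≈ W H V (temporal modules W, spatial modules V, coefficients H) fit by multiplicative updates; this is a simplified single-trial stand-in for the paper's sample-based sNM3F algorithm, with invented data:

        # Sketch: nonnegative matrix TRI-factorization X ~ W @ H @ V by
        # multiplicative updates minimizing the Frobenius reconstruction error.
        import numpy as np

        rng = np.random.default_rng(4)
        X = rng.random((100, 12))          # hypothetical trial: 100 time bins x 12 muscles
        p, q, eps = 3, 2, 1e-9             # 3 temporal and 2 spatial modules

        W = rng.random((100, p))           # temporal modules (time x p)
        H = rng.random((p, q))             # coefficients linking the two spaces
        V = rng.random((q, 12))            # spatial modules (q x muscles)

        for _ in range(500):
            W *= (X @ V.T @ H.T) / (W @ H @ V @ V.T @ H.T + eps)
            H *= (W.T @ X @ V.T) / (W.T @ W @ H @ V @ V.T + eps)
            V *= (H.T @ W.T @ X) / (H.T @ W.T @ W @ H @ V + eps)

        print(np.linalg.norm(X - W @ H @ V) / np.linalg.norm(X))  # relative error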

  14. Phase Equilibrium, Chemical Equilibrium, and a Test of the Third Law: Experiments for Physical Chemistry.

    ERIC Educational Resources Information Center

    Dannhauser, Walter

    1980-01-01

    Described is an experiment designed to provide an experimental basis for a unifying point of view (utilizing theoretical framework and chemistry laboratory experiments) for physical chemistry students. Three experiments are described: phase equilibrium, chemical equilibrium, and a test of the third law of thermodynamics. (Author/DS)

  15. Persuasive Writing, A Curriculum Design: K-12.

    ERIC Educational Resources Information Center

    Bennett, Susan G., Ed.

    In the spirit of the Texas Hill Country Writing Project and in response to the requirements of the Texas Assessment of Basic Skills, this guide presents writing assignments reflecting a commitment to a unified writing program for kindergarten through grade twelve. The framework for the assignments is adopted from the discourse theory of James…

  16. Practical Issues in Estimating Classification Accuracy and Consistency with R Package cacIRT

    ERIC Educational Resources Information Center

    Lathrop, Quinn N.

    2015-01-01

    There are two main lines of research in estimating classification accuracy (CA) and classification consistency (CC) under Item Response Theory (IRT). The R package cacIRT provides computer implementations of both approaches in an accessible and unified framework. Even with available implementations, there remain decisions a researcher faces when…

  17. Optimization Techniques for Analysis of Biological and Social Networks

    DTIC Science & Technology

    2012-03-28

    analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational

  18. Conceptualizing the Suicide-Alcohol Relationship.

    ERIC Educational Resources Information Center

    Rogers, James R.

    Despite the strong empirical evidence linking alcohol use across varying levels to suicidal behavior, the field is lacking a unifying theoretical framework in this area. The concept of alcohol induced myopia to explain the varied effects of alcohol on the behaviors of individuals who drink has been proposed. The term "alcohol myopia" refers to its…

  19. Diffusion of Innovations Theory: A Unifying Framework for HIV Peer Education

    ERIC Educational Resources Information Center

    Ramseyer Winter, Virginia

    2013-01-01

    Peer education programs are a popular approach to preventing HIV infection among adolescents. While the programs show promise for effectively preventing HIV among the peers who are provided education, little evaluation research has been conducted to determine if the peer educators themselves experience knowledge, attitude, and behavior changes. A…

  20. Utilizing Emergency Departments as Learning Spaces through a Post-Occupancy Evaluation

    ERIC Educational Resources Information Center

    Guinther, Lindsey Lawry; Carll-White, Allison

    2014-01-01

    This case study describes the use of an emergency department as a learning space for interior design students. Kolb's (1984; 2005) framework identifies the characteristics of experiential learning and learning spaces, serving as the bridge to unify learning styles and the learning environment. A post-occupancy evaluation was conducted with…

  1. The Reciprocal Relationship between Motivation and Self-Regulation: A Longitudinal Study on Academic Performance

    ERIC Educational Resources Information Center

    Ning, Hoi Kwan; Downing, Kevin

    2010-01-01

    While previous studies have examined the single directional effects of motivation constructs in influencing students' use of self-regulatory strategies, few attempts have been made to unravel their interrelationship in a unified framework. In this study we adopt the social cognitive perspective and examine the reciprocal interplay between…

  2. Professionalization in Universities and European Convergence

    ERIC Educational Resources Information Center

    Vivas, Amparo Jimenez; Hevia, David Menendez Alvarez

    2009-01-01

    The constant assessment of the quality of higher education within the framework of European convergence is a challenge for all those universities that wish their degrees and diplomas to reflect a unified Europe. As is the case in any assessment, change and review process, the quest to improve quality implies measuring achievement of the objectives…

  3. Reaching and Remediating "Grey-Area" Middle School Students

    ERIC Educational Resources Information Center

    Jorgenson, Olaf; Smolkovich, Greg E.

    2004-01-01

    This article presents a framework for school administrators developed by Mesa Unified School district used in identifying and assisting the subtly struggling adolescents. Mesa's "safety net" approach targets middle grades students in the midst of their formative, pre-high school experience. Here, it is stated that the first step to identify a…

  4. Evolution of Students' Ideas about Natural Selection through a Constructivist Framework

    ERIC Educational Resources Information Center

    Baumgartner, Erin; Duncan, Kanesa

    2009-01-01

    Educating students about the process of evolution through natural selection is vitally important because not only is it the unifying theory of biological science, it is also widely regarded as difficult for students to fully comprehend. Anderson and colleagues (2002) describe alternative ideas and misconceptions about natural selection as highly…

  5. A Narrow Path through the Broad Middle: Mapping Institutional Considerations for ESEA Reauthorization

    ERIC Educational Resources Information Center

    DeBray, Elizabeth; Houck, Eric A.

    2011-01-01

    This article uses an institutional framework to analyze the political context of the next reauthorization of the Elementary and Secondary Education Act. The authors analyze three relevant factors in the institutional environment: the role of traditional party politics, including theories of divided versus unified party government; the entrance of…

  6. Understanding Early Childhood Student Teachers' Acceptance and Use of Interactive Whiteboard

    ERIC Educational Resources Information Center

    Wong, Kung-Teck; Russo, Sharon; McDowall, Janet

    2013-01-01

    Purpose: The purpose of this paper is to understand early childhood student teachers' self-reported acceptance and use of interactive whiteboard (IWB), by employing the Unified Theory of Acceptance and Use of Technology (UTAUT) as the research framework. Design/methodology/approach: A total of 112 student teachers enrolled in science-related…

  7. Factors Influencing Students' Adoption of E-Learning: A Structural Equation Modeling Approach

    ERIC Educational Resources Information Center

    Tarhini, Ali; Masa'deh, Ra'ed; Al-Busaidi, Kamla Ali; Mohammed, Ashraf Bany; Maqableh, Mahmoud

    2017-01-01

    Purpose: This research aims to examine the factors that may hinder or enable the adoption of e-learning systems by university students. Design/methodology/approach: A conceptual framework was developed through extending the unified theory of acceptance and use of technology (performance expectancy, effort expectancy, hedonic motivation, habit,…

  8. The Long Way towards Abandoning ECEC Dichotomy in Greece

    ERIC Educational Resources Information Center

    Rentzou, Konstantina

    2018-01-01

    Although Greece has a dichotomous system both in terms of Early Childhood Education and Care (ECEC) services and in terms of ECEC workers' preparation programmes, in 2016 Greek government's Organization for ECEC organized an open colloquy about the adoption of a 'Unified National Framework for Early Childhood Education and Care', causing a heated…

  9. "A Unified Poet Alliance": The Personal and Social Outcomes of Youth Spoken Word Poetry Programming

    ERIC Educational Resources Information Center

    Weinstein, Susan

    2010-01-01

    This article places youth spoken word (YSW) poetry programming within the larger framework of arts education. Drawing primarily on transcripts of interviews with teen poets and adult teaching artists and program administrators, the article identifies specific benefits that participants ascribe to youth spoken word, including the development of…

  10. The Four Elementary Forms of Sociality: Framework for a Unified Theory of Social Relations.

    ERIC Educational Resources Information Center

    Fiske, Alan Page

    1992-01-01

    A theory is presented that postulates that people in all cultures use four relational models to generate most kinds of social interaction, evaluation, and affect. Ethnographic and field studies (n=19) have supported cultural variations on communal sharing; authority ranking; equality matching; and market pricing. (SLD)

  11. A Unifying Framework for Teaching Nonparametric Statistical Tests

    ERIC Educational Resources Information Center

    Bargagliotti, Anna E.; Orrison, Michael E.

    2014-01-01

    Increased importance is being placed on statistics at both the K-12 and undergraduate level. Research divulging effective methods to teach specific statistical concepts is still widely sought after. In this paper, we focus on best practices for teaching topics in nonparametric statistics at the undergraduate level. To motivate the work, we…

  12. The road against fatalities: infrastructure spending vs. regulation?

    PubMed

    Albalate, Daniel; Fernández, Laura; Yarygina, Anastasiya

    2013-10-01

    The road safety literature is typified by a high degree of compartmentalization between studies that focus on infrastructure and traffic conditions and those devoted to the evaluation of public policies and regulations. As a result, few studies adopt a unified empirical framework in their attempts at evaluating the road safety performance of public interventions, thus limiting our understanding of successful strategies in this regard. This paper considers both types of determinants in an analysis of a European country that has enjoyed considerable success in reducing road fatalities. After constructing a panel data set with road safety outcomes for all Spanish provinces between 1990 and 2009, we evaluate the role of the technical characteristics of infrastructure and recent infrastructure spending together with the main regulatory changes introduced. Our results show the importance of considering both types of determinants in a unified framework. Moreover, we highlight the importance of maintenance spending given its effectiveness in reducing fatalities and casualties in the current economic context of austerity that is having such a marked impact on investment efforts in Spain. Copyright © 2013 Elsevier Ltd. All rights reserved.
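
    A minimal sketch of the unified specification the paper advocates, with spending and regulation evaluated jointly in one fixed-effects panel regression; the data are simulated rather than the Spanish panel, and the variable names are invented:

        # Sketch: province and year fixed effects, infrastructure spending and
        # a staggered regulation dummy in a single equation (simulated data).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(5)
        rows = []
        for prov in range(50):                      # hypothetical provinces
            adopt = 2000 + prov % 8                 # staggered reform adoption year
            for year in range(1990, 2010):
                spend = rng.uniform(1, 10)          # maintenance spending (made up)
                law = int(year >= adopt)            # regulation in force?
                fatal = 20 - 0.8 * spend - 3.0 * law + rng.normal(0, 1.5)
                rows.append((prov, year, spend, law, fatal))
        df = pd.DataFrame(rows, columns=["province", "year", "spend", "law", "fatalities"])

        fit = smf.ols("fatalities ~ spend + law + C(province) + C(year)", data=df).fit()
        print(fit.params[["spend", "law"]])         # both effects in one framework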

  13. A unified framework of unsupervised subjective optimized bit allocation for multiple video object coding

    NASA Astrophysics Data System (ADS)

    Chen, Zhenzhong; Han, Junwei; Ngan, King Ngi

    2005-10-01

    MPEG-4 treats a scene as a composition of several objects or so-called video object planes (VOPs) that are separately encoded and decoded. Such a flexible video coding framework makes it possible to code different video objects with different distortion scales. It is necessary to analyze the priority of the video objects according to their semantic importance, intrinsic properties and psycho-visual characteristics such that the bit budget can be distributed properly to video objects to improve the perceptual quality of the compressed video. This paper aims to provide an automatic video object priority definition method based on an object-level visual attention model and further propose an optimization framework for video object bit allocation. One significant contribution of this work is that the human visual system characteristics are incorporated into the video coding optimization process. Another advantage is that the priority of the video object can be obtained automatically instead of fixing weighting factors before encoding or relying on the user interactivity. To evaluate the performance of the proposed approach, we compare it with traditional verification model bit allocation and the optimal multiple video object bit allocation algorithms. Compared with traditional bit allocation algorithms, the objective quality of the object with higher priority is significantly improved under this framework. These results demonstrate the usefulness of this unsupervised subjective quality lifting framework.
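
    A minimal sketch of priority-driven bit allocation: a greedy allocator that repeatedly gives the next slice of the bit budget to the object with the largest attention-weighted distortion reduction under a toy exponential rate-distortion model; the priorities and parameters are invented:

        # Sketch: greedy marginal-utility bit allocation across video objects
        # with a toy R-D model D_i(b) = a_i * 2**(-2 * b).
        priorities = {"face": 0.6, "body": 0.3, "background": 0.1}  # attention weights
        a = {"face": 100.0, "body": 80.0, "background": 60.0}       # R-D scale per object

        def distortion(obj, bits):
            return a[obj] * 2.0 ** (-2.0 * bits)

        budget, step = 12.0, 0.1
        bits = {obj: 0.0 for obj in priorities}
        for _ in range(int(budget / step)):
            # weighted distortion reduction per extra step of rate
            gain = {obj: priorities[obj] * (distortion(obj, bits[obj]) -
                                            distortion(obj, bits[obj] + step))
                    for obj in priorities}
            best = max(gain, key=gain.get)
            bits[best] += step

        print({obj: round(b, 1) for obj, b in bits.items()})  # bits per object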

  14. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hero, Alfred O.; Rajaratnam, Bala

    When can reliable inference be drawn in the ‘‘Big Data’’ context? This article presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large-scale inference. In large-scale data applications like genomics, connectomics, and eco-informatics, the data set is often variable rich but sample starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for ‘‘Big Data.’’ Sample complexity, however, has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; and 3) the purely high-dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exa-scale data dimension. We illustrate this high-dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that are of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high-dimensional learning rates and sample complexity for different structured covariance models and different inference tasks.
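
    A minimal numerical illustration of the sample-starved regime discussed above (illustrative parameters, not from the article): with n fixed and p growing, even mutually independent variables produce spuriously large sample correlations, which is why discovery thresholds must account for p.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20                                    # fixed number of samples
    for p in (50, 500, 2000):                 # growing number of variables
        X = rng.normal(size=(n, p))           # independent: true correlation is 0
        C = np.corrcoef(X, rowvar=False)      # p x p sample correlation matrix
        np.fill_diagonal(C, 0.0)              # ignore the trivial diagonal
        print(p, np.abs(C).max())             # largest spurious correlation grows with p
    ```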

  15. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining

    PubMed Central

    Hero, Alfred O.; Rajaratnam, Bala

    2015-01-01

    When can reliable inference be drawn in the “Big Data” context? This paper presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large-scale inference. In large-scale data applications like genomics, connectomics, and eco-informatics, the dataset is often variable-rich but sample-starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for “Big Data”. Sample complexity, however, has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; 3) the purely high-dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exa-scale data dimension. We illustrate this high-dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that are of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high-dimensional learning rates and sample complexity for different structured covariance models and different inference tasks. PMID:27087700

  16. A structured interface to the object-oriented genomics unified schema for XML-formatted data.

    PubMed

    Clark, Terry; Jurek, Josef; Kettler, Gregory; Preuss, Daphne

    2005-01-01

    Data management systems are fast becoming required components in many biology laboratories as the role of computer-based information grows. Although the need for data management systems is on the rise, their inherent complexities can deter the full and routine use of their computational capabilities. The significant undertaking to implement a capable production system can be reduced in part by adapting an established data management system. In such a way, we are leveraging the Genomics Unified Schema (GUS) developed at the Computational Biology and Informatics Laboratory at the University of Pennsylvania as a foundation for managing and analysing DNA sequence data in centromere research projects around Arabidopsis thaliana and related species. Because GUS provides a core schema that includes support for genome sequences, mRNA and its expression, and annotated chromosomes, it is ideal for synthesising a variety of parameters to analyse these repetitive and highly dynamic portions of the genome. Despite this, production-strength data management frameworks are complex, requiring dedicated efforts to adapt and maintain. The work reported in this article addresses one component of such an effort, namely the pivotal task of marshalling data from various sources into GUS. In order to harness GUS for our project, and motivated by efficiency needs, we developed a structured framework for transferring data into GUS from outside sources. This technology is embodied in a GUS object-layer processor, XMLGUS. XMLGUS facilitates incorporating data into GUS by (i) formulating an XML interface that includes relational database key constraint definitions, (ii) regularising traversal through that XML, (iii) realising automatic processing of the XML with database key constraints and (iv) allowing for special processing of input data within the framework for automated processing. The application of XMLGUS to production pipeline processing for a sequencing project and inputting the Arabidopsis genome into GUS is discussed. XMLGUS is available from the Flora website (http://flora.ittc.ku.edu/).
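
    A toy sketch of the marshalling idea described above (hypothetical schema and helper, not XMLGUS itself): walk structured XML and emit parent rows before child rows so that database key constraints are satisfied on insertion.

    ```python
    import xml.etree.ElementTree as ET

    doc = ET.fromstring("""
    <sequence id="seq1" organism="Arabidopsis thaliana">
      <feature id="f1" type="centromere" start="100" end="900"/>
    </sequence>
    """)

    rows = []

    def walk(node, parent_id=None):
        row = dict(node.attrib, table=node.tag)
        if parent_id is not None:
            row["parent_id"] = parent_id      # foreign key to the parent row
        rows.append(row)                      # parent emitted before children
        for child in node:
            walk(child, node.attrib["id"])

    walk(doc)
    for r in rows:
        print(r)                              # rows arrive in key-safe order
    ```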

  17. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining

    DOE PAGES

    Hero, Alfred O.; Rajaratnam, Bala

    2015-12-09

    When can reliable inference be drawn in the ‘‘Big Data’’ context? This article presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large-scale inference. In large-scale data applications like genomics, connectomics, and eco-informatics, the data set is often variable rich but sample starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for ‘‘Big Data.’’ Sample complexity, however, has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; and 3) the purely high-dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exa-scale data dimension. We illustrate this high-dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that are of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high-dimensional learning rates and sample complexity for different structured covariance models and different inference tasks.

  18. Qualitative insights into practice time management: does 'patient-centred time' in practice management offer a portal to improved access?

    PubMed Central

    Buetow, S; Adair, V; Coster, G; Hight, M; Gribben, B; Mitchell, E

    2002-01-01

    BACKGROUND: Different sets of literature suggest how aspects of practice time management can limit access to general practitioner (GP) care. Researchers have not organised this knowledge into a unified framework that can enhance understanding of barriers to, and opportunities for, improved access. AIM: To suggest a framework conceptualising how differences in professional and cultural understanding of practice time management in Auckland, New Zealand, influence access to GP care for children with chronic asthma. DESIGN OF STUDY: A qualitative study involving selective sampling, semi-structured interviews on barriers to access, and a general inductive approach. SETTING: Twenty-nine key informants and ten mothers of children with chronic, moderate to severe asthma and poor access to GP care in Auckland. METHOD: Development of a framework from themes describing barriers associated with, and needs for, practice time management. The themes were independently identified by two authors from transcribed interviews and confirmed through informant checking. Themes from key informant and patient interviews were triangulated with each other and with published literature. RESULTS: The framework distinguishes 'practice-centred time' from 'patient-centred time.' A predominance of 'practice-centred time' and an unmet opportunity for 'patient-centred time' are suggested by the persistence of five barriers to accessing GP care: limited hours of opening; traditional appointment systems; practice intolerance of missed appointments; long waiting times in the practice; and inadequate consultation lengths. None of the barriers is specific to asthmatic children. CONCLUSION: A unified framework was suggested for understanding how the organisation of practice work time can influence access to GP care by groups including asthmatic children. PMID:12528583

  19. Models for evaluating the performability of degradable computing systems

    NASA Technical Reports Server (NTRS)

    Wu, L. T.

    1982-01-01

    Recent advances in multiprocessor technology have established the need for unified methods of evaluating computing system performance and reliability. In response to this modeling need, a general modeling framework that permits the modeling, analysis, and evaluation of degradable computing systems is considered. Within this framework, several user-oriented performance variables are identified and shown to be proper generalizations of the traditional notions of system performance and reliability. Furthermore, a time-varying version of the model is developed to generalize the traditional fault-tree reliability evaluation methods for phased missions.
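
    A toy numerical sketch of a performability-style measure in the spirit described above (hypothetical transition and reward values): a degradable system modelled as a discrete-time Markov chain with a performance reward per state, where performability is the expected reward accumulated over a mission.

    ```python
    import numpy as np

    P = np.array([[0.95, 0.04, 0.01],   # full     -> full/degraded/failed
                  [0.00, 0.90, 0.10],   # degraded -> degraded/failed
                  [0.00, 0.00, 1.00]])  # failed (absorbing)
    reward = np.array([1.0, 0.5, 0.0])  # useful work per step in each state

    dist = np.array([1.0, 0.0, 0.0])    # start in the fully working state
    total = 0.0
    for _ in range(100):                # 100-step mission
        total += dist @ reward          # expected reward earned this step
        dist = dist @ P                 # evolve the state distribution
    print(total)                        # expected useful work over the mission
    ```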

  20. Einstein-Yang-Mills-Dirac systems from the discretized Kaluza-Klein theory

    NASA Astrophysics Data System (ADS)

    Wali, Kameshwar; Viet, Nguyen Ai

    2017-01-01

    A unified theory of the non-Abelian gauge interactions with gravity in the framework of a discretized Kaluza-Klein theory is constructed with a modified Dirac operator and wedge product. All the couplings of chiral spinors to the non-Abelian gauge fields emerge naturally as components of the coupling of the chiral spinors in the generalized gravity together with some new interactions. In particular, the currently prevailing gravity-QCD quark and gravity-electroweak-quark and lepton models are shown to follow as special cases of the general framework.

  1. A unified framework for evaluating the risk of re-identification of text de-identification tools.

    PubMed

    Scaiano, Martin; Middleton, Grant; Arbuckle, Luk; Kolhatkar, Varada; Peyton, Liam; Dowling, Moira; Gipson, Debbie S; El Emam, Khaled

    2016-10-01

    It has become regular practice to de-identify unstructured medical text for use in research using automatic methods, the goal of which is to remove patient-identifying information so as to minimize re-identification risk. The metrics commonly used to determine whether these systems perform well do not accurately reflect the risk of a patient being re-identified. We therefore developed a framework for measuring the risk of re-identification associated with textual data releases. We apply the proposed evaluation framework to a data set from the University of Michigan Medical School. Our risk assessment results are then compared with those that would be obtained using a typical contemporary micro-average evaluation of recall, in order to illustrate the difference between the proposed evaluation framework and the current baseline method. We demonstrate how this framework compares against common measures of the re-identification risk associated with an automated text de-identification process. Using our evaluation framework, we obtained a mean probability of re-identification of 0.0074 for direct identifiers and 0.0022 for quasi-identifiers. The 95% confidence intervals for these estimates were below the relevant thresholds. The threshold for direct identifier risk was based on previously used approaches in the literature. The threshold for quasi-identifiers was determined from the context of the data release, following commonly used de-identification criteria for structured data. Our framework attempts to correct for poorly distributed evaluation corpora, accounts for the data release context, and avoids the often optimistic assumptions made in the more traditional evaluation approach. It therefore provides a more realistic estimate of the true probability of re-identification. This framework should be used as a basis for computing re-identification risk in order to more realistically evaluate future text de-identification tools. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
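
    A toy contrast between micro-averaged recall and a risk-style metric of the kind the framework advocates (invented numbers): recall treats every missed identifier equally, whereas a risk measure weights each leak by the chance it actually re-identifies someone, here approximated as one over the size of the matching population group.

    ```python
    found = [1, 1, 1, 0, 1, 0]        # 1 = identifier caught by the de-identifier
    group = [50, 10, 2, 2, 100, 5]    # matching population group size per leak

    recall = sum(found) / len(found)
    risk = sum((1 - f) / g for f, g in zip(found, group)) / len(found)
    print(recall, risk)               # the same recall can hide very different risk
    ```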

  2. The Development of Cadastral Domain Model Oriented at Unified Real Estate Registration of China Based on Ontology

    NASA Astrophysics Data System (ADS)

    Li, M.; Zhu, X.; Shen, C.; Chen, D.; Guo, W.

    2012-07-01

    With unified real estate registration established by the Property Law and the step-by-step advance of coordinated urban and rural development in China, clearly specifying property rights and their relations is the premise and foundation for promoting the integrated management of urban and rural land. This paper aims at developing a cadastral domain model oriented to the unified real estate registration of China from both legal and spatial perspectives, which sets up the foundation for unified real estate registration and facilitates the effective interchange of cadastral information and the administration of land use. The legal cadastral model is derived from an analysis of the gap between the current model and the demands of unified real estate registration, and makes explicit the restrictions between different rights. The new cadastral domain model is then constructed based on the legal cadastral domain model and the CCDM (van Oosterom et al., 2006), integrating the real estate rights of urban land and rural land. Finally, the model is validated by a prototype system. The results show that the model is applicable to unified real estate registration in China.

  3. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can introduce uncertainties into the decision models, and the failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. It is therefore essential to find solutions that are insensitive to uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements that vary over time had been studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning, which considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all of the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework for more general uncertainty conditions that allows the problems to be solved in a more systematic way. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out in solving the problem. This motivates us to seek a generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problems under uncertainties. Robust optimization was first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets for the parameters of the optimization model. Even though robust optimization originated in the treatment of uncertainty in the optimization process itself, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. We then demonstrate the capabilities of the proposed framework by giving examples of how it can be applied to common network planning problems in uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude with some thoughts on future directions for applying this framework to other network planning problems.
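
    A minimal sketch of the robust optimization idea (hypothetical three-link example with box uncertainty in demand, not from the paper): because a robust solution must remain feasible for every demand in the uncertainty set, the robust counterpart of a box-uncertain capacity constraint simply uses the upper end of each demand range.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([1.0, 2.5, 1.8])             # per-unit capacity cost, 3 links
    demand_nominal = np.array([10.0, 4.0, 7.0])  # projected traffic
    demand_dev = np.array([2.0, 1.5, 3.0])       # maximum deviation (box radius)

    # Robust counterpart: capacity_i >= nominal_i + dev_i for every link
    worst_case = demand_nominal + demand_dev
    res = linprog(c=cost, bounds=[(w, None) for w in worst_case])
    print(res.x)                                 # robust capacities: [12. 5.5 10.]
    ```

    For richer uncertainty sets (ellipsoidal, budgeted), the worst case is no longer attained at a single corner, and the robust counterpart becomes a conic program rather than a plain LP.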

  4. A unifying framework for ghost-free Lorentz-invariant Lagrangian field theories

    NASA Astrophysics Data System (ADS)

    Li, Wenliang

    2018-04-01

    We propose a framework for Lorentz-invariant Lagrangian field theories where Ostrogradsky's scalar ghosts could be absent. A key ingredient is the generalized Kronecker delta. The general Lagrangians are reformulated in the language of differential forms. The absence of higher order equations of motion for the scalar modes stems from the basic fact that every exact form is closed. The well-established Lagrangian theories for spin-0, spin-1, p-form, spin-2 fields have natural formulations in this framework. We also propose novel building blocks for Lagrangian field theories. Some of them are novel nonlinear derivative terms for spin-2 fields. It is nontrivial that Ostrogradsky's scalar ghosts are absent in these fully nonlinear theories.
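
    For reference, the key ingredient named above has the standard textbook definition (not quoted from the paper): the generalized Kronecker delta is the determinant of ordinary deltas, equivalently a fully antisymmetrized product,

    ```latex
    \delta^{\mu_1 \cdots \mu_n}_{\nu_1 \cdots \nu_n}
    = \det
    \begin{pmatrix}
    \delta^{\mu_1}_{\nu_1} & \cdots & \delta^{\mu_1}_{\nu_n} \\
    \vdots & \ddots & \vdots \\
    \delta^{\mu_n}_{\nu_1} & \cdots & \delta^{\mu_n}_{\nu_n}
    \end{pmatrix}
    = n! \, \delta^{\mu_1}_{[\nu_1} \cdots \delta^{\mu_n}_{\nu_n]} ,
    ```

    and its total antisymmetry is what keeps the field equations at second order in constructions of this type.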

  5. Computational Design for Multifunctional Microstructural Composites

    NASA Astrophysics Data System (ADS)

    Chen, Yuhang; Zhou, Shiwei; Li, Qing

    As an important class of natural and engineered materials, periodic microstructural composites have drawn substantial attention from the material research community for their excellent flexibility in tailoring various desirable physical behaviors. To develop periodic cellular composites for multifunctional applications, this paper presents a unified design framework for combining stiffness and a range of physical properties governed by quasi-harmonic partial differential equations. A multiphase microstructural configuration is sought within a periodic base-cell design domain using topology optimization. To deal with conflicting properties, e.g. conductivity/permeability versus bulk modulus, the optimum is sought in a Pareto sense. Illustrative examples demonstrate the capability of the presented procedure for the design of multiphysical composites and tissue scaffolds.
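
    One generic way to trace the Pareto front mentioned above (a standard weighted-sum formulation, not necessarily the authors' exact objective) is to sweep a weight w between the normalized bulk modulus K and the competing transport property k of the base cell, subject to a volume constraint on the element densities:

    ```latex
    \max_{\boldsymbol{\rho}} \; w \, \frac{K(\boldsymbol{\rho})}{K^{0}}
    + (1 - w) \, \frac{k(\boldsymbol{\rho})}{k^{0}},
    \qquad \text{s.t.} \quad \sum_{e} \rho_e \, v_e \le V,
    \quad 0 \le \rho_e \le 1 ,
    ```

    where K^0 and k^0 are normalizing reference values and each choice of w in [0, 1] yields one point on the Pareto front.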

  6. Algorithm Building and Learning Programming Languages Using a New Educational Paradigm

    NASA Astrophysics Data System (ADS)

    Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel

    2011-08-01

    This research paper presents a new concept of using a single tool to associate the syntax of various programming languages, algorithms, and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms and implement them in various programming languages. The tool provides an innovative, unified graphical user interface for the development of multimedia objects, educational games, and applications. It also aids collaborative learning among students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation that enables students to learn the basics of programming languages using drag-and-drop methods for image objects.
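
    A toy sketch of the syntax-association idea (hypothetical templates, not the tool's actual implementation): a single language-neutral block is mapped onto the concrete syntax of several target languages, which is the kind of table a drag-and-drop code generator maintains.

    ```python
    TEMPLATES = {
        "python": "print({arg})",
        "c":      'printf("%s\\n", {arg});',
        "java":   "System.out.println({arg});",
    }

    def generate(block, language):
        """Render one language-neutral block in the chosen target language."""
        return TEMPLATES[language].format(arg=block["arg"])

    print(generate({"arg": '"hello"'}, "java"))   # System.out.println("hello");
    ```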

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lue Xing; Sun Kun; Wang Pan

    Within the framework of Bell-polynomial manipulations, three single-field bilinearizable equations are investigated: the (1+1)-dimensional shallow water wave model, the Boiti-Leon-Manna-Pempinelli model, and the (2+1)-dimensional Sawada-Kotera model. Based on the concept of scale invariance, a direct and unifying Bell-polynomial scheme is employed to obtain the Baecklund transformations and Lax pairs associated with these three soliton equations. Note that the Bell-polynomial expressions and Bell-polynomial-typed Baecklund transformations for these three soliton equations can be, respectively, cast into the bilinear equations and bilinear Baecklund transformations with symbolic computation. Consequently, it is also shown that the Bell-polynomial-typed Baecklund transformations can be linearized into the corresponding Lax pairs.
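
    For context, the one-dimensional Bell polynomials underlying such manipulations have the standard definition (textbook material, not quoted from the record):

    ```latex
    Y_{nx}(f) \equiv e^{-f} \, \partial_x^{\,n} \, e^{f},
    \qquad
    Y_{x} = f_x, \quad
    Y_{2x} = f_{2x} + f_x^{2}, \quad
    Y_{3x} = f_{3x} + 3 f_x f_{2x} + f_x^{3} ,
    ```

    and the scale-invariance argument works because, under the substitution f = 2 ln F, even-order combinations of these polynomials reduce to Hirota bilinear derivatives of F·F divided by F².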

  8. General Multivariate Linear Modeling of Surface Shapes Using SurfStat

    PubMed Central

    Chung, Moo K.; Worsley, Keith J.; Nacewicz, Brendon M.; Dalton, Kim M.; Davidson, Richard J.

    2010-01-01

    Although there are many imaging studies on traditional ROI-based amygdala volumetry, there are very few studies on modeling amygdala shape variations. This paper presents a unified computational and statistical framework for modeling amygdala shape variations in a clinical population. The weighted spherical harmonic representation is used to parameterize, smooth, and normalize amygdala surfaces. The representation is subsequently used as an input for multivariate linear models accounting for nuisance covariates, such as age and brain size differences, using the SurfStat package, which completely avoids the complexity of specifying design matrices. The methodology has been applied to quantify abnormal local amygdala shape variations in 22 high-functioning autistic subjects. PMID:20620211
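
    The weighted spherical harmonic representation referred to above takes the general form (as described in the first author's related publications; sigma is a smoothing bandwidth and c_lm are the spherical harmonic coefficients of the surface coordinates):

    ```latex
    f(\theta, \varphi) \;\approx\; \sum_{l=0}^{L} \sum_{m=-l}^{l}
    e^{-l(l+1)\sigma} \, c_{lm} \, Y_{lm}(\theta, \varphi) ,
    ```

    so a single expansion simultaneously parameterizes the surface and applies heat-kernel smoothing, with larger sigma damping high-frequency shape detail.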

  9. Consistent multiphysics simulation of a central tower CSP plant as applied to ISTORE

    NASA Astrophysics Data System (ADS)

    Votyakov, Evgeny V.; Papanicolas, Costas N.

    2017-06-01

    We present a unified consistent multiphysics approach to model a central tower CSP plant. The framework for the model includes Monte Carlo ray tracing (RT) and computational fluid dynamics (CFD) components utilizing the OpenFOAM C++ software library. The RT part works effectively with complex surfaces of engineering design given in CAD formats. The CFD simulation, which is based on 3D Navier-Stokes equations, takes into account all possible heat transfer mechanisms: radiation, conduction, and convection. Utilizing this package, the solar field of the experimental Platform for Research, Observation, and TEchnological Applications in Solar Energy (PROTEAS) and the Integrated STOrage and Receiver (ISTORE), developed at the Cyprus Institute, are being examined.

  10. Teaching Tree-Thinking to Undergraduate Biology Students

    PubMed Central

    2011-01-01

    Evolution is the unifying principle of all biology, and understanding how evolutionary relationships are represented is critical for a complete understanding of evolution. Phylogenetic trees are the most conventional tool for displaying evolutionary relationships, and “tree-thinking” has been coined as a term to describe the ability to conceptualize evolutionary relationships. Students often lack tree-thinking skills, and developing those skills should be a priority of biology curricula. Many common student misconceptions have been described, and a successful instructor needs a suite of tools for correcting those misconceptions. I review the literature on teaching tree-thinking to undergraduate students and suggest how this material can be presented within an inquiry-based framework. PMID:21572571

  11. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics

    NASA Astrophysics Data System (ADS)

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.
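
    A minimal CPU sketch of the validation setting mentioned above (assumed optical coefficients; not the authors' P2P/GPU code): estimate total diffuse reflectance from a semi-infinite, isotropically scattering medium by a weighted random walk.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu_a, mu_s = 0.5, 10.0                 # absorption/scattering [1/mm], assumed
    albedo = mu_s / (mu_a + mu_s)

    def diffuse_reflectance(n_photons=5000):
        reflected = 0.0
        for _ in range(n_photons):
            z, uz, w = 0.0, 1.0, 1.0       # depth, direction cosine, photon weight
            while w > 1e-3:                # terminate at a small weight cutoff
                s = -np.log(rng.random()) / (mu_a + mu_s)  # free path length
                z += uz * s
                if z <= 0.0:               # photon escapes through the surface
                    reflected += w
                    break
                w *= albedo                # deposit a fraction of the weight
                uz = 2.0 * rng.random() - 1.0  # isotropic scattering direction
        return reflected / n_photons

    print(diffuse_reflectance())           # compare against adding-doubling values
    ```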

  12. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics.

    PubMed

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies, such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC was validated by comparing the results of simulation of diffuse reflectance and fluence rate distribution for a semi-infinite scattering medium with known analytical results, results of the adding-doubling method, and other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing, while double-precision computing for floating-point arithmetic operations provides higher accuracy.

  13. Unified Least Squares Methods for the Evaluation of Diagnostic Tests With the Gold Standard

    PubMed Central

    Tang, Liansheng Larry; Yuan, Ao; Collins, John; Che, Xuan; Chan, Leighton

    2017-01-01

    The article proposes a unified least squares method to estimate the receiver operating characteristic (ROC) parameters for continuous and ordinal diagnostic tests, such as cancer biomarkers. The method is based on a linear model framework using the empirically estimated sensitivities and specificities as input “data.” It gives consistent estimates for regression and accuracy parameters when the underlying continuous test results are normally distributed after some monotonic transformation. The key difference between the proposed method and the method of Tang and Zhou lies in the response variable. The response variable in the latter is the transformed empirical ROC curve at different thresholds. It takes on many values for continuous test results, but few values for ordinal test results. The limited number of values for the response variable makes it impractical for ordinal data. However, the response variable in the proposed method takes on many more distinct values, so the method yields valid estimates for ordinal data. Extensive simulation studies are conducted to investigate and compare the finite sample performance of the proposed method with an existing method, and the method is then used to analyze two real cancer diagnostic examples as an illustration. PMID:28469385
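
    A minimal sketch of a least squares fit of this kind under an assumed binormal ROC model (invented operating points; not the authors' exact estimator): regress the probit of the empirical sensitivities on the probit of the empirical false positive rates.

    ```python
    import numpy as np
    from scipy.stats import norm

    # hypothetical empirical (FPR, TPR) operating points at several thresholds
    fpr = np.array([0.05, 0.10, 0.20, 0.35, 0.50])
    tpr = np.array([0.40, 0.55, 0.72, 0.85, 0.92])

    # a binormal ROC is linear on probit axes: probit(TPR) = a + b * probit(FPR)
    X = np.column_stack([np.ones_like(fpr), norm.ppf(fpr)])
    y = norm.ppf(tpr)
    (a, b), *_ = np.linalg.lstsq(X, y, rcond=None)

    auc = norm.cdf(a / np.sqrt(1.0 + b**2))   # closed-form binormal AUC
    print(a, b, auc)
    ```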

  14. A unified approach for debugging is-a structure and mappings in networked taxonomies

    PubMed Central

    2013-01-01

    Background With the increased use of ontologies and ontology mappings in semantically-enabled applications such as ontology-based search and data integration, the issue of detecting and repairing defects in ontologies and ontology mappings has become increasingly important. These defects can lead to wrong or incomplete results for the applications. Results We propose a unified framework for debugging the is-a structure of and mappings between taxonomies, the most used kind of ontologies. We present theory and algorithms as well as an implemented system RepOSE, that supports a domain expert in detecting and repairing missing and wrong is-a relations and mappings. We also discuss two experiments performed by domain experts: an experiment on the Anatomy ontologies from the Ontology Alignment Evaluation Initiative, and a debugging session for the Swedish National Food Agency. Conclusions Semantically-enabled applications need high quality ontologies and ontology mappings. One key aspect is the detection and removal of defects in the ontologies and ontology mappings. Our system RepOSE provides an environment that supports domain experts to deal with this issue. We have shown the usefulness of the approach in two experiments by detecting and repairing circa 200 and 30 defects, respectively. PMID:23548155
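
    A toy sketch of one detection idea behind such debugging (hypothetical taxonomy fragments, not the RepOSE algorithm itself): if two concepts are mapped across taxonomies and an is-a relation is derivable on one side but missing on the other, flag a candidate defect for the domain expert to validate.

    ```python
    isa_1 = {("nail", "finger_part")}       # is-a edges in taxonomy 1
    isa_2 = set()                           # taxonomy 2 lacks the matching edge
    mappings = {("nail", "Nail"), ("finger_part", "FingerPart")}

    m = dict(mappings)                      # taxonomy-1 concept -> mapped concept
    for a, b in isa_1:
        if a in m and b in m and (m[a], m[b]) not in isa_2:
            print(f"candidate missing is-a in taxonomy 2: {m[a]} -> {m[b]}")
    ```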

  15. General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.

    PubMed

    Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng

    2017-05-02

    As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.

  16. A cell-based computational model of early embryogenesis coupling mechanical behaviour and gene regulation

    NASA Astrophysics Data System (ADS)

    Delile, Julien; Herrmann, Matthieu; Peyriéras, Nadine; Doursat, René

    2017-01-01

    The study of multicellular development is grounded in two complementary domains: cell biomechanics, which examines how physical forces shape the embryo, and genetic regulation and molecular signalling, which concern how cells determine their states and behaviours. Integrating both sides into a unified framework is crucial to fully understand the self-organized dynamics of morphogenesis. Here we introduce MecaGen, an integrative modelling platform enabling the hypothesis-driven simulation of these dual processes via the coupling between mechanical and chemical variables. Our approach relies upon a minimal `cell behaviour ontology' comprising mesenchymal and epithelial cells and their associated behaviours. MecaGen enables the specification and control of complex collective movements in 3D space through a biologically relevant gene regulatory network and parameter space exploration. Three case studies investigating pattern formation, epithelial differentiation and tissue tectonics in zebrafish early embryogenesis, the latter with quantitative comparison to live imaging data, demonstrate the validity and usefulness of our framework.

  17. The Aeroacoustics of Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Goldstein, M. E.

    2008-01-01

    Aerodynamic noise prediction has been an important and challenging research area since James Lighthill first introduced his acoustic analogy approach over fifty years ago. This talk attempts to provide a unified framework for the subsequent theoretical developments in this field. It assumes that there is no single approach that is optimal in all situations and uses the framework as a basis for discussing the strengths and weaknesses of the various approaches to this topic. The emphasis here is on the important problem of predicting the noise from high-speed air jets. Specific results will be presented for round jets in the 0.5 to 1.4 Mach number range and compared with experimental data taken on the Glenn SHAR rig. It is demonstrated that nonparallel mean flow effects play an important role in predicting the noise at the supersonic Mach numbers. The results explain the failure of previous attempts based on the parallel-flow Lilley model (which has served as the foundation for most jet noise analyses during the past two decades).
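
    The common starting point for the approaches surveyed is Lighthill's exact rearrangement of the Navier-Stokes equations into an inhomogeneous wave equation (standard form, not quoted from the talk):

    ```latex
    \left( \frac{\partial^{2}}{\partial t^{2}} - c_{0}^{2} \nabla^{2} \right) \rho'
    = \frac{\partial^{2} T_{ij}}{\partial x_i \, \partial x_j},
    \qquad
    T_{ij} = \rho u_i u_j + \left( p' - c_{0}^{2} \rho' \right) \delta_{ij} - \tau_{ij} ,
    ```

    where rho' and p' are the density and pressure fluctuations, c_0 is the ambient sound speed, and T_ij is the Lighthill stress tensor; the various analogies differ mainly in how the right-hand "source" side is modelled.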

  18. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

    Reasoning about degrees of belief has been investigated in the past by a number of authors and has a number of practical applications in real life. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of belief do not need to be provided externally, but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.
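
    A toy sketch of the computationally grounded idea (hypothetical model, not the paper's tool): the degree of belief of an agent can be computed as the fraction of reachable states it cannot distinguish from the current one in which the proposition holds.

    ```python
    reachable = ["s0", "s1", "s2", "s3"]
    obs = {"s0": "o1", "s1": "o1", "s2": "o1", "s3": "o2"}     # agent's observation
    holds = {"s0": True, "s1": False, "s2": True, "s3": True}  # does phi hold?

    def degree_of_belief(state):
        """Fraction of observationally indistinguishable states satisfying phi."""
        same = [s for s in reachable if obs[s] == obs[state]]
        return sum(holds[s] for s in same) / len(same)

    print(degree_of_belief("s0"))   # 2/3: phi holds in 2 of the 3 candidates
    ```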

  19. Unsupervised Deep Hashing With Pseudo Labels for Scalable Image Retrieval.

    PubMed

    Zhang, Haofeng; Liu, Li; Long, Yang; Shao, Ling

    2018-04-01

    In order to achieve efficient similarity searching, hash functions are designed to encode images into low-dimensional binary codes with the constraint that similar features will have a short distance in the projected Hamming space. Recently, deep learning-based methods have become more popular and outperform traditional non-deep methods. However, without label information, most state-of-the-art unsupervised deep hashing (DH) algorithms suffer from severe performance degradation. One of the main reasons is that the ad-hoc encoding process cannot properly capture the visual feature distribution. In this paper, we propose a novel unsupervised framework that has two main contributions: 1) we convert the unsupervised DH model into a supervised one by discovering pseudo-labels; 2) the framework unifies likelihood maximization, mutual information maximization, and quantization error minimization so that the pseudo-labels can maximally preserve the distribution of visual features. Extensive experiments on three popular data sets demonstrate the advantages of the proposed method, which leads to significant performance improvement over state-of-the-art unsupervised hashing algorithms.
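
    A minimal sketch of an analogous pseudo-label pipeline (assumed stand-in features and a simplified linear hash layer; not the authors' exact method): discover pseudo-labels by clustering, then fit a projection whose sign gives binary codes aligned with the cluster structure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    feats = rng.normal(size=(200, 64))        # stand-in for deep visual features

    # step 1: pseudo-labels from a tiny hand-rolled k-means
    k = 4
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(10):
        labels = ((feats[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.stack([feats[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])

    # step 2: least squares hash layer toward one target code per pseudo-class
    bits = 16
    targets = np.sign(rng.normal(size=(k, bits)))
    W, *_ = np.linalg.lstsq(feats, targets[labels], rcond=None)
    codes = np.sign(feats @ W)                # final binary codes, one row per image
    print(codes.shape)
    ```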

  20. Development Roadmap of an Evolvable and Extensible Multi-Mission Telecom Planning and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming; Tung, Ramona H.; Lee, Charles H.

    2003-01-01

    In this paper, we describe the development roadmap for, and discuss the various challenges of, an evolvable and extensible multi-mission telecom planning and analysis framework. Our long-term goal is to develop a set of powerful, flexible telecommunications analysis tools that can be easily adapted to different missions while maintaining the common deep space communication requirements. The ability to re-use the DSN ground models and the common software utilities in our adaptations has contributed significantly to our development efforts, measured in terms of consistency, accuracy, and minimal redundant effort, which translates into shorter development time and major cost savings for the individual missions. In our roadmap, we address the design principles, technical achievements, and associated challenges of the following telecom analysis tools: (i) Telecom Forecaster Predictor (TFP); (ii) Unified Telecom Predictor (UTP); (iii) Generalized Telecom Predictor (GTP); (iv) Generic TFP; (v) Web-based TFP; (vi) Application Program Interface (API); (vii) Mars Relay Network Planning Tool (MRNPT).
