Sample records for unified process model

  1. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling; BPMN Business Process Modeling Notation; SoA Service-oriented Architecture; UML Unified Modeling Language; CSP...system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), and model-driven architecture

  2. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied throughout this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be discussed.

  3. Parametric models to relate spike train and LFP dynamics with neural information processing.

    PubMed

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework to decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set in which significant correlations had previously been obtained only through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial-by-trial behavioral performance than existing models of neural information processing. Our results highlight the utility of the unified modeling framework for characterizing spike-LFP recordings obtained during behavioral performance.
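    As a toy illustration of this kind of parametric latency model (a sketch only: the rates, grid resolution, and function names below are illustrative assumptions, not the authors' actual spike-field model), one can simulate a spike train whose firing rate steps up at an onset latency and then recover that latency by maximum likelihood:

```python
import numpy as np

def simulate_trial(latency, bg_rate, resp_rate, T=1.0, dt=0.001, seed=0):
    """Simulate one trial as a Bernoulli approximation to an inhomogeneous
    Poisson spike train: background rate before the onset latency,
    background plus response rate after it."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, T, dt)
    rate = np.where(t < latency, bg_rate, bg_rate + resp_rate)
    spikes = rng.random(t.size) < rate * dt
    return t, spikes

def estimate_latency(t, spikes, bg_rate, resp_rate, dt=0.001):
    """Maximum-likelihood onset estimate by grid search over candidate
    latencies for the piecewise-constant rate model above."""
    best_lat, best_ll = t[0], -np.inf
    for lat in t[::10]:
        rate = np.where(t < lat, bg_rate, bg_rate + resp_rate)
        p = rate * dt
        ll = np.sum(np.where(spikes, np.log(p), np.log(1.0 - p)))
        if ll > best_ll:
            best_lat, best_ll = lat, ll
    return best_lat

t, spikes = simulate_trial(latency=0.4, bg_rate=5.0, resp_rate=60.0)
est = estimate_latency(t, spikes, bg_rate=5.0, resp_rate=60.0)
print(round(float(est), 2))
```

    The unified models in the paper additionally model the LFP and the background activity jointly; this sketch covers only the spike-onset piece.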

  4. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.

    PubMed

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-07-02

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of these data, it is difficult to predict outcomes from them. It is therefore necessary to combine the diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset with a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach that can easily resolve the problems of unified data modeling and of overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset, and physical activity data collected using different sensors. To demonstrate the significance of the unified dataset, we adopted a well-known rough-set-theory-based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that, on average, the tool reduces the time and effort of the experts and knowledge engineer in creating unified datasets by 94.1%.
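    The priority-based resolution of overlapping attributes can be sketched in a few lines (a minimal illustration only; the source names, attribute values, and the `merge_records` helper are hypothetical, not part of the GUDM tool):

```python
def merge_records(sources, priority):
    """Merge per-patient records from several sources into one unified record.
    When the same attribute appears in several sources, the value from the
    source ranked highest in `priority` wins (user-centric priority rule)."""
    merged = {}
    # Apply lower-priority sources first so higher-priority values overwrite them.
    for name in sorted(sources, key=lambda s: priority.index(s), reverse=True):
        merged.update(sources[name])
    return merged

# Hypothetical diabetes-style records from three source types
sources = {
    "clinical": {"glucose": 7.8, "hba1c": 6.9},
    "sensor":   {"steps": 5400, "glucose": 7.1},
    "social":   {"mood": "ok"},
}
unified = merge_records(sources, priority=["clinical", "sensor", "social"])
print(unified["glucose"])  # the clinical value wins over the sensor value
```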

  5. Evidence accumulation in decision making: unifying the "take the best" and the "rational" models.

    PubMed

    Lee, Michael D; Cummins, Tarrant D R

    2004-04-01

    An evidence accumulation model of forced-choice decision making is proposed to unify the fast and frugal take the best (TTB) model and the alternative rational (RAT) model with which it is usually contrasted. The basic idea is to treat the TTB model as a sequential-sampling process that terminates as soon as any evidence in favor of a decision is found, and the rational approach as a sequential-sampling process that terminates only when all available information has been assessed. The unified TTB and RAT models were tested in an experiment in which participants learned to make correct judgments for a set of real-world stimuli on the basis of feedback, and were then asked to make additional judgments without feedback for cases in which the TTB and the rational models made different predictions. The results show strong intraparticipant consistency in the use of either the TTB or the rational model, but large interparticipant differences in which model was used. The unified model is shown to be able to capture the differences in decision making across participants in an interpretable way, and is preferred by the minimum description length model selection criterion.
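    The contrast between the two stopping rules can be sketched directly (an illustrative reduction of the idea; the cue values, validities, and function names are invented for the example):

```python
def ttb_decide(cues_a, cues_b, validities):
    """Take-the-best: inspect cues in decreasing order of validity and stop
    at the first cue that discriminates between the two options."""
    order = sorted(range(len(validities)), key=lambda i: -validities[i])
    for i in order:
        if cues_a[i] != cues_b[i]:
            return "A" if cues_a[i] > cues_b[i] else "B"
    return "tie"

def rat_decide(cues_a, cues_b, validities):
    """'Rational' rule: accumulate validity-weighted evidence from ALL cues
    before terminating (the sampling process runs to exhaustion)."""
    evidence = sum(v * (a - b) for v, a, b in zip(validities, cues_a, cues_b))
    return "A" if evidence > 0 else "B" if evidence < 0 else "tie"

# A case constructed so the two rules disagree: the single best cue favors A,
# but the summed evidence of the remaining cues favors B.
a, b = [1, 0, 0], [0, 1, 1]
vals = [0.9, 0.7, 0.6]
ttb = ttb_decide(a, b, vals)
rat = rat_decide(a, b, vals)
print(ttb, rat)
```

    Cases of exactly this kind, where the two rules predict different choices, are the ones the experiment used to separate participants.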

  6. Unified Science Approach K-12, Proficiency Levels 7-12.

    ERIC Educational Resources Information Center

    Oickle, Eileen M., Ed.

    Presented is the second part of the K-12 unified science materials used in the public schools of Anne Arundel County, Maryland. Detailed descriptions are made of the roles of students and teachers, purposes of the bibliography, major concepts in unified science, processes of inquiry, a scheme and model for scientific literacy, and program…

  7. Unified Science Approach K-12, Proficiency Levels 1-6.

    ERIC Educational Resources Information Center

    Oickle, Eileen M., Ed.

    Presented are first-revision materials of the K-12 unified science program implemented in the public schools of Anne Arundel County, Maryland. Detailed descriptions are given of the roles of students and teachers, purposes of bibliography, major concepts in unified science, processes of inquiry, scheme and model for scientific literacy, and…

  8. In Search of a Unified Model of Language Contact

    ERIC Educational Resources Information Center

    Winford, Donald

    2013-01-01

    Much previous research has pointed to the need for a unified framework for language contact phenomena -- one that would include social factors and motivations, structural factors and linguistic constraints, and psycholinguistic factors involved in processes of language processing and production. While Contact Linguistics has devoted a great deal…

  9. Unified Science Approach K-12, Proficiency Levels 13-21 and Semester Courses.

    ERIC Educational Resources Information Center

    Oickle, Eileen M., Ed.

    Presented is the third part of the K-12 unified science materials used in the public schools of Anne Arundel County, Maryland. Detailed descriptions are presented for the roles of students and teachers, purposes of bibliography, major concepts in unified science, processes of inquiry, scheme and model for scientific literacy, and program…

  10. Concentration-driven models revisited: towards a unified framework to model settling tanks in water resource recovery facilities.

    PubMed

    Torfs, Elena; Martí, M Carmen; Locatelli, Florent; Balemans, Sophie; Bürger, Raimund; Diehl, Stefan; Laurent, Julien; Vanrolleghem, Peter A; François, Pierre; Nopens, Ingmar

    2017-02-01

    A new perspective on the modelling of settling behaviour in water resource recovery facilities is introduced. The ultimate goal is to describe in a unified way the processes taking place both in primary settling tanks (PSTs) and secondary settling tanks (SSTs) for more detailed operation and control. First, experimental evidence is provided, pointing out distributed particle properties (such as size, shape, density, porosity, and flocculation state) as an important common source of distributed settling behaviour in different settling unit processes and throughout different settling regimes (discrete, hindered and compression settling). Subsequently, a unified model framework that considers several particle classes is proposed in order to describe distributions in settling behaviour as well as the effect of variations in particle properties on the settling process. The result is a set of partial differential equations (PDEs) that are valid from dilute concentrations, where they correspond to discrete settling, to concentrated suspensions, where they correspond to compression settling. Consequently, these PDEs model both PSTs and SSTs.
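    The hindered-settling part of such a PDE model can be sketched with a one-class, one-dimensional conservation law (a deliberate simplification that omits compression and the multiple particle classes of the paper; the Vesilind/Kynch-style flux and all parameter values are illustrative assumptions):

```python
import numpy as np

def settle_step(c, dz, dt, v0=2.0, c_max=10.0):
    """One explicit upwind step of the 1D settling conservation law
    dc/dt + d(f(c))/dz = 0 with a hindered-settling flux
    f(c) = v0 * c * (1 - c/c_max). z increases downward."""
    f = v0 * c * (1 - c / c_max)
    flux_in = np.concatenate(([0.0], f[:-1]))  # no flux through the surface
    flux_out = f.copy()
    flux_out[-1] = 0.0                         # no flux through the bottom
    return c + dt / dz * (flux_in - flux_out)

c = np.full(50, 3.0)  # uniform initial concentration over a 5 m column
for _ in range(200):
    c = settle_step(c, dz=0.1, dt=0.01)
print(round(c.sum() * 0.1, 6))  # total mass is conserved by the scheme
```

    The scheme is conservative by construction, so solids only redistribute downward; the full framework replaces the single flux function with class-resolved fluxes plus a compression term.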

  11. GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare

    PubMed Central

    Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung

    2015-01-01

    A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of these data, it is difficult to predict outcomes from them. It is therefore necessary to combine the diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and to generate a unified dataset with a “data modeler” tool. The proposed tool implements a user-centric, priority-based approach that can easily resolve the problems of unified data modeling and of overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources used to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset, and physical activity data collected using different sensors. To demonstrate the significance of the unified dataset, we adopted a well-known rough-set-theory-based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that, on average, the tool reduces the time and effort of the experts and knowledge engineer in creating unified datasets by 94.1%. PMID:26147731

  12. Modeling stroke rehabilitation processes using the Unified Modeling Language (UML).

    PubMed

    Ferrante, Simona; Bonacina, Stefano; Pinciroli, Francesco

    2013-10-01

    In organising and providing rehabilitation procedures for stroke patients, the usual need for many refinements makes it inappropriate to attempt rigid standardisation, but greater detail is required concerning workflow. The aim of this study was to build a model of the post-stroke rehabilitation process. The model, implemented in the Unified Modeling Language, was grounded on international guidelines and refined following the clinical pathway adopted at local level by a specialized rehabilitation centre. The model describes the organisation of rehabilitation delivery and facilitates the monitoring of recovery during the process. Indeed, software was developed and tested to support clinicians in the digital administration of clinical scales. The model's flexibility allows easy updating as the process evolves. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Using the Unified Modelling Language (UML) to guide the systemic description of biological processes and systems.

    PubMed

    Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille

    2004-07-01

    One of the main issues in Systems Biology is to deal with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) to describe and specify biological systems and processes. This yields unambiguous representations of biological systems that are suitable for translation into mathematical and computational formalisms, enabling analysis, simulation, and prediction of these systems' behaviours.

  14. A unified approach to computer analysis and modeling of spacecraft environmental interactions

    NASA Technical Reports Server (NTRS)

    Katz, I.; Mandell, M. J.; Cassidy, J. J.

    1986-01-01

    A new, coordinated, unified approach to the development of spacecraft plasma interaction models is proposed. The objective is to eliminate unnecessary duplicative work so that researchers can concentrate on the scientific aspects. By streamlining the development process, the interchange between theorists and experimentalists is enhanced, and the transfer of technology to the spacecraft engineering community is faster. This approach is called the UNIfied Spacecraft Interaction Model (UNISIM). UNISIM is a coordinated system of software, hardware, and specifications. It is a tool for modeling and analyzing spacecraft interactions. It will be used to design experiments, to interpret experimental results, and to aid in future spacecraft design. It breaks a spacecraft interaction analysis into several modules. Each module performs an analysis of some physical process, using phenomenology and algorithms that are well documented and have been subject to review. This system and its characteristics are discussed.

  15. Using Machine Learning as a fast emulator of physical processes within the Met Office's Unified Model

    NASA Astrophysics Data System (ADS)

    Prudden, R.; Arribas, A.; Tomlinson, J.; Robinson, N.

    2017-12-01

    The Unified Model is a numerical model of the atmosphere used at the UK Met Office (and numerous partner organisations, including the Korean Meteorological Agency, the Australian Bureau of Meteorology, and the US Air Force) for both weather and climate applications. Specifically, dynamical models such as the Unified Model are now a central part of weather forecasting. Starting from basic physical laws, these models make it possible to predict events such as storms before they have even begun to form. The Unified Model can be simply described as having two components: one component solves the Navier-Stokes equations (usually referred to as the "dynamics"); the other solves relevant sub-grid physical processes (usually referred to as the "physics"). Running weather forecasts requires substantial computing resources - for example, the UK Met Office operates the largest operational High Performance Computer in Europe - and roughly 50% of the cost of a typical simulation is spent in the "dynamics" and 50% in the "physics". There is therefore a strong incentive to reduce the cost of weather forecasts, and Machine Learning is a possible option because, once a machine learning model has been trained, it is often much faster to run than a full simulation. This is the motivation for a technique called model emulation: the idea is to build a fast statistical model that closely approximates a far more expensive simulation. In this paper we discuss the use of Machine Learning as an emulator to replace the "physics" component of the Unified Model. Various approaches and options will be presented, and the implications for further model development, operational running of forecasting systems, development of data assimilation schemes, and development of ensemble prediction techniques will be discussed.
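    The emulation idea can be sketched without any real model code (here a stand-in function plays the role of an expensive parameterization, and a cheap polynomial plays the role of the trained emulator; everything in the sketch is an illustrative assumption, not the Met Office approach):

```python
import numpy as np

def expensive_physics(x):
    """Stand-in for a costly sub-grid parameterization (hypothetical):
    maps a scalar column state to a tendency via a nonlinear function."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Train a cheap emulator on samples of the expensive scheme ...
rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, 500)
coeffs = np.polyfit(x_train, expensive_physics(x_train), deg=7)

# ... then check how closely the emulator reproduces it on held-out inputs.
x_test = np.linspace(-1, 1, 100)
err = np.max(np.abs(np.polyval(coeffs, x_test) - expensive_physics(x_test)))
print(round(float(err), 4))
```

    In practice the emulator would be a neural network or similar, mapping whole model columns to tendencies, but the train-then-substitute workflow is the same.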

  16. A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.

    PubMed

    Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao

    2017-06-16

    This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability of addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson-, exponential-, and power-law-distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
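    The power-law regime of such a growth process can be sketched with a degree-proportional attachment rule (an illustrative special case, not the paper's general Markov chain; choosing targets uniformly instead would land in the exponential-type regime):

```python
import random

def grow_network(n, rng):
    """Grow a network one node at a time, viewed as a Markov chain on degree
    counts: each new node links to an existing node chosen with probability
    proportional to its degree (preferential attachment)."""
    degrees = [1, 1]   # start from a single edge between nodes 0 and 1
    targets = [0, 1]   # node ids repeated with multiplicity = degree
    for new in range(2, n):
        old = rng.choice(targets)  # degree-proportional choice
        degrees[old] += 1
        degrees.append(1)
        targets += [old, new]
    return degrees

rng = random.Random(42)
deg = grow_network(5000, rng)
# Heavy tail: the largest degree far exceeds the mean degree of ~2.
print(max(deg), sum(deg) // len(deg))
```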

  17. What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?

    ERIC Educational Resources Information Center

    Cushion, Steve

    2006-01-01

    We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…

  18. A unified 3D default space consciousness model combining neurological and physiological processes that underlie conscious experience

    PubMed Central

    Jerath, Ravinder; Crawford, Molly W.; Barnes, Vernon A.

    2015-01-01

    The Global Workspace Theory and Information Integration Theory are two of the most currently accepted consciousness models; however, these models do not address many aspects of conscious experience. We compare these models to our previously proposed consciousness model in which the thalamus fills-in processed sensory information from corticothalamic feedback loops within a proposed 3D default space, resulting in the recreation of the internal and external worlds within the mind. This 3D default space is composed of all cells of the body, which communicate via gap junctions and electrical potentials to create this unified space. We use 3D illustrations to explain how both visual and non-visual sensory information may be filled-in within this dynamic space, creating a unified seamless conscious experience. This neural sensory memory space is likely generated by baseline neural oscillatory activity from the default mode network, other salient networks, brainstem, and reticular activating system. PMID:26379573

  19. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    NASA Astrophysics Data System (ADS)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper is focused on carrying out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring model's findings in order to select an appropriate requirements negotiation model. Finally, the results are presented with the help of statistical pie charts. On the basis of these results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiations and stakeholder collaborations is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI and opportunity analysis.
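    The first-tier weighted scoring model can be sketched in a few lines (the criteria, weights, scores, and candidate model names are invented for the example and are not taken from the paper):

```python
def weighted_score(scores, weights):
    """Tier-one weighted scoring: each candidate gets the sum of its
    criterion scores multiplied by the criterion weights (weights sum to 1)."""
    return {m: sum(s[c] * weights[c] for c in weights) for m, s in scores.items()}

# Hypothetical criteria and candidate requirements-negotiation models
weights = {"stakeholder_support": 0.5, "tool_support": 0.3, "cost": 0.2}
scores = {
    "WinWin":     {"stakeholder_support": 9, "tool_support": 7, "cost": 5},
    "EasyWinWin": {"stakeholder_support": 8, "tool_support": 9, "cost": 6},
}
ranked = weighted_score(scores, weights)
best = max(ranked, key=ranked.get)
print(best, ranked[best])
```

    The second tier would then build a SWOT matrix only for the top-scoring candidates.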

  20. A unified architecture for biomedical search engines based on semantic web technologies.

    PubMed

    Jalali, Vahid; Matash Borujerdi, Mohammad Reza

    2011-04-01

    There has been huge growth in the volume of published biomedical research in recent years. Many medical search engines have been designed and developed to address the ever-growing information needs of biomedical experts and curators. Significant progress has been made in utilizing the knowledge embedded in medical ontologies and controlled vocabularies to assist these engines. However, the lack of a common architecture for the ontologies used and for the overall retrieval process hampers the evaluation of different search engines, and interoperability between them, under unified conditions. In this paper, a unified architecture for medical search engines is introduced. The proposed model contains standard schemas, declared in semantic web languages, for the ontologies and documents used by search engines. Unified models for the annotation and retrieval processes are further parts of the introduced architecture. A sample search engine is also designed and implemented based on the proposed architecture. The search engine is evaluated using two test collections, and results are reported in terms of precision vs. recall and mean average precision for the different approaches used by this search engine.
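    The mean-average-precision evaluation mentioned above can be sketched directly (the document IDs and queries are invented for the example):

```python
def average_precision(ranked, relevant):
    """Average precision for one query: mean of precision@k over the ranks k
    at which a relevant document is retrieved, divided over all relevant docs."""
    hits, precisions = 0, []
    for k, doc in enumerate(ranked, start=1):
        if doc in relevant:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / len(relevant) if relevant else 0.0

def mean_average_precision(results):
    """MAP over a test collection: mean of per-query average precision."""
    return sum(average_precision(r, rel) for r, rel in results) / len(results)

# Hypothetical two-query collection: (ranked result list, relevant set)
results = [
    (["d1", "d3", "d2"], {"d1", "d2"}),  # AP = (1/1 + 2/3) / 2 = 5/6
    (["d9", "d4"], {"d4"}),              # AP = (1/2) / 1 = 1/2
]
print(round(mean_average_precision(results), 3))  # (5/6 + 1/2) / 2 -> 0.667
```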

  1. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study heavy precipitation processes will be presented.

  2. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    NASA Astrophysics Data System (ADS)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), and (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study precipitation processes, and their sensitivity to model resolution and microphysics schemes, will be presented. The use of the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  3. Unified Models of Turbulence and Nonlinear Wave Evolution in the Extended Solar Corona and Solar Wind

    NASA Technical Reports Server (NTRS)

    Cranmer, Steven R.; Wagner, William (Technical Monitor)

    2004-01-01

    The PI (Cranmer) and Co-I (A. van Ballegooijen) made substantial progress toward the goal of producing a unified model of the basic physical processes responsible for solar wind acceleration. The approach outlined in the original proposal comprised two complementary pieces: (1) further investigation of individual physical processes under realistic coronal and solar wind conditions, and (2) extraction of the dominant physical effects from simulations and their application to a 1D model of plasma heating and acceleration. The accomplishments in Year 2 fall into these categories: (1a) a focused study of kinetic magnetohydrodynamic (MHD) turbulence; (1b) a focused study of non-WKB Alfven wave reflection; and (2) the unified model code. We have continued the development of the computational model of a time-steady open flux tube in the extended corona. The proton-electron Monte Carlo model is being tested, and collisionless wave-particle interactions are being included. In order to better understand how to easily incorporate various kinds of wave-particle processes into the code, the PI performed a detailed study of the so-called "Ito calculus", i.e., the mathematical theory of how to update the positions of particles in a probabilistic manner when their motions are governed by diffusion in velocity space.
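    The probabilistic particle update for velocity-space diffusion can be sketched with an Euler-Maruyama step (a minimal illustration with a constant diffusion coefficient; the kinetic model described above is far more elaborate, and all names and values here are assumptions):

```python
import math
import random

def diffuse_velocities(v, D, dt, steps, rng):
    """Euler-Maruyama update for particles whose motion is governed by
    diffusion in velocity space: dv = sqrt(2*D) dW, with constant D.
    This is the stochastic update scheme that the Ito calculus justifies."""
    for _ in range(steps):
        v = [vi + math.sqrt(2 * D * dt) * rng.gauss(0, 1) for vi in v]
    return v

rng = random.Random(1)
v = diffuse_velocities([0.0] * 2000, D=0.5, dt=0.01, steps=100, rng=rng)
var = sum(vi * vi for vi in v) / len(v)
# Theory predicts the velocity variance grows as 2*D*t = 2 * 0.5 * 1.0 = 1.0
print(round(var, 2))
```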

  4. Unified-theory-of-reinforcement neural networks do not simulate the blocking effect.

    PubMed

    Calvin, Nicholas T; McDowell, J J

    2015-11-01

    For the last 20 years the unified theory of reinforcement (Donahoe et al., 1993) has been used to develop computer simulations to evaluate its plausibility as an account of behavior. The unified theory of reinforcement states that operant and respondent learning occur via the same neural mechanisms. As part of a larger project to evaluate the operant behavior predicted by the theory, this project was the first replication of neural network models based on the unified theory of reinforcement. In the process of replicating these neural network models it became apparent that a previously published finding, namely, that the networks simulate the blocking phenomenon (Donahoe et al., 1993), was a misinterpretation of the data. We show that the apparent blocking produced by these networks is an artifact of their inability to generate the same conditioned response to multiple stimuli. The piecemeal approach to evaluating the unified theory of reinforcement via simulation is critiqued and alternatives are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
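    For reference, the blocking effect itself is reproduced by the classical Rescorla-Wagner model, shown here purely as a point of comparison (this is not the unified-theory network the paper replicates, and the learning-rate values are illustrative):

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Rescorla-Wagner associative learning: all cues present on a trial
    share one prediction error, so a cue that already predicts the outcome
    'blocks' learning about any cue added later."""
    V = {}
    for cues in trials:
        error = lam - sum(V.get(c, 0.0) for c in cues)
        for c in cues:
            V[c] = V.get(c, 0.0) + alpha * error
    return V

# Phase 1: cue A alone is paired with the outcome.
# Phase 2: cues A and B are presented together.
V = rescorla_wagner([("A",)] * 30 + [("A", "B")] * 30)
print(round(V["A"], 2), round(V["B"], 2))  # B stays weak: A blocks it
```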

  5. Addressing Learning Style Criticism: The Unified Learning Style Model Revisited

    NASA Astrophysics Data System (ADS)

    Popescu, Elvira

    Learning style is one of the individual differences that play an important but controversial role in the learning process. This paper aims to provide a critical analysis of learning styles and their use in technology enhanced learning. The identified criticisms are addressed by reappraising the so-called Unified Learning Style Model (ULSM). A detailed description of the ULSM components is provided, together with their rationale. The practical applicability of the model in adaptive web-based educational systems and its advantages over traditional learning style models are also outlined.

  6. Toward a unifying framework for evolutionary processes.

    PubMed

    Paixão, Tiago; Badkobeh, Golnaz; Barton, Nick; Çörüş, Doğan; Dang, Duc-Cuong; Friedrich, Tobias; Lehre, Per Kristian; Sudholt, Dirk; Sutton, Andrew M; Trubenová, Barbora

    2015-10-21

    The theory of population genetics and the field of evolutionary computation have been evolving separately for nearly 30 years. Many results have been obtained independently in both fields, and many others are unique to their respective fields. We aim to bridge this gap by developing a unifying framework for evolutionary processes that allows both evolutionary algorithms and population genetics models to be cast in the same formal framework. The framework presented here decomposes the evolutionary process into its several components in order to facilitate the identification of similarities between different models. In particular, we propose a classification of evolutionary operators based on the defining properties of the different components. We cast several commonly used operators from both fields into this common framework. Using this, we map different evolutionary and genetic algorithms to different evolutionary regimes and identify the candidates with the most potential for translating results between the fields. This provides a unified description of evolutionary processes and represents a stepping stone towards new tools and results for both fields. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
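    The decomposition into component operators can be sketched as a common loop into which different selection and variation operators plug (the particular instantiation below, truncation selection with bit-flip mutation on a OneMax fitness, is an illustrative assumption, not the paper's classification):

```python
import random

def evolve(pop, fitness, select, vary, generations, rng):
    """A minimal common framework: an evolutionary process decomposed into
    a selection operator and a variation operator, so GA-style and
    population-genetics-style models run through the same loop."""
    for _ in range(generations):
        parents = select(pop, fitness, rng)
        pop = [vary(p, rng) for p in parents]
    return pop

def truncation_select(pop, fitness, rng):
    """Keep only the fittest individual as parent of the whole next generation."""
    best = max(pop, key=fitness)
    return [best] * len(pop)

def bitflip(x, rng):
    """Flip one uniformly chosen bit of a tuple genotype."""
    i = rng.randrange(len(x))
    return x[:i] + (1 - x[i],) + x[i + 1:]

rng = random.Random(0)
pop = [tuple(rng.randint(0, 1) for _ in range(20)) for _ in range(10)]
pop = evolve(pop, sum, truncation_select, bitflip, generations=200, rng=rng)
print(max(sum(x) for x in pop))  # best OneMax fitness climbs toward 20
```

    Swapping `truncation_select` for proportional (fitness-weighted) sampling would turn the same loop into a Wright-Fisher-style population genetics model.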

  7. Simulation of Aerosols and Chemistry with a Unified Global Model

    NASA Technical Reports Server (NTRS)

    Chin, Mian

    2004-01-01

    This project is to continue the development of the global simulation capabilities of tropospheric and stratospheric chemistry and aerosols in a unified global model. This is a part of our overall investigation of aerosol-chemistry-climate interaction. In the past year, we have enabled the tropospheric chemistry simulations based on the GEOS-CHEM model, and added stratospheric chemical reactions into the GEOS-CHEM such that a globally unified troposphere-stratosphere chemistry and transport can be simulated consistently without any simplifications. The tropospheric chemical mechanism in the GEOS-CHEM includes 80 species and 150 reactions. Twenty-four tracers are transported, including O3, NOx, total nitrogen (NOy), H2O2, CO, and several types of hydrocarbon. The chemical solver used in the GEOS-CHEM model is a highly accurate sparse-matrix vectorized Gear solver (SMVGEAR). The stratospheric chemical mechanism includes approximately 100 additional reactions and photolysis processes. Because of the large number of total chemical reactions and photolysis processes and the very different photochemical regimes involved in the unified simulation, the model demands significant computer resources that are currently not practical. Therefore, several improvements will be pursued, such as massive parallelization, code optimization, and selection of a faster solver. We have also continued aerosol simulation (including sulfate, dust, black carbon, organic carbon, and sea-salt) in the global model to cover most of year 2002. These results have been made available to many groups worldwide and accessible from the website http://code916.gsfc.nasa.gov/People/Chin/aot.html.

  8. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    DTIC Science & Technology

    2014-06-01

    BOM Base Object Model; BPMN Business Process Model & Notation; DOD...SysML. There are many variants such as the Unified Profile for DODAF/MODAF (UPDM) and Business Process Model & Notation (BPMN) that have origins in

  9. Unifying practice schedules in the timescales of motor learning and performance.

    PubMed

    Verhoeven, F Martijn; Newell, Karl M

    2018-06-01

    In this article, we elaborate from a multiple time scales model of motor learning to examine the independent and integrated effects of massed and distributed practice schedules within- and between-sessions on the persistent (learning) and transient (warm-up, fatigue) processes of performance change. The timescales framework reveals the influence of practice distribution on four learning-related processes: the persistent processes of learning and forgetting, and the transient processes of warm-up decrement and fatigue. The superposition of the different processes of practice leads to a unified set of effects for massed and distributed practice within- and between-sessions in learning motor tasks. This analysis of the interaction between the duration of the interval of practice trials or sessions and parameters of the introduced time scale model captures the unified influence of the between trial and session scheduling of practice on learning and performance. It provides a starting point for new theoretically based hypotheses, and the scheduling of practice that minimizes the negative effects of warm-up decrement, fatigue and forgetting while exploiting the positive effects of learning and retention. Copyright © 2018 Elsevier B.V. All rights reserved.
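
    The superposition of persistent and transient processes described above can be sketched numerically. The functional forms and parameter values below are illustrative assumptions for exposition, not the fitted timescale model from the article.

```python
import math

def performance(t, s):
    """Illustrative performance at trial t of a session that began at trial s:
    a persistent learning gain plus two transient within-session terms
    (warm-up decrement and fatigue), each evolving on its own time scale.
    All time constants and amplitudes are made-up illustrative values."""
    learning = 1.0 - math.exp(-t / 50.0)               # slow, persistent gain
    warm_up = -0.3 * math.exp(-(t - s) / 3.0)          # fast transient decrement
    fatigue = -0.2 * (1 - math.exp(-(t - s) / 40.0))   # slow within-session loss
    return learning + warm_up + fatigue
```

    Varying the gap between sessions shifts where each trial falls on these curves, which is how practice distribution trades off the four processes against one another.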

  10. unmarked: An R package for fitting hierarchical models of wildlife occurrence and abundance

    USGS Publications Warehouse

    Fiske, Ian J.; Chandler, Richard B.

    2011-01-01

    Ecological research uses data collection techniques that are prone to substantial and unique types of measurement error to address scientific questions about species abundance and distribution. These data collection schemes include a number of survey methods in which unmarked individuals are counted, or determined to be present, at spatially- referenced sites. Examples include site occupancy sampling, repeated counts, distance sampling, removal sampling, and double observer sampling. To appropriately analyze these data, hierarchical models have been developed to separately model explanatory variables of both a latent abundance or occurrence process and a conditional detection process. Because these models have a straightforward interpretation paralleling mechanisms under which the data arose, they have recently gained immense popularity. The common hierarchical structure of these models is well-suited for a unified modeling interface. The R package unmarked provides such a unified modeling framework, including tools for data exploration, model fitting, model criticism, post-hoc analysis, and model comparison.
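
    The hierarchical structure that unmarked exploits can be sketched outside R as well. Below is a minimal Python illustration (not the unmarked package itself) of the simplest case, the site-occupancy model: a latent occurrence state with probability psi, a conditional detection probability p, and the latent state marginalized out of the site likelihood.

```python
import math

def site_likelihood(y, J, psi, p):
    """Likelihood of observing y detections in J repeat visits to one site
    under the standard occupancy model: latent state z ~ Bernoulli(psi),
    and given z = 1, detections y ~ Binomial(J, p). The latent z is
    summed out, so y = 0 can arise from absence or from non-detection."""
    binom = math.comb(J, y) * p**y * (1 - p)**(J - y)
    occupied = psi * binom
    unoccupied = (1 - psi) * (1.0 if y == 0 else 0.0)
    return occupied + unoccupied
```

    Summing over y = 0..J returns 1, and fitting proceeds by maximizing the product of such terms over sites, with psi and p modeled as functions of site and survey covariates.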

  11. Mission Assurance in a Distributed Environment

    DTIC Science & Technology

    2009-06-01

    Notation (BPMN) – Graphical representation of business processes in a workflow • Unified Modeling Language (UML) – Use standard UML diagrams to model the system – Component, sequence, activity diagrams

  12. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  13. A Model of the Creative Process Based on Quantum Physics and Vedic Science.

    ERIC Educational Resources Information Center

    Rose, Laura Hall

    1988-01-01

    Using tenets from Vedic science and quantum physics, this model of the creative process suggests that the unified field of creation is pure consciousness, and that the development of the creative process within individuals mirrors the creative process within the universe. Rational and supra-rational creative thinking techniques are also described.…

  14. The Layer-Based, Pragmatic Model of the Communication Process.

    ERIC Educational Resources Information Center

    Targowski, Andrew S.; Bowman, Joel P.

    1988-01-01

    Presents the Targowski/Bowman model of the communication process, which introduces a new paradigm that isolates the various components for individual measurement and analysis, places these components into a unified whole, and places communication and its business component into a larger cultural context. (MM)

  15. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.; hide

    2008-01-01

    Numerical cloud resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional-scale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through utilizing the Earth Satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF); a state-of-the-art weather research forecast model (WRF) and a cloud-resolving model (Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. 
In addition, a comprehensive unified Earth Satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.

  16. Conceptualising paediatric health disparities: a metanarrative systematic review and unified conceptual framework.

    PubMed

    Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J

    2017-08-04

    There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into a unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Trichotomous processes in early memory development, aging, and neurocognitive impairment: a unified theory.

    PubMed

    Brainerd, C J; Reyna, V F; Howe, M L

    2009-10-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and familiarity with a reconstruction process. The theory is then embedded in a hidden Markov model that measures all 3 processes with low-burden tasks that are appropriate for even young children. These techniques are applied to a large corpus of developmental studies of recall, yielding stable findings about the emergence of dual memory processes between childhood and young adulthood and generating tests of many theoretical predictions. The techniques are extended to the study of healthy aging and to the memory sequelae of common forms of neurocognitive impairment, resulting in a theoretical framework that is unified over 4 major domains of memory research: early development, mainstream adult research, aging, and neurocognitive impairment. The techniques are also extended to recognition, creating a unified dual process framework for recall and recognition.
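
    The theory above is embedded in a hidden Markov model whose latent states correspond to memory processes. As a generic illustration of that machinery (not the authors' specific model or parameters), the forward algorithm below computes the probability of an observed response sequence given latent-state dynamics.

```python
def forward_prob(obs, init, trans, emit):
    """Probability of an observation sequence under a hidden Markov model,
    via the forward algorithm. init[s] is the initial probability of latent
    state s, trans[r][s] the transition probability r -> s, and emit[s][o]
    the probability that state s emits observation o. In a memory model the
    states might stand for processes such as recollection, familiarity,
    and reconstruction; here they are left abstract."""
    n = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[r] * trans[r][s] for r in range(n)) * emit[s][o]
                 for s in range(n)]
    return sum(alpha)
```

    Fitting such a model to recall data amounts to choosing the transition and emission parameters that maximize this probability over all observed response sequences.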

  18. A CPT for Improving Turbulence and Cloud Processes in the NCEP Global Models

    NASA Astrophysics Data System (ADS)

    Krueger, S. K.; Moorthi, S.; Randall, D. A.; Pincus, R.; Bogenschutz, P.; Belochitski, A.; Chikira, M.; Dazlich, D. A.; Swales, D. J.; Thakur, P. K.; Yang, F.; Cheng, A.

    2016-12-01

    Our Climate Process Team (CPT) is based on the premise that the NCEP (National Centers for Environmental Prediction) global models can be improved by installing an integrated, self-consistent description of turbulence, clouds, deep convection, and the interactions between clouds and radiative and microphysical processes. The goal of our CPT is to unify the representation of turbulence and subgrid-scale (SGS) cloud processes and to unify the representation of SGS deep convective precipitation and grid-scale precipitation as the horizontal resolution decreases. We aim to improve the representation of small-scale phenomena by implementing a PDF-based SGS turbulence and cloudiness scheme that replaces the boundary layer turbulence scheme, the shallow convection scheme, and the cloud fraction schemes in the GFS (Global Forecast System) and CFS (Climate Forecast System) global models. We intend to improve the treatment of deep convection by introducing a unified parameterization that scales continuously between the simulation of individual clouds when and where the grid spacing is sufficiently fine and the behavior of a conventional parameterization of deep convection when and where the grid spacing is coarse. We will endeavor to improve the representation of the interactions of clouds, radiation, and microphysics in the GFS/CFS by using the additional information provided by the PDF-based SGS cloud scheme. The team is evaluating the impacts of the model upgrades with metrics used by the NCEP short-range and seasonal forecast operations.

  19. Unified Models of Turbulence and Nonlinear Wave Evolution in the Extended Solar Corona and Solar Wind

    NASA Technical Reports Server (NTRS)

    Wagner, William (Technical Monitor); Cranmer, Steven R.

    2005-01-01

    The paper discusses the following: 1. No-cost Extension. The no-cost extension is required to complete the work on the unified model codes (both hydrodynamic and kinetic Monte Carlo) as described in the initial proposal and previous annual reports. 2. Scientific Accomplishments during the Report Period. We completed a comprehensive model of Alfvén wave reflection that spans the full distance from the photosphere to the distant heliosphere. 3. Comparison of Accomplishments with Proposed Goals. The proposal contained two specific objectives for Year 3: (1) to complete the unified model code, and (2) to apply it to various kinds of coronal holes (and polar plumes within coronal holes). Although the anticipated route toward these two final goals has changed (see accomplishments 2a and 2b above), they remain the major milestones for the extended period of performance. Accomplishments 1a and 1c were necessary prerequisites for the derivation of "physically relevant transport and mode-coupling terms" for the unified model codes (as stated in the proposal Year 3 goals). We have fulfilled the proposed "core work" to study 4 general types of physical processes; in previous years we studied turbulence, mode coupling (i.e., non-WKB reflection), and kinetic wave damping, and accomplishment 1b provides the fourth topic: nonlinear steepening.

  20. AN APPROACH TO A UNIFIED PROCESS-BASED REGIONAL EMISSION FLUX MODELING PLATFORM

    EPA Science Inventory

    The trend towards episodic modeling of environmentally-dependent emissions is increasing, with models available or under development for dust, ammonia, biogenic volatile organic compounds, soil nitrous oxide, pesticides, sea salt and chloride, mercury, and wildfire emissions. T...

  1. The Roles of First Language and Proficiency in L2 Processing of Spanish Clitics: Global Effects

    ERIC Educational Resources Information Center

    Seibert Hanson, Aroline E.; Carlson, Matthew T.

    2014-01-01

    We assessed the roles of first language (L1) and second language (L2) proficiency in the processing of preverbal clitics in L2 Spanish by considering the predictions of four processing theories--the Input Processing Theory, the Unified Competition Model, the Amalgamation Model, and the Associative-Cognitive CREED. We compared the performance of L1…

  2. Memory and cognitive control in an integrated theory of language processing.

    PubMed

    Slevc, L Robert; Novick, Jared M

    2013-08-01

    Pickering & Garrod's (P&G's) integrated model of production and comprehension includes no explicit role for nonlinguistic cognitive processes. Yet, how domain-general cognitive functions contribute to language processing has become clearer with well-specified theories and supporting data. We therefore believe that their account can benefit by incorporating functions like working memory and cognitive control into a unified model of language processing.

  3. A unified account of tilt illusions, association fields, and contour detection based on elastica.

    PubMed

    Keemink, Sander W; van Rossum, Mark C W

    2016-09-01

    As expressed in the Gestalt law of good continuation, human perception tends to associate stimuli that form smooth continuations. Contextual modulation in primary visual cortex, in the form of association fields, is believed to play an important role in this process. Yet a unified and principled account of the good continuation law on the neural level is lacking. In this study we introduce a population model of primary visual cortex. Its contextual interactions depend on the elastica curvature energy of the smoothest contour connecting oriented bars. As expected, this model leads to association fields consistent with data. In addition, the model displays tilt illusions for stimulus configurations with gratings and single bars that closely match psychophysics. Furthermore, the model explains not only pop-out of contours amid a variety of backgrounds, but also pop-out of single targets amid a uniform background. We thus propose that elastica is a unifying principle of the visual cortical network. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
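
    The elastica criterion itself is the curvature energy E = ∫ κ² ds of the connecting contour. The discretization below is a numerical sketch for a sampled curve, offered only to make the criterion concrete; it is not the population model from the study.

```python
import math

def elastica_energy(points):
    """Approximate the elastica energy E = integral of kappa^2 ds for a curve
    given as a list of (x, y) sample points, using discrete turning angles:
    kappa ~= dtheta / ds at each interior point."""
    E = 0.0
    for i in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[i - 1], points[i], points[i + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # Wrap the turning angle into (-pi, pi] to avoid branch-cut jumps.
        dtheta = math.atan2(math.sin(a2 - a1), math.cos(a2 - a1))
        ds = 0.5 * (math.hypot(x1 - x0, y1 - y0) + math.hypot(x2 - x1, y2 - y1))
        E += (dtheta / ds) ** 2 * ds
    return E
```

    A straight line has zero energy, and a finely sampled unit circle approaches 2π, matching κ = 1/R; smoother continuations between two oriented bars thus carry lower energy and, in the model, stronger association.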

  4. Reconceptualizing Social Influence in Counseling: The Elaboration Likelihood Model.

    ERIC Educational Resources Information Center

    McNeill, Brian W.; Stoltenberg, Cal D.

    1989-01-01

    Presents Elaboration Likelihood Model (ELM) of persuasion (a reconceptualization of the social influence process) as alternative model of attitude change. Contends ELM unifies conflicting social psychology results and can potentially account for inconsistent research findings in counseling psychology. Provides guidelines on integrating…

  5. WICS: A New Model for School Psychology

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    2010-01-01

    This article presents a unified model for cognitive processing, WICS, which is an acronym for wisdom, intelligence, and creativity, synthesized. The model can be applied to identification/admissions, diagnosis, instruction, and assessment. I discuss why there is a need for such a model. Then I describe traditional models, after which I describe…

  6. A UML model for the description of different brain-computer interface systems.

    PubMed

    Quitadamo, Lucia Rita; Abbafati, Manuel; Saggio, Giovanni; Marciani, Maria Grazia; Cardarilli, Gian Carlo; Bianchi, Luigi

    2008-01-01

    BCI research lacks a universal descriptive language among labs and a unique standard model for the description of BCI systems. This results in a serious problem in comparing performances of different BCI processes and in unifying tools and resources. In such a view we implemented a Unified Modeling Language (UML) model for the description of virtually any BCI protocol and we demonstrated that it can be successfully applied to the most common ones such as P300, mu-rhythms, SCP, SSVEP, fMRI. Finally we illustrated the advantages in utilizing a standard terminology for BCIs and how the same basic structure can be successfully adopted for the implementation of new systems.

  7. Toward a unified account of comprehension and production in language development.

    PubMed

    McCauley, Stewart M; Christiansen, Morten H

    2013-08-01

    Although Pickering & Garrod (P&G) argue convincingly for a unified system for language comprehension and production, they fail to explain how such a system might develop. Using a recent computational model of language acquisition as an example, we sketch a developmental perspective on the integration of comprehension and production. We conclude that only through development can we fully understand the intertwined nature of comprehension and production in adult processing.

  8. Wind Sensing, Analysis, and Modeling

    NASA Technical Reports Server (NTRS)

    Corvin, Michael A.

    1995-01-01

    The purpose of this task was to begin development of a unified approach to the sensing, analysis, and modeling of the wind environments in which launch systems operate. The initial activity was to examine the current usage and requirements for wind modeling for the Titan 4 vehicle. This was to be followed by joint technical efforts with NASA Langley Research Center to develop applicable analysis methods. This work was to be performed in and demonstrate the use of prototype tools implementing an environment in which to realize a unified system. At the convenience of the customer, due to resource limitations, the task was descoped. The survey of Titan 4 processes was accomplished and is reported in this document. A summary of general requirements is provided. Current versions of prototype Process Management Environment tools are being provided to the customer.

  10. Unified Approximations: A New Approach for Monoprotic Weak Acid-Base Equilibria

    ERIC Educational Resources Information Center

    Pardue, Harry; Odeh, Ihab N.; Tesfai, Teweldemedhin M.

    2004-01-01

    The unified approximations reduce the conceptual complexity by combining solutions for a relatively large number of different situations into just two similar sets of processes. Processes used to solve problems by either the unified or classical approximations require similar degrees of understanding of the underlying chemical processes.
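
    As a concrete instance of the classical treatment the article compares against, the pH of a monoprotic weak acid follows from a single quadratic in [H+]. This is a textbook sketch for illustration; the article's unified approximations themselves are not reproduced here.

```python
import math

def weak_acid_pH(C, Ka):
    """pH of a monoprotic weak acid HA at analytical concentration C (mol/L)
    with dissociation constant Ka, from the classical mass-action quadratic
    Ka = x^2 / (C - x), where x = [H+]. Water autoionization is neglected,
    so the result is valid for solutions that are not extremely dilute."""
    x = (-Ka + math.sqrt(Ka * Ka + 4 * Ka * C)) / 2  # positive root = [H+]
    return -math.log10(x)

# 0.10 M acetic acid (Ka ~ 1.8e-5) gives pH close to 2.88.
```

    The same function reproduces the strong-acid limit as Ka grows large, where the pH approaches -log10(C), which is one of the limiting cases a unified scheme must cover.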

  11. Unifying the field: developing an integrative paradigm for behavior therapy.

    PubMed

    Eifert, G H; Forsyth, J P; Schauss, S L

    1993-06-01

    The limitations of early conditioning models and treatments have led many behavior therapists to abandon conditioning principles and replace them with loosely defined cognitive theories and treatments. Systematic theory extensions to human behavior, using new concepts and processes derived from and built upon the basic principles, could have prevented the divisive debates over whether psychological dysfunctions are the results of conditioning or cognition and whether they should be treated with conditioning or cognitive techniques. Behavior therapy could also benefit from recent advances in experimental cognitive psychology that provide objective behavioral methods of studying dysfunctional processes. We suggest a unifying paradigm for explaining abnormal behavior that links and integrates different fields of study and processes frequently believed to be incompatible or antithetical, such as biological vulnerability variables and learned behavioral repertoires, and that also links historical and current antecedents of the problem. An integrative paradigmatic behavioral approach may serve a unifying function in behavior therapy (a) by promoting an understanding of the dysfunctional processes involved in different disorders and (b) by helping clinicians conduct functional analyses that lead to theory-based, individualized, and effective treatments.

  12. Design Of Computer Based Test Using The Unified Modeling Language

    NASA Astrophysics Data System (ADS)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN), and the independent route (UM-Polbeng) was conducted using a Paper-Based Test (PBT). The Paper-Based Test model has several weaknesses: excessive paper use, leaking of questions to the public, and manipulation of test-result data. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), consisting of use case, activity, and sequence diagrams. During application design, particular attention was paid to protecting the test questions before display through encryption and decryption; the RSA cryptographic algorithm was used for this purpose. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle. The network architecture used for the Computer-Based Test application was a client-server model over a Local Area Network (LAN). The result of the design was a Computer-Based Test application for the admission selection of Politeknik Negeri Bengkalis.
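
    The Fisher-Yates shuffle mentioned in the abstract can be sketched in a few lines (a generic implementation for illustration, not the authors' code):

```python
import random

def fisher_yates_shuffle(items, rng=random.random):
    """In-place Fisher-Yates shuffle, the standard way to randomize the
    order of questions drawn from a bank: given a uniform random source,
    each of the n! orderings is equally likely."""
    for i in range(len(items) - 1, 0, -1):
        j = int(rng() * (i + 1))  # uniform index in [0, i]
        items[i], items[j] = items[j], items[i]
    return items
```

    Because each position is swapped with a uniformly chosen earlier position exactly once, the shuffle is unbiased, unlike naive approaches that repeatedly swap random pairs.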

  13. SENSITIVITY OF OZONE AND AEROSOL PREDICTIONS TO THE TRANSPORT ALGORITHMS IN THE MODELS-3 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    EPA Science Inventory

    EPA's Models-3 CMAQ system is intended to provide a community modeling paradigm that allows continuous improvement of the one-atmosphere modeling capability in a unified fashion. CMAQ's modular design promotes incorporation of several sets of science process modules representing ...

  14. Towards a unified theory of health-disease: II. Holopathogenesis

    PubMed Central

    Almeida-Filho, Naomar

    2014-01-01

    This article presents a systematic framework, termed Holopathogenesis, for modeling several classes of illness-sickness-disease. Holopathogenesis is defined as processes of over-determination of diseases and related conditions taken as a whole, comprising selected facets of the complex object Health. First, a conceptual background of Holopathogenesis is presented as a series of significant interfaces (biomolecular-immunological, physiopathological-clinical, epidemiological-ecosocial). Second, propositions derived from Holopathogenesis are introduced in order to allow drawing the disease-illness-sickness complex as a hierarchical network of networks. Third, a formalization of intra- and inter-level correspondences, over-determination processes, effects and links of Holopathogenesis models is proposed. Finally, the Holopathogenesis frame is evaluated as a comprehensive theoretical pathology taken as a preliminary step towards a unified theory of health-disease. PMID:24897040

  15. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
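
    The classical special case of this family, the multiplicative updates for squared (Gaussian) error, can be sketched with NumPy. The quasi-likelihood framework described above generalizes this family to other signal-dependent noise models via link functions; what follows is only the classical sketch, not the paper's algorithm.

```python
import numpy as np

def nmf(V, k, iters=2000, eps=1e-9, seed=0):
    """Factor a non-negative matrix V (m x n) as V ~= W @ H with W (m x k)
    and H (k x n) non-negative, using the classical multiplicative updates
    for squared error. Positive initialization plus multiplication by
    non-negative ratios keeps both factors non-negative throughout."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 0.1
    H = rng.random((k, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H with W fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W with H fixed
    return W, H
```

    On an exactly factorable non-negative matrix the reconstruction error shrinks toward zero; swapping the squared-error updates for divergence-based ones changes the assumed noise model, which is the degree of freedom the quasi-likelihood view makes explicit.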

  16. Customer-experienced rapid prototyping

    NASA Astrophysics Data System (ADS)

    Zhang, Lijuan; Zhang, Fu; Li, Anbo

    2008-12-01

    To describe GIS requirements accurately and comprehend them quickly, this article integrates the ideas of QFD (Quality Function Deployment) and UML (Unified Modeling Language), analyzes the deficiencies of the prototype development model, and proposes Customer-Experienced Rapid Prototyping (CE-RP), describing its process and framework in detail from the perspective of the characteristics of modern GIS. The CE-RP is mainly composed of Customer Tool-Sets (CTS), Developer Tool-Sets (DTS) and a Barrier-Free Semantic Interpreter (BF-SI), and is performed by the two roles of customer and developer. The main purpose of the CE-RP is to produce unified and authorized requirements data models shared between customer and software developer.

  17. The polyadenylation code: a unified model for the regulation of mRNA alternative polyadenylation*

    PubMed Central

    Davis, Ryan; Shi, Yongsheng

    2014-01-01

    The majority of eukaryotic genes produce multiple mRNA isoforms with distinct 3′ ends through a process called mRNA alternative polyadenylation (APA). Recent studies have demonstrated that APA is dynamically regulated during development and in response to environmental stimuli. A number of mechanisms have been described for APA regulation. In this review, we attempt to integrate all the known mechanisms into a unified model. This model not only explains most previous results, but also provides testable predictions that will improve our understanding of the mechanistic details of APA regulation. Finally, we briefly discuss the known and putative functions of APA regulation. PMID:24793760

  18. Unified connected theory of few-body reaction mechanisms in N-body scattering theory

    NASA Technical Reports Server (NTRS)

    Polyzou, W. N.; Redish, E. F.

    1978-01-01

    A unified treatment of different reaction mechanisms in nonrelativistic N-body scattering is presented. The theory is based on connected-kernel integral equations that are expected to become compact for reasonable constraints on the potentials. The operators T±^(ab)(A) are approximate transition operators that describe scattering proceeding through an arbitrary reaction mechanism A. These operators are uniquely determined by a connected-kernel equation and satisfy an optical theorem consistent with the choice of reaction mechanism. Connected-kernel equations relating T±^(ab)(A) to the full T±^(ab) allow the approximate solutions to be corrected for any ignored process to any order. This theory gives a unified treatment of all few-body reaction mechanisms with the dynamic simplicity of a model calculation, yet it can include complicated reaction mechanisms involving overlapping configurations for which it is difficult to formulate models.

  19. Acceptance and Commitment Therapy as a Unified Model of Behavior Change

    ERIC Educational Resources Information Center

    Hayes, Steven C.; Pistorello, Jacqueline; Levin, Michael E.

    2012-01-01

    The present article summarizes the assumptions, model, techniques, evidence, and diversity/social justice commitments of Acceptance and Commitment Therapy (ACT). ACT focused on six processes (acceptance, defusion, self, now, values, and action) that bear on a single overall target (psychological flexibility). The ACT model of behavior change has…

  20. A unified view on weakly correlated recurrent networks

    PubMed Central

    Grytskyy, Dmytro; Tetzlaff, Tom; Diesmann, Markus; Helias, Moritz

    2013-01-01

    The diversity of neuron models used in contemporary theoretical neuroscience to investigate specific properties of covariances in the spiking activity raises the question how these models relate to each other. In particular it is hard to distinguish between generic properties of covariances and peculiarities due to the abstracted model. Here we present a unified view on pairwise covariances in recurrent networks in the irregular regime. We consider the binary neuron model, the leaky integrate-and-fire (LIF) model, and the Hawkes process. We show that linear approximation maps each of these models to either of two classes of linear rate models (LRM), including the Ornstein–Uhlenbeck process (OUP) as a special case. The distinction between both classes is the location of additive noise in the rate dynamics, which is located on the output side for spiking models and on the input side for the binary model. Both classes allow closed form solutions for the covariance. For output noise it separates into an echo term and a term due to correlated input. The unified framework enables us to transfer results between models. For example, we generalize the binary model and the Hawkes process to the situation with synaptic conduction delays and simplify derivations for established results. Our approach is applicable to general network structures and suitable for the calculation of population averages. The derived averages are exact for fixed out-degree network architectures and approximate for fixed in-degree. We demonstrate how taking into account fluctuations in the linearization procedure increases the accuracy of the effective theory and we explain the class dependent differences between covariances in the time and the frequency domain. 
Finally we show that the oscillatory instability emerging in networks of LIF models with delayed inhibitory feedback is a model-invariant feature: the same structure of poles in the complex frequency plane determines the population power spectra. PMID:24151463

  1. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
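
    The core strategy described here (simulate a "parent" Gaussian process, then back-transform it to the target marginal) can be sketched in a few lines. This toy example, with illustrative names, maps a Gaussian AR(1) parent to an exponential marginal via the inverse-CDF transform; it omits the paper's parametric correlation transformation functions, which adjust the parent's autocorrelation so that the *target* process, not the parent, has the desired correlation structure.

    ```python
    import math
    import random

    def gauss_ar1(n, rho, seed=1):
        """Generate a stationary 'parent' Gaussian AR(1) series with unit
        variance and lag-1 correlation rho."""
        rnd = random.Random(seed)
        x = [rnd.gauss(0.0, 1.0)]
        s = math.sqrt(1.0 - rho * rho)  # innovation std keeps variance at 1
        for _ in range(n - 1):
            x.append(rho * x[-1] + s * rnd.gauss(0.0, 1.0))
        return x

    def to_exponential(x, scale=2.0):
        """Back-transform each Gaussian value to an exponential marginal:
        apply the standard normal CDF, then the exponential quantile function."""
        phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        return [-scale * math.log(1.0 - phi(z)) for z in x]

    parent = gauss_ar1(20000, rho=0.7)
    target = to_exponential(parent, scale=2.0)  # autocorrelated, exponential marginal
    mean = sum(target) / len(target)
    ```

    The same recipe works for any continuous target marginal by swapping in its quantile function; mixed-type marginals (e.g. intermittent precipitation with an atom at zero) use a quantile function with a point mass, exactly as the unified framework allows.
    
    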

  2. Inflation, reheating, and dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardenas, Victor H.

    2007-04-15

    In a recent paper, Liddle and Urena-Lopez suggested that a unified model of inflation and dark matter requires a reheating process in which part of the inflaton field survives. In this paper I propose a model where this is possible: incorporating the effect of plasma masses generated by the inflaton decay products makes it possible to stop the reheating process. A numerical estimate of the model is presented.

  3. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    A multi-scale modeling system with unified physics has been developed at NASA Goddard Space Flight Center (GSFC). The system consists of an MMF (the coupled NASA Goddard finite-volume GCM (fvGCM) and Goddard Cumulus Ensemble model (GCE, a CRM)), the state-of-the-art Weather Research and Forecasting model (WRF), and the stand-alone GCE. These models can share the same microphysical schemes, radiation (including explicitly calculated cloud optical properties), and surface models that have been developed, improved and tested for different environments. In this talk, I will present: (1) a brief review of the GCE model and its applications to the impact of aerosols on deep precipitation processes; (2) the Goddard MMF, the major differences between the two existing MMFs (the CSU MMF and the Goddard MMF), and preliminary results (a comparison with traditional GCMs); and (3) a discussion of the Goddard WRF version (its developments and applications). We are also performing inline tracer calculations to understand the physical processes (i.e., the boundary layer and each quadrant within it) related to the development and structure of hurricanes and mesoscale convective systems.

  4. Enterprise Pattern: integrating the business process into a unified enterprise model of modern service company

    NASA Astrophysics Data System (ADS)

    Li, Ying; Luo, Zhiling; Yin, Jianwei; Xu, Lida; Yin, Yuyu; Wu, Zhaohui

    2017-01-01

    The modern service company (MSC), an enterprise in special domains such as the financial industry, the information service industry and the technology development industry, depends heavily on information technology. Modelling such enterprises has attracted much research attention because it promises to help enterprise managers analyse basic business strategies (e.g. the pricing strategy) and even optimise the business process (BP) to gain benefits. While the existing models proposed by economists cover the economic elements, they fail to address the basic BP and its relationship with the economic characteristics. Those proposed in computer science, despite great success in BP modelling, perform poorly in supporting economic analysis. The existing approaches therefore fail to satisfy the requirements of enterprise modelling for the MSC, which demands simultaneous consideration of both economic analysis and business processing. In this article, we provide a unified enterprise modelling approach named Enterprise Pattern (EP) which bridges the gap between the BP model and the enterprise economic model of the MSC. We propose a language named Enterprise Pattern Description Language (EPDL) covering all the basic language elements of EP, formulate its syntax, and give two basic extraction rules assisting economic analysis. Furthermore, we extend Business Process Model and Notation (BPMN) to support EPDL, named BPMN for Enterprise Pattern (BPMN4EP). The example of a mobile application platform is studied in detail for a better understanding of EPDL.

  5. Development and application of unified algorithms for problems in computational science

    NASA Technical Reports Server (NTRS)

    Shankar, Vijaya; Chakravarthy, Sukumar

    1987-01-01

    A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm is one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, together with their application to two other areas: electromagnetic scattering, and laser-materials interaction accounting for melting.

  6. Photoionization and Recombination

    NASA Technical Reports Server (NTRS)

    Nahar, Sultana N.

    2000-01-01

    Theoretically self-consistent calculations for photoionization and (e + ion) recombination are described. The same eigenfunction expansion for the ion is employed in coupled channel calculations for both processes, thus ensuring consistency between cross sections and rates. The theoretical treatment of (e + ion) recombination subsumes both the non-resonant recombination ("radiative recombination") and the resonant recombination ("di-electronic recombination") processes in a unified scheme. In addition to the total unified recombination rates, level-specific recombination rates and photoionization cross sections are obtained for a large number of atomic levels. Both relativistic Breit-Pauli and non-relativistic LS coupling calculations are carried out in the close coupling approximation using the R-matrix method. Although the calculations are computationally intensive, they yield nearly all photoionization and recombination parameters needed for astrophysical photoionization models, with higher precision than hitherto possible, estimated at about 10-20% from comparison with experimentally available data (including experimentally derived DR rates). Results are electronically available for over 40 atoms and ions. Photoionization and recombination of He- and Li-like C and Fe are described for X-ray modeling. The unified method yields total and complete (e + ion) recombination rate coefficients that cannot otherwise be obtained theoretically or experimentally.

  7. DEVS Unified Process for Web-Centric Development and Testing of System of Systems

    DTIC Science & Technology

    2008-05-20

    gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications...[27] 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a...information is stored in .wsdl and .bpel files for BPEL but in proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of

  8. Hyporheic flow and transport processes: mechanisms, models, and biogeochemical implications

    USGS Publications Warehouse

    Boano, Fulvio; Harvey, Judson W.; Marion, Andrea; Packman, Aaron I.; Revelli, Roberto; Ridolfi, Luca; Anders, Wörman

    2014-01-01

    Fifty years of hyporheic zone research have shown the important role played by the hyporheic zone as an interface between groundwater and surface waters. However, it is only in the last two decades that what began as an empirical science has become a mechanistic science devoted to modeling studies of the complex fluid dynamical and biogeochemical mechanisms occurring in the hyporheic zone. These efforts have led to the picture of surface-subsurface water interactions as regulators of the form and function of fluvial ecosystems. Rather than being isolated systems, surface water bodies continuously interact with the subsurface. Exploration of hyporheic zone processes has led to a new appreciation of their wide reaching consequences for water quality and stream ecology. Modern research aims toward a unified approach, in which processes occurring in the hyporheic zone are key elements for the appreciation, management, and restoration of the whole river environment. In this unifying context, this review summarizes results from modeling studies and field observations about flow and transport processes in the hyporheic zone and describes the theories proposed in hydrology and fluid dynamics developed to quantitatively model and predict the hyporheic transport of water, heat, and dissolved and suspended compounds from sediment grain scale up to the watershed scale. The implications of these processes for stream biogeochemistry and ecology are also discussed.

  9. Hyporheic flow and transport processes: Mechanisms, models, and biogeochemical implications

    NASA Astrophysics Data System (ADS)

    Boano, F.; Harvey, J. W.; Marion, A.; Packman, A. I.; Revelli, R.; Ridolfi, L.; Wörman, A.

    2014-12-01

    Fifty years of hyporheic zone research have shown the important role played by the hyporheic zone as an interface between groundwater and surface waters. However, it is only in the last two decades that what began as an empirical science has become a mechanistic science devoted to modeling studies of the complex fluid dynamical and biogeochemical mechanisms occurring in the hyporheic zone. These efforts have led to the picture of surface-subsurface water interactions as regulators of the form and function of fluvial ecosystems. Rather than being isolated systems, surface water bodies continuously interact with the subsurface. Exploration of hyporheic zone processes has led to a new appreciation of their wide reaching consequences for water quality and stream ecology. Modern research aims toward a unified approach, in which processes occurring in the hyporheic zone are key elements for the appreciation, management, and restoration of the whole river environment. In this unifying context, this review summarizes results from modeling studies and field observations about flow and transport processes in the hyporheic zone and describes the theories proposed in hydrology and fluid dynamics developed to quantitatively model and predict the hyporheic transport of water, heat, and dissolved and suspended compounds from sediment grain scale up to the watershed scale. The implications of these processes for stream biogeochemistry and ecology are also discussed.

  10. Cognitive Affective Engagement Model of Multiple Source Use

    ERIC Educational Resources Information Center

    List, Alexandra; Alexander, Patricia A.

    2017-01-01

    This article introduces the cognitive affective engagement model (CAEM) of multiple source use. The CAEM is presented as a way of unifying cognitive and behaviorally focused models of multiple text engagement with research on the role of affective factors (e.g., interest) in text processing. The CAEM proposes that students' engagement with…

  11. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    ERIC Educational Resources Information Center

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  12. Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework

    NASA Astrophysics Data System (ADS)

    Hermawan; Hastarista, Fika

    2016-01-01

    Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts drawn from various UML models and Software Requirement Specification (SRS) documents. This modeling is applied to the development of an Enterprise Resource Planning (ERP) system. Because the ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is needed to transform the MDA into application-module components efficiently and accurately. Finally, through the use of the CAF, the needs of all stakeholders involved in the overall Rational Unified Process (RUP) stages were well fulfilled, and high satisfaction was obtained with the functional features of the ERP software at PT. Iglas (Persero) Gresik.

  13. A unified model for transfer alignment at random misalignment angles based on second-order EKF

    NASA Astrophysics Data System (ADS)

    Cui, Xiao; Mei, Chunbo; Qin, Yongyuan; Yan, Gongmin; Liu, Zhenbo

    2017-04-01

    In the transfer alignment process of inertial navigation systems (INSs), the conventional linear error model based on the small-misalignment-angle assumption cannot be applied to large misalignment situations, while the nonlinear model based on large misalignment angles suffers from redundant computation with nonlinear filters. This paper presents a unified model for transfer alignment suitable for arbitrary misalignment angles. The alignment problem is transformed into an estimation of the relative attitude between the master INS (MINS) and the slave INS (SINS) by decomposing the attitude matrix of the latter. Based on the Rodrigues parameters, a unified alignment model in the inertial frame, with a linear state-space equation and a second-order nonlinear measurement equation, is established without making any assumptions about the misalignment angles. Furthermore, we apply a Taylor series expansion to the second-order nonlinear measurement equation to implement the second-order extended Kalman filter (EKF2). Monte Carlo simulations demonstrate that the initial alignment can be completed within 10 s, with higher accuracy and much smaller computational cost than the traditional unscented Kalman filter (UKF) at large misalignment angles.

  14. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    PubMed

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  15. Unifying models of dialect spread and extinction using surface tension dynamics

    PubMed Central

    2018-01-01

    We provide a unified mathematical explanation of two classical forms of spatial linguistic spread. The wave model describes the radiation of linguistic change outwards from a central focus. Changes can also jump between population centres in a process known as hierarchical diffusion. It has recently been proposed that the spatial evolution of dialects can be understood using surface tension at linguistic boundaries. Here we show that the inclusion of long-range interactions in the surface tension model generates both wave-like spread, and hierarchical diffusion, and that it is surface tension that is the dominant effect in deciding the stable distribution of dialect patterns. We generalize the model to allow population mixing which can induce shrinkage of linguistic domains, or destroy dialect regions from within. PMID:29410847
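
    The surface-tension intuition can be sketched with a toy rule: each speaker adopts the variant used by the majority of a local neighbourhood, so jagged linguistic boundaries smooth out and isolated minority pockets are absorbed. The one-dimensional majority-rule automaton below is an illustration only, not the paper's model, which is spatial and includes the long-range interactions that generate hierarchical diffusion.

    ```python
    def majority_step(state, radius=2):
        """One synchronous update on a ring of binary 'dialect variants':
        each site adopts the majority variant in its neighbourhood
        (ties keep the current variant)."""
        n = len(state)
        new = []
        for i in range(n):
            window = [state[(i + d) % n] for d in range(-radius, radius + 1)]
            ones = sum(window)
            if ones * 2 > len(window):
                new.append(1)
            elif ones * 2 < len(window):
                new.append(0)
            else:
                new.append(state[i])
        return new

    # A jagged pattern with isolated speakers of the other variant:
    state = [0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
    for _ in range(5):
        state = majority_step(state)
    # The dynamics settle into two clean domains with flat boundaries.
    ```

    Adding long-range terms to the neighbourhood (weighting distant population centres) is what lets a change "jump" between cities in the hierarchical-diffusion regime.
    
    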

  16. Cortical and subcortical predictive dynamics and learning during perception, cognition, emotion and action

    PubMed Central

    Grossberg, Stephen

    2009-01-01

    An intimate link exists between the predictive and learning processes in the brain. Perceptual/cognitive and spatial/motor processes use complementary predictive mechanisms to learn, recognize, attend and plan about objects in the world, determine their current value, and act upon them. Recent neural models clarify these mechanisms and how they interact in cortical and subcortical brain regions. The present paper reviews and synthesizes data and models of these processes, and outlines a unified theory of predictive brain processing. PMID:19528003

  17. Unifying Theory of Low-Energy Nuclear Reaction and Transmutation Processes in Deuterated/hydrogenated Metals, Acoustic Cavitation, Glow Discharge, and Deuteron Beam Experiments

    NASA Astrophysics Data System (ADS)

    Kim, Yeong E.; Zubarev, Alexander L.

    The most basic theoretical challenge for understanding low-energy nuclear reactions (LENR) and transmutation reactions (LETR) in condensed matter is to find mechanisms by which the large Coulomb barrier between fusing nuclei can be overcome. A unifying theory of LENR and LETR has been developed to provide possible mechanisms for these processes in matter, based on high-density nano-scale and micro-scale quantum plasmas. It is shown that recently developed theoretical models based on the Bose-Einstein Fusion (BEF) mechanism and the Quantum Plasma Nuclear Fusion (QPNF) mechanism are applicable to the results of many different types of LENR and LETR experiments.

  18. An Abstract Process and Metrics Model for Evaluating Unified Command and Control: A Scenario and Technology Agnostic Approach

    DTIC Science & Technology

    2004-06-01

    EBO Cognitive or Memetic input type … Unanticipated EBO generated Memetic Effects Based COA … Policy … Belief systems or Memetic Content Metrics

  19. Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model.

    PubMed

    Wichary, Szymon; Smolen, Tomasz

    2016-01-01

    In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals.
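
    The two strategies that BUMSS arbitrates between can be sketched in a few lines. The cue weights and cue profiles below are hypothetical, chosen so that the two strategies disagree on the same choice; they are not drawn from the paper.

    ```python
    def weighted_additive(weights, cues_a, cues_b):
        """Rational WADD strategy: sum weighted cue values for each option
        and pick the option with the larger total."""
        sa = sum(w * c for w, c in zip(weights, cues_a))
        sb = sum(w * c for w, c in zip(weights, cues_b))
        return "A" if sa > sb else "B" if sb > sa else "tie"

    def take_the_best(validity_order, cues_a, cues_b):
        """Boundedly rational TTB heuristic: inspect cues in order of
        validity and decide on the first cue that discriminates."""
        for i in validity_order:
            if cues_a[i] != cues_b[i]:
                return "A" if cues_a[i] > cues_b[i] else "B"
        return "tie"

    # Binary cue profiles for two options; cues are ordered by decreasing
    # validity, and the weights mirror that ordering (hypothetical values).
    weights = [0.9, 0.7, 0.6, 0.5]
    a = [1, 0, 0, 0]
    b = [0, 1, 1, 1]
    ttb = take_the_best(range(4), a, b)       # decides on the single best cue
    wadd = weighted_additive(weights, a, b)   # integrates all four cues
    ```

    In BUMSS terms, gain modulation effectively flattens or sharpens the cue weights, shifting behavior between the WADD-like integration of all cues and the TTB-like reliance on the single most valid one.
    
    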

  20. Aerospace System Unified Life Cycle Engineering Producibility Measurement Issues

    DTIC Science & Technology

    1989-05-01

    Control … Cost … in the development process; these computer-aided models offer clarity approaching that of a prototype model. Once a part geometry is represented … of part geometry, allowing manufacturability evaluation and possibly other computer-integrated manufacturing (CIM) tasks. (Other papers that discuss

  1. Building Dynamic Conceptual Physics Understanding

    ERIC Educational Resources Information Center

    Trout, Charlotte; Sinex, Scott A.; Ragan, Susan

    2011-01-01

    Models are essential to the learning and doing of science, and systems thinking is key to appreciating many environmental issues. The National Science Education Standards include models and systems in their unifying concepts and processes standard, while the AAAS Benchmarks include them in their common themes chapter. Hyerle and Marzano argue for…

  2. Performance modeling codes for the QuakeSim problem solving environment

    NASA Technical Reports Server (NTRS)

    Parker, J. W.; Donnellan, A.; Lyzenga, G.; Rundle, J.; Tullis, T.

    2003-01-01

    The QuakeSim Problem Solving Environment uses a web-services approach to unify and deploy diverse remote data sources and processing services within a browser environment. Here we focus on the high-performance crustal modeling applications that will be included in this set of remote but interoperable applications.

  3. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing

    PubMed Central

    Wang, Guoli; Ebrahimi, Nader

    2014-01-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data. PMID:25821345

  4. A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing.

    PubMed

    Devarajan, Karthik; Wang, Guoli; Ebrahimi, Nader

    2015-04-01

    Non-negative matrix factorization (NMF) is a powerful machine learning method for decomposing a high-dimensional nonnegative matrix V into the product of two nonnegative matrices, W and H, such that V ∼ W H. It has been shown to have a parts-based, sparse representation of the data. NMF has been successfully applied in a variety of areas such as natural language processing, neuroscience, information retrieval, image processing, speech recognition and computational biology for the analysis and interpretation of large-scale data. There has also been simultaneous development of a related statistical latent class modeling approach, namely, probabilistic latent semantic indexing (PLSI), for analyzing and interpreting co-occurrence count data arising in natural language processing. In this paper, we present a generalized statistical approach to NMF and PLSI based on Renyi's divergence between two non-negative matrices, stemming from the Poisson likelihood. Our approach unifies various competing models and provides a unique theoretical framework for these methods. We propose a unified algorithm for NMF and provide a rigorous proof of monotonicity of multiplicative updates for W and H. In addition, we generalize the relationship between NMF and PLSI within this framework. We demonstrate the applicability and utility of our approach as well as its superior performance relative to existing methods using real-life and simulated document clustering data.

  5. An Estimation Procedure for the Structural Parameters of the Unified Cognitive/IRT Model.

    ERIC Educational Resources Information Center

    Jiang, Hai; And Others

    L. V. DiBello, W. F. Stout, and L. A. Roussos (1993) have developed a new item response model, the Unified Model, which brings together the discrete, deterministic aspects of cognition favored by cognitive scientists, and the continuous, stochastic aspects of test response behavior that underlie item response theory (IRT). The Unified Model blends…

  6. A Unified Approach to Modeling Multidisciplinary Interactions

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Bhatia, Kumar G.

    2000-01-01

    There are a number of existing methods to transfer information among various disciplines. For a multidisciplinary application with n disciplines, the traditional methods may be required to model (n^2 - n) interactions. This paper presents a unified three-dimensional approach that reduces the number of interactions from (n^2 - n) to 2n by using a computer-aided design model. The proposed modeling approach unifies the interactions among various disciplines. The approach is independent of specific discipline implementation, and a number of existing methods can be reformulated in the context of the proposed unified approach. This paper provides an overview of the proposed unified approach and reformulations for two existing methods. The unified approach is specially tailored for application environments where the geometry is created and managed through a computer-aided design system. Results are presented for a blended-wing body and a high-speed civil transport.
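    The interaction-count arithmetic above (not the paper's transfer method itself) can be checked with a trivial sketch; Python is used purely for illustration:

```python
def pairwise_interfaces(n: int) -> int:
    # Each of n disciplines exchanges data directly with every other
    # one, in both directions: n * (n - 1) = n^2 - n couplings.
    return n * n - n

def hub_interfaces(n: int) -> int:
    # Routing all transfers through a shared CAD model, each
    # discipline needs one link to the hub and one back: 2n couplings.
    return 2 * n

for n in (3, 5, 10):
    print(n, pairwise_interfaces(n), hub_interfaces(n))
```

    The gap grows quadratically, which is why the hub-style unification pays off as the number of disciplines increases.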

  7. DICOM static and dynamic representation through unified modeling language

    NASA Astrophysics Data System (ADS)

    Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.

    2004-04-01

    The DICOM standard, as all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not trivial work. Thus, this work is about understanding and modelling parts of the DICOM standard using Object Oriented methodologies, as part of software development processes. This has offered different static and dynamic views, in accordance with the standard specifications, and the resultant models have been represented through the Unified Modelling Language (UML). The modelled parts are related to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and Data Dictionary. The resultant models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library to develop DICOM-conformant PACS applications.

  8. Stochastic nonlinear dynamics pattern formation and growth models

    PubMed Central

    Yaroslavsky, Leonid P

    2007-01-01

    Stochastic evolutionary growth and pattern formation models are treated in a unified way in terms of algorithmic models of nonlinear dynamic systems with feedback built of a standard set of signal processing units. A number of concrete models are described and illustrated by numerous examples of artificially generated patterns that closely imitate a wide variety of patterns found in nature. PMID:17908341
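    As an illustration of the kind of algorithmic model described (a feedback loop built of standard signal-processing units: a linear spatial filter followed by a threshold nonlinearity), the sketch below iterates a random seed image into labyrinth-like binary patterns; all parameters and the specific filter are illustrative assumptions, not the paper's models:

```python
import numpy as np

def evolve_pattern(size=64, steps=20, r_exc=1, r_inh=4, seed=0):
    """Toy pattern generator from standard signal-processing units:
    a linear filter (short-range excitation minus longer-range
    inhibition) and a hard-threshold unit, iterated with feedback
    starting from a random binary image."""
    rng = np.random.default_rng(seed)
    img = np.where(rng.random((size, size)) > 0.5, 1.0, -1.0)

    def box_mean(a, r):
        # Local mean over a (2r+1)^2 window with wrap-around borders.
        out = np.zeros_like(a)
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                out += np.roll(np.roll(a, dx, 0), dy, 1)
        return out / (2 * r + 1) ** 2

    for _ in range(steps):
        activation = box_mean(img, r_exc) - box_mean(img, r_inh)
        img = np.where(activation >= 0, 1.0, -1.0)  # threshold unit
    return img

pattern = evolve_pattern()
print(pattern.shape)
```

    Despite being built from only two generic units plus feedback, the iteration self-organizes into stripe-like textures, the general phenomenon the paper systematizes.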

  9. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    A multi-scale modeling system with unified physics has been developed at NASA Goddard Space Flight Center (GSFC). The system consists of an MMF, the coupled NASA Goddard finite-volume GCM (fvGCM) and Goddard Cumulus Ensemble model (GCE, a CRM); the state-of-the-art Weather Research and Forecasting model (WRF); and the stand-alone GCE. These models can share the same microphysical schemes, radiation (including explicitly calculated cloud optical properties), and surface models that have been developed, improved and tested for different environments. In this talk, I will present: (1) a brief review of the GCE model and its applications to the impact of aerosols on deep precipitation processes, (2) the Goddard MMF, the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) a discussion of the Goddard WRF version (its developments and applications). We are also performing inline tracer calculations to comprehend the physical processes (i.e., boundary layer and each quadrant in the boundary layer) related to the development and structure of hurricanes and mesoscale convective systems. In addition, high-resolution (spatial, 2 km, and temporal, 1 minute) visualizations showing the model results will be presented.

  10. Unified semiclassical theory for the two-state system: an analytical solution for general nonadiabatic tunneling.

    PubMed

    Zhu, Chaoyuan; Lin, Sheng Hsien

    2006-07-28

    A unified semiclassical solution for general nonadiabatic tunneling between two adiabatic potential energy surfaces is established by employing the unified semiclassical solution for pure nonadiabatic transition [C. Zhu, J. Chem. Phys. 105, 4159 (1996)] with a certain symmetry transformation. This symmetry comes from a detailed analysis of the reduced scattering matrix for the Landau-Zener type of crossing as a special case of nonadiabatic transition and nonadiabatic tunneling. The traditional classification into crossing and noncrossing types of nonadiabatic transition can be quantitatively defined by the rotation angle of the adiabatic-to-diabatic transformation, and this rotation angle enters the analytical solution for general nonadiabatic tunneling. Two-state exponential potential models are employed for numerical tests, and the calculations from the present general nonadiabatic tunneling formula are in very good agreement with the results from exact quantum mechanical calculations. The present general nonadiabatic tunneling formula can be incorporated with various mixed quantum-classical methods for modeling electronically nonadiabatic processes in photochemistry.
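    The paper's general tunneling formula is not reproduced here, but the classical Landau-Zener special case it generalizes has a well-known closed form, sketched below; the sample coupling and sweep-rate values are illustrative assumptions:

```python
import math

HBAR = 1.0545718e-34  # reduced Planck constant, J*s

def landau_zener_transition(coupling_J: float, sweep_rate_J_per_s: float) -> float:
    """Probability of a diabatic (non-adiabatic) passage through an
    avoided crossing in the classical Landau-Zener limit:
        P = exp(-2*pi*V12^2 / (hbar * |d(E1 - E2)/dt|))
    coupling_J: off-diagonal coupling V12 (J)
    sweep_rate_J_per_s: rate of change of the diabatic energy gap (J/s)
    """
    return math.exp(-2.0 * math.pi * coupling_J**2
                    / (HBAR * sweep_rate_J_per_s))

# Slow sweep / strong coupling -> nearly adiabatic (P close to 0);
# fast sweep / weak coupling -> nearly diabatic (P close to 1).
```

    The unified formula in the paper recovers this limit while extending it to tunneling below the crossing point, where the plain Landau-Zener expression does not apply.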

  11. Workload capacity spaces: a unified methodology for response time measures of efficiency as workload is varied.

    PubMed

    Townsend, James T; Eidels, Ami

    2011-08-01

    Increasing the number of available sources of information may impair or facilitate performance, depending on the capacity of the processing system. Tests performed on response time distributions are proving to be useful tools in determining the workload capacity (as well as other properties) of cognitive systems. In this article, we develop a framework and relevant mathematical formulae that represent different capacity assays (Miller's race model bound, Grice's bound, and Townsend's capacity coefficient) in the same space. The new space allows a direct comparison between the distinct bounds and the capacity coefficient values and helps explicate the relationships among the different measures. An analogous common space is proposed for the AND paradigm, relating the capacity index to the Colonius-Vorberg bounds. We illustrate the effectiveness of the unified spaces by presenting data from two simulated models (standard parallel, coactive) and a prototypical visual detection experiment. A conversion table for the unified spaces is provided.
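    Townsend's OR-design capacity coefficient, one of the assays placed in the unified space, can be estimated from empirical survivor functions; the simulated exponential race below is an illustrative assumption, not the authors' experiment:

```python
import numpy as np

def survivor(rts, t):
    """Empirical survivor function S(t) = P(RT > t)."""
    return (np.asarray(rts) > t).mean()

def capacity_or(rt_a, rt_b, rt_ab, t):
    """Townsend's OR-design capacity coefficient at time t:
        C(t) = log S_AB(t) / (log S_A(t) + log S_B(t)).
    C(t) ~ 1: unlimited capacity; < 1: limited; > 1: super capacity."""
    s_a, s_b, s_ab = survivor(rt_a, t), survivor(rt_b, t), survivor(rt_ab, t)
    return np.log(s_ab) / (np.log(s_a) + np.log(s_b))

# Simulated unlimited-capacity parallel race: channel finishing times
# are exponential and unaffected by workload; the double-target RT is
# the minimum of the two, so C(t) should hover near 1.
rng = np.random.default_rng(1)
n = 100_000
a = rng.exponential(0.4, n)   # single-target A trials
b = rng.exponential(0.5, n)   # single-target B trials
ab = np.minimum(rng.exponential(0.4, n), rng.exponential(0.5, n))
print(round(capacity_or(a, b, ab, t=0.3), 2))
```

    Miller's race model bound and Grice's bound constrain the same distributions from above and below, which is what lets the article plot all three assays in a common space.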

  12. Dataflow models for fault-tolerant control systems

    NASA Technical Reports Server (NTRS)

    Papadopoulos, G. M.

    1984-01-01

    Dataflow concepts are used to generate a unified hardware/software model of redundant physical systems which are prone to faults. Basic results in input congruence and synchronization are shown to reduce to a simple model of data exchanges between processing sites. Procedures are given for the construction of congruence schemata, the distinguishing features of any correctly designed redundant system.

  13. Unified multiphase modeling for evolving, acoustically coupled systems consisting of acoustic, elastic, poroelastic media and septa

    NASA Astrophysics Data System (ADS)

    Lee, Joong Seok; Kang, Yeon June; Kim, Yoon Young

    2012-12-01

    This paper presents a new modeling technique that can represent acoustically coupled systems in a unified manner. The proposed unified multiphase (UMP) modeling technique uses Biot's equations that are originally derived for poroelastic media to represent not only poroelastic media but also non-poroelastic ones ranging from acoustic and elastic media to septa. To recover the original vibro-acoustic behaviors of non-poroelastic media, material parameters of a base poroelastic medium are adjusted depending on the target media. The real virtue of this UMP technique is that interface coupling conditions between any media can be automatically satisfied, so no medium-dependent interface condition needs to be imposed explicitly. Thereby, the proposed technique can effectively model any acoustically coupled system having locally varying medium phases and evolving interfaces. A typical situation can occur in an iterative design process. Because the proposed UMP modeling technique needs theoretical justifications for further development, this work is mainly focused on how the technique recovers the governing equations of non-poroelastic media and expresses their interface conditions. We also address how to describe various boundary conditions of the media in the technique. Some numerical studies are carried out to demonstrate the validity of the proposed modeling technique.

  14. Construction of multi-functional open modulized Matlab simulation toolbox for imaging ladar system

    NASA Astrophysics Data System (ADS)

    Wu, Long; Zhao, Yuan; Tang, Meng; He, Jiang; Zhang, Yong

    2011-06-01

    Ladar system simulation simulates ladar models using computer simulation technology in order to predict the performance of the ladar system. This paper reviews domestic and overseas developments in laser imaging radar simulation and studies of computer simulation of ladar systems under different application requirements. The LadarSim and FOI-LadarSIM simulation facilities of Utah State University and the Swedish Defence Research Agency are introduced in detail. Domestic research in imaging ladar system simulation has been limited in simulation scale and non-unified in design, mostly achieving simple functional simulation based on ranging equations for ladar systems. A laser imaging radar simulation with an open and modularized structure is proposed, with unified modules for the ladar system, laser emitter, atmosphere models, target models, signal receiver, parameter settings and system controller. A unified Matlab toolbox and standard control modules have been built with regulated inputs and outputs of the functions, and communication protocols between hardware modules. A simulation of an ICCD gain-modulated imaging ladar system viewing a space shuttle is performed with the toolbox. The simulation result shows that the models and parameter settings of the Matlab toolbox are able to simulate the actual detection process precisely. The unified control module and pre-defined parameter settings simplify the simulation of imaging ladar detection. Its open structure enables the toolbox to be modified for specialized requests, and modularization gives the simulations flexibility.

  15. Neutral theory and the species abundance distribution: recent developments and prospects for unifying niche and neutral perspectives

    PubMed Central

    Matthews, Thomas J; Whittaker, Robert J

    2014-01-01

    Published in 2001, The Unified Neutral Theory of Biodiversity and Biogeography (UNTB) emphasizes the importance of stochastic processes in ecological community structure, and has challenged the traditional niche-based view of ecology. While neutral models have since been applied to a broad range of ecological and macroecological phenomena, the majority of research relating to neutral theory has focused exclusively on the species abundance distribution (SAD). Here, we synthesize the large body of work on neutral theory in the context of the species abundance distribution, with a particular focus on integrating ideas from neutral theory with traditional niche theory. First, we summarize the basic tenets of neutral theory, both in general and in the context of SADs. Second, we explore the issues associated with neutral theory and the SAD, such as complications with fitting and model comparison, the underlying assumptions of neutral models, and the difficulty of linking pattern to process. Third, we highlight the advances in understanding of SADs that have resulted from neutral theory and models. Finally, we focus consideration on recent developments aimed at unifying neutral- and niche-based approaches to ecology, with a particular emphasis on what this means for SAD theory, embracing, for instance, ideas of emergent neutrality and stochastic niche theory. We put forward the argument that the prospect of the unification of niche and neutral perspectives represents one of the most promising future avenues of neutral theory research. PMID:25360266

  16. Neutral theory and the species abundance distribution: recent developments and prospects for unifying niche and neutral perspectives.

    PubMed

    Matthews, Thomas J; Whittaker, Robert J

    2014-06-01

    Published in 2001, The Unified Neutral Theory of Biodiversity and Biogeography (UNTB) emphasizes the importance of stochastic processes in ecological community structure, and has challenged the traditional niche-based view of ecology. While neutral models have since been applied to a broad range of ecological and macroecological phenomena, the majority of research relating to neutral theory has focused exclusively on the species abundance distribution (SAD). Here, we synthesize the large body of work on neutral theory in the context of the species abundance distribution, with a particular focus on integrating ideas from neutral theory with traditional niche theory. First, we summarize the basic tenets of neutral theory, both in general and in the context of SADs. Second, we explore the issues associated with neutral theory and the SAD, such as complications with fitting and model comparison, the underlying assumptions of neutral models, and the difficulty of linking pattern to process. Third, we highlight the advances in understanding of SADs that have resulted from neutral theory and models. Finally, we focus consideration on recent developments aimed at unifying neutral- and niche-based approaches to ecology, with a particular emphasis on what this means for SAD theory, embracing, for instance, ideas of emergent neutrality and stochastic niche theory. We put forward the argument that the prospect of the unification of niche and neutral perspectives represents one of the most promising future avenues of neutral theory research.
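    A concrete point of contact between neutral theory and SAD data is Fisher's log-series, the metacommunity abundance distribution predicted by classical neutral theory; a minimal sketch of its expected species counts follows (the alpha and x values are illustrative, not from the review):

```python
def logseries_expected_species(alpha: float, x: float, n_max: int):
    """Expected number of species with abundance n under Fisher's
    log-series, phi_n = alpha * x**n / n -- the metacommunity SAD
    predicted by classical neutral theory. alpha > 0 is Fisher's
    alpha; 0 < x < 1 is close to 1 for large communities."""
    return [alpha * x**n / n for n in range(1, n_max + 1)]

phi = logseries_expected_species(alpha=5.0, x=0.99, n_max=10)
# The series decreases monotonically in n: singletons are the most
# frequent abundance class, a hallmark log-series signature.
```

    Fitting such curves to observed SADs, and comparing them against niche-based alternatives like the lognormal, is exactly the model-comparison exercise whose pitfalls the review discusses.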

  17. MODEL CHANGES SINCE 1991

    Science.gov Websites

    Observations and analysis: effects of balloon drift in time and space included; higher-resolution sea ice mask. Forecast and post-processing: improved orography; minor changes. 12/04/07 12Z: use of the Unified Post Processor in GFS. 12/04/07 12Z: GFS Ensemble (NAEFS/TIGGE) upgrade.

  18. The Development of Cadastral Domain Model Oriented at Unified Real Estate Registration of China Based on Ontology

    NASA Astrophysics Data System (ADS)

    Li, M.; Zhu, X.; Shen, C.; Chen, D.; Guo, W.

    2012-07-01

    With unified real estate registration regulated by the Property Law and the step-by-step advance of simultaneous urban and rural development in China, clearly specifying property rights and their relations is the premise and foundation for promoting the integrated management of urban and rural land. This paper aims at developing a cadastral domain model oriented at unified real estate registration of China from the legal and spatial perspectives, which sets up the foundation for unified real estate registration and facilitates the effective interchange of cadastral information and the administration of land use. The legal cadastral model is provided based on an analysis of the gap between the current model and the demands of unified real estate registration, which specifies the restrictions between different rights. Then the new cadastral domain model is constructed based on the legal cadastral domain model and CCDM (van Oosterom et al., 2006), which integrates real estate rights of urban land and rural land. Finally, the model is validated by a prototype system. The results show that the model is applicable for unified real estate registration in China.

  19. Unifying Screening Processes Within the PROSPR Consortium: A Conceptual Model for Breast, Cervical, and Colorectal Cancer Screening

    PubMed Central

    Kim, Jane J.; Schapira, Marilyn M.; Tosteson, Anna N. A.; Zauber, Ann G.; Geiger, Ann M.; Kamineni, Aruna; Weaver, Donald L.; Tiro, Jasmin A.

    2015-01-01

    General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (eg, primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights across each organ site similarities and differences in steps, interfaces, and transitions in the screening process and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute–funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites. PMID:25957378

  20. Unified underpinning of human mobility in the real world and cyberspace

    NASA Astrophysics Data System (ADS)

    Zhao, Yi-Ming; Zeng, An; Yan, Xiao-Yong; Wang, Wen-Xu; Lai, Ying-Cheng

    2016-05-01

    Human movements in the real world and in cyberspace affect not only dynamical processes such as epidemic spreading and information diffusion but also social and economical activities such as urban planning and personalized recommendation in online shopping. Despite recent efforts in characterizing and modeling human behaviors in both the real and cyber worlds, the fundamental dynamics underlying human mobility have not been well understood. We develop a minimal, memory-based random walk model in limited space for reproducing, with a single parameter, the key statistical behaviors characterizing human movements in both cases. The model is validated using relatively big data from mobile phone and online commerce, suggesting memory-based random walk dynamics as the unified underpinning for human mobility, regardless of whether it occurs in the real world or in cyberspace.
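    The paper's exact one-parameter model is not reproduced here, but the generic exploration/preferential-return scheme underlying memory-based random walks can be sketched as follows; p_new, the site count and the step count are illustrative assumptions:

```python
import random
from collections import Counter

def memory_walk(steps: int, p_new: float = 0.2, n_sites: int = 1000, seed: int = 0):
    """Generic memory-based random walk in a limited space: with
    probability p_new jump to a uniformly random site (exploration),
    otherwise return to an already-visited site chosen in proportion
    to how often it was visited (preferential return)."""
    rng = random.Random(seed)
    visits = Counter({0: 1})  # start at site 0
    trajectory = [0]
    for _ in range(steps):
        if rng.random() < p_new:
            site = rng.randrange(n_sites)          # exploration
        else:
            sites, counts = zip(*visits.items())   # preferential return
            site = rng.choices(sites, weights=counts)[0]
        visits[site] += 1
        trajectory.append(site)
    return trajectory, visits

traj, visits = memory_walk(20_000)
# Preferential return concentrates visits on a few locations, the
# heavy-tailed visitation pattern seen in empirical mobility data.
```

    The rich-get-richer feedback in the return step is what produces the skewed visitation frequencies that both mobile-phone and online-commerce records exhibit.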

  1. Model-Unified Planning and Execution for Distributed Autonomous System Control

    NASA Technical Reports Server (NTRS)

    Aschwanden, Pascal; Baskaran, Vijay; Bernardini, Sara; Fry, Chuck; Moreno, Maria; Muscettola, Nicola; Plaunt, Chris; Rijsman, David; Tompkins, Paul

    2006-01-01

    The Intelligent Distributed Execution Architecture (IDEA) is a real-time architecture that exploits artificial intelligence planning as the core reasoning engine for interacting autonomous agents. Rather than enforcing separate deliberation and execution layers, IDEA unifies them under a single planning technology. Deliberative and reactive planners reason about and act according to a single representation of the past, present and future domain state. The domain state obeys the rules dictated by a declarative model of the subsystem to be controlled, internal processes of the IDEA controller, and interactions with other agents. We present IDEA concepts - modeling, the IDEA core architecture, the unification of deliberation and reaction under planning - and illustrate its use in a simple example. Finally, we present several real-world applications of IDEA, and compare IDEA to other high-level control approaches.

  2. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lamg, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes and the explicit cloud-radiation, and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the recent developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitating systems and hurricanes/typhoons will be presented. High-resolution spatial and temporal visualization will be utilized to show the evolution of precipitation processes. Also, how to use the multi-satellite simulator to improve simulated precipitation processes will be discussed.

  3. Neuro-cognitive mechanisms of conscious and unconscious visual perception: From a plethora of phenomena to general principles

    PubMed Central

    Kiefer, Markus; Ansorge, Ulrich; Haynes, John-Dylan; Hamker, Fred; Mattler, Uwe; Verleger, Rolf; Niedeggen, Michael

    2011-01-01

    Psychological and neuroscience approaches have promoted much progress in elucidating the cognitive and neural mechanisms that underlie phenomenal visual awareness during the last decades. In this article, we provide an overview of the latest research investigating important phenomena in conscious and unconscious vision. We identify general principles to characterize conscious and unconscious visual perception, which may serve as important building blocks for a unified model to explain the plethora of findings. We argue that in particular the integration of principles from both conscious and unconscious vision is advantageous and provides critical constraints for developing adequate theoretical models. Based on the principles identified in our review, we outline essential components of a unified model of conscious and unconscious visual perception. We propose that awareness refers to consolidated visual representations, which are accessible to the entire brain and therefore globally available. However, visual awareness not only depends on consolidation within the visual system, but is additionally the result of a post-sensory gating process, which is mediated by higher-level cognitive control mechanisms. We further propose that amplification of visual representations by attentional sensitization is not exclusive to the domain of conscious perception, but also applies to visual stimuli, which remain unconscious. Conscious and unconscious processing modes are highly interdependent with influences in both directions. We therefore argue that exactly this interdependence renders a unified model of conscious and unconscious visual perception valuable. Computational modeling jointly with focused experimental research could lead to a better understanding of the plethora of empirical phenomena in consciousness research. PMID:22253669

  4. A time for multi-scale modeling of anti-fibrotic therapies. Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Wu, Min

    2016-07-01

    The development of anti-fibrotic therapies for a diversity of diseases has recently become more and more urgent, such as in pulmonary, renal and liver fibrosis [1,2], as well as in malignant tumor growths [3]. As reviewed by Ben Amar and Bianca [4], various theoretical, experimental and in-silico models have been developed to understand the fibrosis process, where the implications for therapeutic strategies have also been frequently demonstrated (e.g., [5-7]). In [4], these models are analyzed and sorted according to their approaches, and at the end of [4], a unified multi-scale approach is proposed to understand fibrosis. While one of the major purposes of extensive modeling of fibrosis is to shed light on therapeutic strategies, theoretical, experimental and in-silico studies of anti-fibrotic therapies should be conducted more intensively.

  5. Building a Unified Information Network.

    ERIC Educational Resources Information Center

    Avram, Henriette D.

    1988-01-01

    Discusses cooperative efforts between research organizations and libraries to create a national information network. Topics discussed include the Linked System Project (LSP); technical processing versus reference and research functions; Open Systems Interconnection (OSI) Reference Model; the National Science Foundation Network (NSFNET); and…

  6. A Four-Tier Differentiation Model: Engage All Students in the Learning Process

    ERIC Educational Resources Information Center

    Herrelko, Janet M.

    2013-01-01

    This study details the creation of a four-tiered format designed to help preservice teachers write differentiated lesson plans. A short history of lesson plan differentiation models is described and how the four-tier approach was developed through collaboration with classroom teachers and university faculty. The unifying element for the format…

  7. A unified approach for process-based hydrologic modeling: Part 2. Model implementation and case studies

    USDA-ARS?s Scientific Manuscript database

    Understanding and prediction of snowmelt-generated streamflow at sub-daily time scales is important for reservoir scheduling and climate change characterization. This is particularly important in the Western U.S. where over 50% of water supply is provided by snowmelt during the melting period. Previ...

  8. The mental health care model in Brazil: analyses of the funding, governance processes, and mechanisms of assessment

    PubMed Central

    Trapé, Thiago Lavras; Campos, Rosana Onocko

    2017-01-01

    ABSTRACT OBJECTIVE This study aims to analyze the current status of the mental health care model of the Brazilian Unified Health System, according to its funding, governance processes, and mechanisms of assessment. METHODS We have carried out a documentary analysis of the ordinances, technical reports, conference reports, normative resolutions, and decrees from 2009 to 2014. RESULTS This is a time of consolidation of the psychosocial model, with expansion of the health care network and inversion of the funding for community services with a strong emphasis on the area of crack cocaine and other drugs. Mental health is an underfunded area within the chronically underfunded Brazilian Unified Health System. The governance model constrains the progress of essential services, which creates the need for the incorporation of a process of regionalization of the management. The mechanisms of assessment are not incorporated into the health policy in the bureaucratic field. CONCLUSIONS There is a need to expand the global funding of the area of health, specifically mental health, which has been shown to be a successful policy. The current focus of the policy seems to be archaic in relation to the precepts of the psychosocial model. Mechanisms of assessment need to be expanded. PMID:28355335

  9. Theory Creation, Modification, and Testing: An Information-Processing Model and Theory of the Anticipated and Unanticipated Consequences of Research and Development

    ERIC Educational Resources Information Center

    Perla, Rocco J.; Carifio, James

    2011-01-01

    Background: Extending Merton's (1936) work on the consequences of purposive social action, the model, theory and taxonomy outlined here incorporates and formalizes both anticipated and unanticipated research findings in a unified theoretical framework. The model of anticipated research findings was developed initially by Carifio (1975, 1977) and…

  10. A Unifying Mechanistic Model of Selective Attention in Spiking Neurons

    PubMed Central

    Bobier, Bruce; Stewart, Terrence C.; Eliasmith, Chris

    2014-01-01

    Visuospatial attention produces myriad effects on the activity and selectivity of cortical neurons. Spiking neuron models capable of reproducing a wide variety of these effects remain elusive. We present a model called the Attentional Routing Circuit (ARC) that provides a mechanistic description of selective attentional processing in cortex. The model is described mathematically and implemented at the level of individual spiking neurons, with the computations for performing selective attentional processing being mapped to specific neuron types and laminar circuitry. The model is used to simulate three studies of attention in macaque, and is shown to quantitatively match several observed forms of attentional modulation. Specifically, ARC demonstrates that with shifts of spatial attention, neurons may exhibit shifting and shrinking of receptive fields; increases in responses without changes in selectivity for non-spatial features (i.e., response gain); and that the effect on contrast-response functions is better explained as a response-gain effect than as contrast gain. Unlike past models, ARC embodies a single mechanism that unifies the above forms of attentional modulation, is consistent with a wide array of available data, and makes several specific and quantifiable predictions. PMID:24921249
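    The response-gain versus contrast-gain distinction that the ARC results turn on can be illustrated with the standard Naka-Rushton contrast-response function; the parameter values below are illustrative, not fitted to the paper's data:

```python
def naka_rushton(c, r_max=30.0, c50=0.2, n=2.0):
    """Standard Naka-Rushton contrast-response function with
    illustrative parameters: R(c) = r_max * c^n / (c^n + c50^n)."""
    return r_max * c**n / (c**n + c50**n)

def with_response_gain(c, g):
    # Response gain: attention multiplies the response itself,
    # so the saturation plateau rises by the factor g.
    return g * naka_rushton(c)

def with_contrast_gain(c, g):
    # Contrast gain: attention rescales effective contrast,
    # shifting the curve leftward while the plateau stays at r_max.
    return naka_rushton(g * c)

# At saturating contrast the two accounts diverge: response gain
# exceeds the unattended plateau, contrast gain cannot.
print(with_response_gain(1.0, 1.5), with_contrast_gain(1.0, 1.5))
```

    Measuring attended versus unattended responses at high contrast therefore discriminates the two accounts, which is the comparison the ARC simulations address.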

  11. Toward a Transdisciplinary Model of Evidence-Based Practice

    PubMed Central

    Satterfield, Jason M; Spring, Bonnie; Brownson, Ross C; Mullen, Edward J; Newhouse, Robin P; Walker, Barbara B; Whitlock, Evelyn P

    2009-01-01

    Context This article describes the historical context and current developments in evidence-based practice (EBP) for medicine, nursing, psychology, social work, and public health, as well as the evolution of the seminal “three circles” model of evidence-based medicine, highlighting changes in EBP content, processes, and philosophies across disciplines. Methods The core issues and challenges in EBP are identified by comparing and contrasting EBP models across various health disciplines. Then a unified, transdisciplinary EBP model is presented, drawing on the strengths and compensating for the weaknesses of each discipline. Findings Common challenges across disciplines include (1) how “evidence” should be defined and comparatively weighted; (2) how and when the patient's and/or other contextual factors should enter the clinical decision-making process; (3) the definition and role of the “expert”; and (4) what other variables should be considered when selecting an evidence-based practice, such as age, social class, community resources, and local expertise. Conclusions A unified, transdisciplinary EBP model would address historical shortcomings by redefining the contents of each model circle, clarifying the practitioner's expertise and competencies, emphasizing shared decision making, and adding both environmental and organizational contexts. Implications for academia, practice, and policy also are discussed. PMID:19523122

  12. Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model

    PubMed Central

    Wichary, Szymon; Smolen, Tomasz

    2016-01-01

    In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals. PMID:27877103

  13. A Unified Approach to Quantifying Feedbacks in Earth System Models

    NASA Astrophysics Data System (ADS)

    Taylor, K. E.

    2008-12-01

    In order to speed progress in reducing uncertainty in climate projections, the processes that most strongly influence those projections must be identified. It is of some importance, therefore, to assess the relative strengths of various climate feedbacks and to determine the degree to which various earth system models (ESMs) agree in their simulations of these processes. Climate feedbacks have been traditionally quantified in terms of their impact on the radiative balance of the planet, whereas carbon cycle responses have been assessed in terms of the size of the perturbations to the surface fluxes of carbon dioxide. In this study we introduce a diagnostic strategy for unifying the two approaches, which allows us to directly compare the strength of carbon-climate feedbacks with other conventional climate feedbacks associated with atmospheric and surface changes. Applying this strategy to a highly simplified model of the carbon-climate system demonstrates the viability of the approach. In the simple model we find that even if the strength of the carbon-climate feedbacks is very large, the uncertainty associated with the overall response of the climate system is likely to be dominated by uncertainties in the much larger feedbacks associated with clouds. This does not imply that the carbon cycle itself is unimportant, only that changes in the carbon cycle that are associated with climate change have a relatively small impact on global temperatures. This new, unified diagnostic approach is suitable for assessing feedbacks in even the most sophisticated earth system models. It will be interesting to see whether our preliminary conclusions are confirmed when output from the more realistic models is analyzed. This work was carried out at the University of California Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.
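
    The radiative-balance bookkeeping that this record contrasts with carbon-flux accounting can be summarized as follows. This is the standard climate-feedback formalism, not Taylor's unified carbon-climate diagnostic itself:

```latex
% Standard feedback decomposition (illustrative background, not the
% paper's unified diagnostic): the change in top-of-atmosphere
% radiative balance R under a forcing F and surface warming \Delta T
% is written as a sum over feedback processes x_i,
\Delta R = F + \Big( \sum_i \lambda_i \Big)\, \Delta T ,
% where each feedback parameter is
\lambda_i = \frac{\partial R}{\partial x_i}\,
            \frac{\mathrm{d} x_i}{\mathrm{d} T}
% (water vapor, lapse rate, clouds, surface albedo, ...).
% At the new equilibrium \Delta R = 0, so the warming is
\Delta T_{\mathrm{eq}} = -\frac{F}{\sum_i \lambda_i} .
```

    Quantifying carbon-cycle responses in these same units (a contribution to an effective \(\lambda\)) is what allows their strength to be compared directly with, say, the cloud feedback.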

  14. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 sq km in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using multi-scale modeling systems to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve precipitation processes will be discussed.

  15. Using Multi-Scale Modeling Systems to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. Also, how to use the multi-satellite simulator to improve precipitation processes will be discussed.

  16. Auditory Working Memory Load Impairs Visual Ventral Stream Processing: Toward a Unified Model of Attentional Load

    ERIC Educational Resources Information Center

    Klemen, Jane; Buchel, Christian; Buhler, Mira; Menz, Mareike M.; Rose, Michael

    2010-01-01

    Attentional interference between tasks performed in parallel is known to have strong and often undesired effects. As yet, however, the mechanisms by which interference operates remain elusive. A better knowledge of these processes may facilitate our understanding of the effects of attention on human performance and the debilitating consequences…

  17. Toward a Unified Modeling of Learner's Growth Process and Flow Theory

    ERIC Educational Resources Information Center

    Challco, Geiser C.; Andrade, Fernando R. H.; Borges, Simone S.; Bittencourt, Ig I.; Isotani, Seiji

    2016-01-01

    Flow is the affective state in which a learner is so engaged and involved in an activity that nothing else seems to matter. In this sense, to help students in the skill development and knowledge acquisition (referred to as learners' growth process) under optimal conditions, the instructional designers should create learning scenarios that favor…

  18. Differential morphology and image processing.

    PubMed

    Maragos, P

    1996-01-01

    Image processing via mathematical morphology has traditionally used geometry to intuitively understand morphological signal operators and set or lattice algebra to analyze them in the space domain. We provide a unified view and analytic tools for morphological image processing that is based on ideas from differential calculus and dynamical systems. This includes ideas on using partial differential or difference equations (PDEs) to model distance propagation or nonlinear multiscale processes in images. We briefly review some nonlinear difference equations that implement discrete distance transforms and relate them to numerical solutions of the eikonal equation of optics. We also review some nonlinear PDEs that model the evolution of multiscale morphological operators and use morphological derivatives. Among the new ideas presented, we develop some general 2-D max/min-sum difference equations that model the space dynamics of 2-D morphological systems (including the distance computations) and some nonlinear signal transforms, called slope transforms, that can analyze these systems in a transform domain in ways conceptually similar to the application of Fourier transforms to linear systems. Thus, distance transforms are shown to be bandpass slope filters. We view the analysis of the multiscale morphological PDEs and of the eikonal PDE solved via weighted distance transforms as a unified area in nonlinear image processing, which we call differential morphology, and briefly discuss its potential applications to image processing and computer vision.
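
    As a concrete illustration of the min-sum difference equations mentioned above, a discrete distance transform can be computed with two sequential min-sum sweeps over the image. The sketch below is a generic textbook chamfer transform (the weights a, b and the mask layout are illustrative choices, not taken from the paper):

```python
def chamfer_distance(mask, a=1.0, b=1.4):
    # Two-pass min-sum recursion (chamfer approximation of the
    # Euclidean distance transform). mask[i][j] truthy = feature pixel.
    h, w = len(mask), len(mask[0])
    INF = float("inf")
    d = [[0.0 if mask[i][j] else INF for j in range(w)] for i in range(h)]
    # Forward pass: min over already-visited (top/left) neighbors,
    # weighted a for axial steps and b for diagonal steps.
    for i in range(h):
        for j in range(w):
            cands = [d[i][j]]
            if j > 0:
                cands.append(d[i][j - 1] + a)
            if i > 0:
                cands.append(d[i - 1][j] + a)
                if j > 0:
                    cands.append(d[i - 1][j - 1] + b)
                if j < w - 1:
                    cands.append(d[i - 1][j + 1] + b)
            d[i][j] = min(cands)
    # Backward pass: the same recursion scanned in reverse order.
    for i in range(h - 1, -1, -1):
        for j in range(w - 1, -1, -1):
            cands = [d[i][j]]
            if j < w - 1:
                cands.append(d[i][j + 1] + a)
            if i < h - 1:
                cands.append(d[i + 1][j] + a)
                if j > 0:
                    cands.append(d[i + 1][j - 1] + b)
                if j < w - 1:
                    cands.append(d[i + 1][j + 1] + b)
            d[i][j] = min(cands)
    return d
```

    Each sweep is exactly a min-sum (tropical) analogue of a causal linear filter, which is why slope-transform analysis applies to it.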

  19. What makes a thriver? Unifying the concepts of posttraumatic and postecstatic growth

    PubMed Central

    Mangelsdorf, Judith; Eid, Michael

    2015-01-01

    The thriver model is a novel framework that unifies the concepts of posttraumatic and postecstatic growth. According to the model, it is not the quality of an event, but the way it is processed, that is critical for the occurrence of post-event growth. The model proposes that meaning making, supportive relationships, and positive emotions facilitate growth processes after positive as well as traumatic experiences. The tenability of these propositions was investigated in two dissimilar cultures. In Study 1, participants from the USA (n = 555) and India (n = 599) answered an extended version of the Social Readjustment Rating Scale to rank the socioemotional impact of events. Results indicate that negative events are perceived as more impactful than positive ones in the USA, whereas the reverse is true in India. In Study 2, participants from the USA (n = 342) and India (n = 341) answered questions about the thriver model's main components. Results showed that posttraumatic and postecstatic growth are highly interrelated. All elements of the thriver model were key variables for the prediction of growth. Supportive relationships and positive emotions had a direct effect on growth, while meaning making mediated the direct effect of major life events. PMID:26157399

  20. Kinetics of Cation and Oxyanion Adsorption and Desorption on Ferrihydrite: Roles of Ferrihydrite Binding Sites and a Unified Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Lei; Shi, Zhenqing; Lu, Yang

    Understanding the kinetics of toxic ion reactions with ferrihydrite is crucial for predicting the dynamic behavior of contaminants in soil environments. In this study, the kinetics of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite were investigated with a combination of laboratory macroscopic experiments, microscopic investigation, and mechanistic modeling. The rates of As(V), Cr(VI), Cu, and Pb adsorption and desorption on ferrihydrite, as systematically studied using a stirred-flow method, were highly dependent on the reaction pH and metal concentrations and varied significantly among the four metals. Spherical aberration-corrected scanning transmission electron microscopy (Cs-STEM) showed that, at sub-nano scales, all four metals were distributed homogeneously within the ferrihydrite particle aggregates after adsorption reactions, with no evidence of surface diffusion-controlled processes. Based on these experimental results, we developed a unifying kinetics model for both cation and oxyanion adsorption/desorption on ferrihydrite built on the mechanistic equilibrium model CD-MUSIC. Overall, the model described the kinetic results well, and we quantitatively demonstrated how the equilibrium properties of cation and oxyanion binding to various ferrihydrite sites affected the adsorption and desorption rates. Our results provide a unifying quantitative modeling method for the kinetics of both cation and oxyanion adsorption/desorption on iron minerals.
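
    The general shape of such adsorption/desorption kinetics can be sketched with a single-site Langmuir-style rate law. This is a deliberately minimal stand-in, not the multi-site CD-MUSIC model the study actually uses; rate constants and units are hypothetical:

```python
def adsorption_kinetics(c, k_a, k_d, s_max, dt=0.01, steps=20000):
    # Hypothetical single-site rate law (NOT CD-MUSIC):
    #   dq/dt = k_a * c * (s_max - q) - k_d * q
    # q: surface coverage, c: solution concentration (held fixed),
    # k_a/k_d: adsorption/desorption rate constants, s_max: site density.
    # Integrated with forward Euler.
    q = 0.0
    for _ in range(steps):
        q += dt * (k_a * c * (s_max - q) - k_d * q)
    return q
```

    At long times the coverage approaches the Langmuir isotherm value q_eq = s_max * k_a * c / (k_a * c + k_d); the adsorption and desorption rates are tied to the same equilibrium constants, which is the coupling the abstract emphasizes.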

  1. Diffusion of Defaults Among Financial Institutions

    NASA Astrophysics Data System (ADS)

    Demange, Gabrielle

    The paper proposes a simple unified model for the diffusion of defaults across financial institutions and presents some measures for evaluating the risk a bank imposes on the system. Standard contagion processes have so far tended not to incorporate some important features of financial contagion.

  2. Unifying screening processes within the PROSPR consortium: a conceptual model for breast, cervical, and colorectal cancer screening.

    PubMed

    Beaber, Elisabeth F; Kim, Jane J; Schapira, Marilyn M; Tosteson, Anna N A; Zauber, Ann G; Geiger, Ann M; Kamineni, Aruna; Weaver, Donald L; Tiro, Jasmin A

    2015-06-01

    General frameworks of the cancer screening process are available, but none directly compare the process in detail across different organ sites. This limits the ability of medical and public health professionals to develop and evaluate coordinated screening programs that apply resources and population management strategies available for one cancer site to other sites. We present a trans-organ conceptual model that incorporates a single screening episode for breast, cervical, and colorectal cancers into a unified framework based on clinical guidelines and protocols; the model concepts could be expanded to other organ sites. The model covers four types of care in the screening process: risk assessment, detection, diagnosis, and treatment. Interfaces between different provider teams (eg, primary care and specialty care), including communication and transfer of responsibility, may occur when transitioning between types of care. Our model highlights across each organ site similarities and differences in steps, interfaces, and transitions in the screening process and documents the conclusion of a screening episode. This model was developed within the National Cancer Institute-funded consortium Population-based Research Optimizing Screening through Personalized Regimens (PROSPR). PROSPR aims to optimize the screening process for breast, cervical, and colorectal cancer and includes seven research centers and a statistical coordinating center. Given current health care reform initiatives in the United States, this conceptual model can facilitate the development of comprehensive quality metrics for cancer screening and promote trans-organ comparative cancer screening research. PROSPR findings will support the design of interventions that improve screening outcomes across multiple cancer sites.

  3. Unified constitutive models for high-temperature structural applications

    NASA Technical Reports Server (NTRS)

    Lindholm, U. S.; Chan, K. S.; Bodner, S. R.; Weber, R. M.; Walker, K. P.

    1988-01-01

    Unified constitutive models are characterized by the use of a single inelastic strain rate term for treating all aspects of inelastic deformation, including plasticity, creep, and stress relaxation under monotonic or cyclic loading. The structure of this class of constitutive theory pertinent for high temperature structural applications is first outlined and discussed. The effectiveness of the unified approach for representing high temperature deformation of Ni-base alloys is then evaluated by extensive comparison of experimental data and predictions of the Bodner-Partom and the Walker models. The use of the unified approach for hot section structural component analyses is demonstrated by applying the Walker model in finite element analyses of a benchmark notch problem and a turbine blade problem.

  4. A unified approach for determining the ultimate strength of RC members subjected to combined axial force, bending, shear and torsion

    PubMed Central

    Huang, Zhen

    2017-01-01

    This paper uses experimental investigation and theoretical derivation to study the unified failure mechanism and ultimate capacity model of reinforced concrete (RC) members under combined axial, bending, shear and torsion loading. Fifteen RC members are tested under different combinations of compressive axial force, bending, shear and torsion using experimental equipment designed by the authors. The failure mechanism and ultimate strength data for the four groups of tested RC members under different combined loading conditions are investigated and discussed in detail. The experimental research seeks to determine how the ultimate strength of RC members changes with changing combined loads. According to the experimental research, a unified theoretical model is established by determining the shape of the warped failure surface, assuming an appropriate stress distribution on the failure surface, and considering the equilibrium conditions. This unified failure model can be systematically reduced to well-known failure theories of concrete members under single or combined loading. The unified calculation model could be easily used in design applications with some assumptions and simplifications. Finally, the accuracy of this theoretical unified model is verified by comparisons with experimental results. PMID:28414777

  5. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed Central

    LeBozec, C.; Jaulent, M. C.; Zapletal, E.; Degoulet, P.

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and enabling visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism, improving the communication between the developers and users. PMID:9929346

  6. Unified modeling language and design of a case-based retrieval system in medical imaging.

    PubMed

    LeBozec, C; Jaulent, M C; Zapletal, E; Degoulet, P

    1998-01-01

    One goal of artificial intelligence research into case-based reasoning (CBR) systems is to develop approaches for designing useful and practical interactive case-based environments. Explaining each step of the design of the case-base and of the retrieval process is critical for the application of case-based systems to the real world. We describe herein our approach to the design of IDEM--Images and Diagnosis from Examples in Medicine--a medical image case-based retrieval system for pathologists. Our approach is based on the expressiveness of an object-oriented modeling language standard: the Unified Modeling Language (UML). We created a set of diagrams in UML notation illustrating the steps of the CBR methodology we used. The key aspect of this approach was selecting the relevant objects of the system according to user requirements and enabling visualization of cases and of the components of the case retrieval process. Further evaluation of the expressiveness of the design document is required, but UML seems to be a promising formalism, improving the communication between the developers and users.

  7. Failure and recovery in dynamical networks.

    PubMed

    Böttcher, L; Luković, M; Nagler, J; Havlin, S; Herrmann, H J

    2017-02-03

    Failure, damage spread and recovery crucially underlie many spatially embedded networked systems ranging from transportation structures to the human body. Here we study the interplay between spontaneous damage, induced failure and recovery in both embedded and non-embedded networks. In our model the network's components follow three realistic processes that capture these features: (i) spontaneous failure of a component independent of the neighborhood (internal failure), (ii) failure induced by failed neighboring nodes (external failure) and (iii) spontaneous recovery of a component. We identify a metastable domain in the global network phase diagram spanned by the model's control parameters where dramatic hysteresis effects and random switching between two coexisting states are observed. This dynamics depends on the characteristic link length of the embedded system. For the Euclidean lattice in particular, hysteresis and switching only occur in an extremely narrow region of the parameter space compared to random networks. We develop a unifying theory which links the dynamics of our model to contact processes. Our unifying framework may help to better understand controllability in spatially embedded and random networks where spontaneous recovery of components can mitigate spontaneous failure and damage spread in dynamical networks.
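
    The three component processes named in the abstract (internal failure, neighbor-induced external failure, spontaneous recovery) can be sketched as a synchronous simulation on an arbitrary network. The probabilities, the threshold m, and the update scheme below are illustrative assumptions, not the paper's exact parameterization:

```python
import random

def simulate(adj, steps, p_int, p_ext, p_rec, m=1, seed=0):
    # adj: adjacency list; all nodes start active.
    # Per step, each active node fails internally w.p. p_int, or
    # externally w.p. p_ext if at least m neighbors are failed;
    # each failed node recovers w.p. p_rec. Returns final active fraction.
    rng = random.Random(seed)
    n = len(adj)
    active = [True] * n
    for _ in range(steps):
        nxt = active[:]
        for v in range(n):
            if active[v]:
                failed_nbrs = sum(1 for u in adj[v] if not active[u])
                if rng.random() < p_int or (
                    failed_nbrs >= m and rng.random() < p_ext
                ):
                    nxt[v] = False
            elif rng.random() < p_rec:
                nxt[v] = True
        active = nxt
    return sum(active) / n
```

    Sweeping p_int and p_rec while recording the active fraction is one way to reproduce the hysteresis and state-switching phenomenology the authors describe, though locating their metastable domain requires their specific network ensembles.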

  8. Competition and cooperation among similar representations: toward a unified account of facilitative and inhibitory effects of lexical neighbors.

    PubMed

    Chen, Qi; Mirman, Daniel

    2012-04-01

    One of the core principles of how the mind works is the graded, parallel activation of multiple related or similar representations. Parallel activation of multiple representations has been particularly important in the development of theories and models of language processing, where coactivated representations (neighbors) have been shown to exhibit both facilitative and inhibitory effects on word recognition and production. Researchers generally ascribe these effects to interactive activation and competition, but there is no unified explanation for why the effects are facilitative in some cases and inhibitory in others. We present a series of simulations of a simple domain-general interactive activation and competition model that is broadly consistent with more specialized domain-specific models of lexical processing. The results showed that interactive activation and competition can indeed account for the complex pattern of reversals. Critically, the simulations revealed a core computational principle that determines whether neighbor effects are facilitative or inhibitory: strongly active neighbors exert a net inhibitory effect, and weakly active neighbors exert a net facilitative effect.
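
    A minimal interactive-activation-and-competition update of the kind described can be sketched as follows. This toy version demonstrates only the competition side (strong neighbors suppress a target unit); the facilitation/inhibition reversal reported by the authors depends on their full model's connectivity, and all parameter values here are illustrative:

```python
def iac_run(inputs, w_inhib=0.1, steps=200, rate=0.1,
            a_max=1.0, a_min=-0.2, rest=-0.1, decay=0.1):
    # Interactive activation and competition: each unit receives its
    # external input minus lateral inhibition from the positive
    # activation of every other unit, and its activation drifts toward
    # a_max (for excitatory net input) or a_min (for inhibitory),
    # with decay back to the resting level.
    acts = [rest] * len(inputs)
    for _ in range(steps):
        new = []
        for i, a in enumerate(acts):
            inhib = sum(max(acts[j], 0.0)
                        for j in range(len(acts)) if j != i)
            net = inputs[i] - w_inhib * inhib
            if net > 0:
                da = (a_max - a) * net - decay * (a - rest)
            else:
                da = (a - a_min) * net - decay * (a - rest)
            new.append(a + rate * da)
        acts = new
    return acts
```

    Running a weakly driven target alone versus alongside a strongly driven neighbor shows the net inhibitory effect of strongly active neighbors that the simulations in the article analyze.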

  9. Putting It All Together: A Unified Account of Word Recognition and Reaction-Time Distributions

    ERIC Educational Resources Information Center

    Norris, Dennis

    2009-01-01

    R. Ratcliff, P. Gomez, and G. McKoon (2004) suggested much of what goes on in lexical decision is attributable to decision processes and may not be particularly informative about word recognition. They proposed that lexical decision should be characterized by a decision process, taking the form of a drift-diffusion model (R. Ratcliff, 1978), that…

  10. Review and Implementation Status of Prior Defense Business Board Recommendations

    DTIC Science & Technology

    2007-04-01

    Resource Management • Support unified models for shared services , and be prepared to adjust forward approaches for a Unified Medical Command...models for shared services – including by and between Veterans Affairs and Defense, electronic information exchange, disease treatment and prevention...www.dod.mil/dbb/pdf/DBB- Report-on-the-Military.pdf. • Continue to support unified models for shared services – including by and between Veterans Affairs

  11. LDA-Based Unified Topic Modeling for Similar TV User Grouping and TV Program Recommendation.

    PubMed

    Pyo, Shinjee; Kim, Eunhui; Kim, Munchurl

    2015-08-01

    Social TV is a social media service via TV and social networks through which TV users exchange their experiences about TV programs that they are viewing. For social TV service, two technical aspects are envisioned: grouping of similar TV users to create social TV communities and recommending TV programs based on group and personal interests for personalizing TV. In this paper, we propose a unified topic model based on grouping of similar TV users and recommending TV programs as a social TV service. The proposed unified topic model employs two latent Dirichlet allocation (LDA) models. One is a topic model of TV users, and the other is a topic model of the description words for viewed TV programs. The two LDA models are then integrated via a topic proportion parameter for TV programs, which enforces the grouping of similar TV users and associated description words for watched TV programs at the same time in a unified topic modeling framework. The unified model identifies the semantic relation between TV user groups and TV program description word groups so that more meaningful TV program recommendations can be made. The unified topic model also overcomes an item ramp-up problem such that new TV programs can be reliably recommended to TV users. Furthermore, from the topic model of TV users, TV users with similar tastes can be grouped as topics, which can then be recommended as social TV communities. To verify our proposed method of unified topic-modeling-based TV user grouping and TV program recommendation for social TV services, in our experiments, we used real TV viewing history data and electronic program guide data from a seven-month period collected by a TV poll agency. The experimental results show that the proposed unified topic model yields an average 81.4% precision for 50 topics in TV program recommendation and its performance is an average of 6.5% higher than that of the topic model of TV users only. 
    For TV user prediction with new TV programs, the average prediction precision was 79.6%. We also showed the superiority of our proposed model, in terms of both topic-modeling and recommendation performance, over two related topic models: the polylingual topic model and the bilingual topic model.

  12. Ontological Approach to Military Knowledge Modeling and Management

    DTIC Science & Technology

    2004-03-01

    federated search mechanism has to reformulate user queries (expressed using the ontology) in the query languages of the different sources (e.g. SQL...ontologies as a common terminology – Unified query to perform federated search • Query processing – Ontology mapping to sources reformulate queries

  13. A Unified Multi-scale Model for Cross-Scale Evaluation and Integration of Hydrological and Biogeochemical Processes

    NASA Astrophysics Data System (ADS)

    Liu, C.; Yang, X.; Bailey, V. L.; Bond-Lamberty, B. P.; Hinkle, C.

    2013-12-01

    Mathematical representations of hydrological and biogeochemical processes in soil, plant, aquatic, and atmospheric systems vary with scale. Process-rich models are typically used to describe hydrological and biogeochemical processes at the pore and small scales, while empirical, correlation approaches are often used at the watershed and regional scales. A major challenge for multi-scale modeling is that water flow, biogeochemical processes, and reactive transport are described using different physical laws and/or expressions at the different scales. For example, the flow is governed by the Navier-Stokes equations at the pore-scale in soils, by the Darcy law in soil columns and aquifer, and by the Navier-Stokes equations again in open water bodies (ponds, lake, river) and atmosphere surface layer. This research explores whether the physical laws at the different scales and in different physical domains can be unified to form a unified multi-scale model (UMSM) to systematically investigate the cross-scale, cross-domain behavior of fundamental processes at different scales. This presentation will discuss our research on the concept, mathematical equations, and numerical execution of the UMSM. Three-dimensional, multi-scale hydrological processes at the Disney Wilderness Preservation (DWP) site, Florida will be used as an example for demonstrating the application of the UMSM. In this research, the UMSM was used to simulate hydrological processes in rooting zones at the pore and small scales including water migration in soils under saturated and unsaturated conditions, root-induced hydrological redistribution, and role of rooting zone biogeochemical properties (e.g., root exudates and microbial mucilage) on water storage and wetting/draining. The small scale simulation results were used to estimate effective water retention properties in soil columns that were superimposed on the bulk soil water retention properties at the DWP site. 
The UMSM parameterized from smaller-scale simulations was then used to simulate coupled flow and moisture migration in soils in the saturated and unsaturated zones, surface water and groundwater exchange, and surface water flow in streams and lakes at the DWP site under dynamic precipitation conditions. Laboratory measurements of soil hydrological and biogeochemical properties were used to parameterize the UMSM at the small scales, and field measurements were used to evaluate it.

  14. Predicting crystal growth via a unified kinetic three-dimensional partition model

    NASA Astrophysics Data System (ADS)

    Anderson, Michael W.; Gebbie-Rayet, James T.; Hill, Adam R.; Farida, Nani; Attfield, Martin P.; Cubillas, Pablo; Blatov, Vladislav A.; Proserpio, Davide M.; Akporiaye, Duncan; Arstad, Bjørnar; Gale, Julian D.

    2017-04-01

Understanding and predicting crystal growth is fundamental to the control of functionality in modern materials. Despite investigations for more than one hundred years, it is only recently that the molecular intricacies of these processes have been revealed by scanning probe microscopy. To organize and understand this large amount of new information, new rules for crystal growth need to be developed and tested. However, because of the complexity and variety of different crystal systems, attempts to understand crystal growth in detail have so far relied on developing models that are usually applicable to only one system. Such models cannot be used to achieve the wide scope of understanding that is required to create a unified model across crystal types and crystal structures. Here we describe a general approach to understanding and, in theory, predicting the growth of a wide range of crystal types, including the incorporation of defect structures, by simultaneous molecular-scale simulation of crystal habit and surface topology using a unified kinetic three-dimensional partition model. This entails dividing the structure into ‘natural tiles’ or Voronoi polyhedra that are metastable and, consequently, temporally persistent. As such, these units are then suitable for reconstruction of the crystal via a Monte Carlo algorithm. We demonstrate our approach by predicting the crystal growth of a diverse set of crystal types, including zeolites, metal-organic frameworks, calcite, urea and L-cystine.
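The tile-by-tile Monte Carlo reconstruction described above can be illustrated with a toy sketch (not the authors' actual partition model): a growth-only kinetic Monte Carlo on a 2-D square lattice, where an empty site's attachment probability rises with its number of occupied neighbours, a crude stand-in for a natural tile being more persistent when more of its faces are complete. The lattice size, rate law, and parameter `beta` are illustrative assumptions.

```python
import math
import random

def grow(size=41, steps=20000, beta=1.2, seed=7):
    """Growth-only kinetic Monte Carlo on a 2-D square lattice.

    An empty site attaches with probability 1 - exp(-beta * n), where n
    is its number of occupied neighbours, a crude stand-in for a
    'natural tile' being more stable when more of its faces are complete.
    """
    rng = random.Random(seed)
    occ = [[False] * size for _ in range(size)]
    c = size // 2
    occ[c][c] = True  # seed nucleus
    for _ in range(steps):
        i = rng.randrange(1, size - 1)
        j = rng.randrange(1, size - 1)
        if occ[i][j]:
            continue
        n = occ[i - 1][j] + occ[i + 1][j] + occ[i][j - 1] + occ[i][j + 1]
        if n and rng.random() < 1.0 - math.exp(-beta * n):
            occ[i][j] = True
    return sum(map(sum, occ))
```

Because attachment is favoured at high-coordination sites, the cluster grows compactly from the seed, which is the qualitative behaviour the partition model exploits at molecular scale.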

  15. A Unified Theoretical Framework for Cognitive Sequencing.

    PubMed

    Savalia, Tejas; Shukla, Anuj; Bapi, Raju S

    2016-01-01

The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher-order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking, and hierarchical organization are important aspects of sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habitual. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require the engagement of attentional processes. With repetition, these goal-directed plans become habits, with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops, namely the basal ganglia-frontal cortex and hippocampus-frontal cortex loops, mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of the response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks.

  16. A Unified Theoretical Framework for Cognitive Sequencing

    PubMed Central

    Savalia, Tejas; Shukla, Anuj; Bapi, Raju S.

    2016-01-01

The capacity to sequence information is central to human performance. Sequencing ability forms the foundation stone for higher-order cognition related to language and goal-directed planning. Information related to the order of items, their timing, chunking, and hierarchical organization are important aspects of sequencing. Past research on sequencing has emphasized two distinct and independent dichotomies: implicit vs. explicit and goal-directed vs. habitual. We propose a theoretical framework unifying these two streams. Our proposal relies on the brain's ability to implicitly extract statistical regularities from the stream of stimuli and, with attentional engagement, to organize sequences explicitly and hierarchically. Similarly, sequences that need to be assembled purposively to accomplish a goal require the engagement of attentional processes. With repetition, these goal-directed plans become habits, with concomitant disengagement of attention. Thus, attention and awareness play a crucial role in the implicit-to-explicit transition as well as in how goal-directed plans become automatic habits. Cortico-subcortical loops, namely the basal ganglia-frontal cortex and hippocampus-frontal cortex loops, mediate the transition process. We show how the computational principles of model-free and model-based learning paradigms, along with a pivotal role for attention and awareness, offer a unifying framework for these two dichotomies. Based on this framework, we make testable predictions related to the potential influence of the response-to-stimulus interval (RSI) on developing awareness in implicit learning tasks. PMID:27917146

  17. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output-type measures, no model measures the quality of health care processes comprehensively. Moreover, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model is developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  18. A unified engineering model of the first stroke in downward negative lightning

    NASA Astrophysics Data System (ADS)

    Nag, Amitabh; Rakov, Vladimir A.

    2016-03-01

    Each stroke in a negative cloud-to-ground lightning flash is composed of downward leader and upward return stroke processes, which are usually modeled individually. The first stroke leader is stepped and starts with preliminary breakdown (PB) which is often viewed as a separate process. We present the first unified engineering model for computing the electric field produced by a sequence of PB, stepped leader, and return stroke processes, serving to transport negative charge to ground. We assume that a negatively charged channel extends downward in a stepped fashion during both the PB and leader stages. Each step involves a current wave that propagates upward along the newly formed channel section. Once the leader attaches to ground, an upward propagating return stroke neutralizes the charge deposited along the channel. Model-predicted electric fields are in reasonably good agreement with simultaneous measurements at both near (hundreds of meters, electrostatic field component is dominant) and far (tens of kilometers, radiation field component is dominant) distances from the lightning channel. Relations between the features of computed electric field waveforms and model input parameters are examined. It appears that peak currents associated with PB pulses are similar to return stroke peak currents, and the observed variation of electric radiation field peaks produced by leader steps at different heights above ground is influenced by the ground corona space charge.
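The electrostatic (near-field) component that such engineering models compute can be sketched with a minimal example: the vertical field at ground level produced by a uniformly charged vertical channel above perfectly conducting ground, where image theory doubles the vertical component of each charge element. The channel geometry and line charge density below are hypothetical placeholders, not the paper's model parameters.

```python
def channel_field(z_top, lam, d, z0=100.0, n=2000):
    """Vertical electrostatic field (V/m) at ground level, a horizontal
    distance d from a straight vertical channel carrying a uniform line
    charge density lam (C/m) between heights z0 and z_top, above
    perfectly conducting ground.  By image theory the ground plane
    doubles the vertical component of each element:
        dE_z = 2 * k * lam * z / (z**2 + d**2)**1.5 * dz
    """
    k = 8.9875e9  # Coulomb constant, 1 / (4*pi*eps0)
    dz = (z_top - z0) / n
    total = 0.0
    for i in range(n):  # midpoint rule along the channel
        z = z0 + (i + 0.5) * dz
        total += 2.0 * k * lam * z / (z * z + d * d) ** 1.5 * dz
    return total

# hypothetical 7 km channel carrying -0.5 mC/m, viewed near and far
near = channel_field(z_top=7000.0, lam=-0.5e-3, d=500.0)
far = channel_field(z_top=7000.0, lam=-0.5e-3, d=30000.0)
```

As in the abstract's near/far distinction, the magnitude falls off steeply with distance; a full model would add the induction and radiation terms, which dominate at tens of kilometres.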

  19. Modeling a terminology-based electronic nursing record system: an object-oriented approach.

    PubMed

    Park, Hyeoun-Ae; Cho, InSook; Byeun, NamSoo

    2007-10-01

The aim of this study was to present our perspectives on healthcare information analysis at a conceptual level and the lessons learned from our experience with the development of a terminology-based enterprise electronic nursing record system, which was one of the components of an EMR system at a tertiary teaching hospital in Korea, using an object-oriented system analysis and design concept. To ensure a systematic approach and effective collaboration, the department of nursing constituted a system modeling team comprising a project manager, systems analysts, user representatives, an object-oriented methodology expert, and healthcare informaticists (including the authors). The Rational Unified Process (RUP) and the Unified Modeling Language (UML) were used as the development process and modeling notation, respectively. Following the scenario-based RUP approach, user requirements were formulated into use case sets, and the sequence of activities in each scenario was depicted in an activity diagram. The structure of the system was presented in a class diagram. This approach allowed us to clearly identify the structural and behavioral states and important factors of a terminology-based ENR system (e.g., business concerns and system design concerns) according to the viewpoints of both domain and technical experts.

  20. Unified Model for the Overall Efficiency of Inlets Sampling from Horizontal Aerosol Flows

    NASA Astrophysics Data System (ADS)

    Hangal, Sunil Pralhad

When sampling aerosols from ambient or industrial air environments, the sampled aerosol must be representative of the aerosol in the free stream. The changes that occur during sampling must be assessed quantitatively so that sampling errors can be compensated for. In this study, unified models have been developed for the overall efficiency of tubular sharp-edged inlets sampling from horizontal aerosol flows oriented at 0° to 90° relative to the wind direction in the vertical (pitch) and horizontal (yaw) planes. In the unified model, based on experimental data, the aspiration efficiency is represented by a single equation with different inertial parameters at 0° to 60° and 45° to 90°. The transmission efficiency is separated into two components: one due to gravitational settling in the boundary layer and the other due to impaction. The gravitational settling component is determined by extending a previously developed isoaxial sampling model to nonisoaxial sampling. The impaction component is determined by a new model that quantifies the particle losses caused by wall impaction. The model also quantifies the additional particle losses resulting from turbulent motion in the vena contracta, which forms in the inlet when the inlet velocity is higher than the wind velocity. When sampling aerosols in ambient or industrial environments with an inlet, small changes in wind direction or physical constraints in positioning the inlet in the system necessitate the assessment of sampling efficiency in both the vertical and horizontal planes. The overall sampling efficiency of tubular inlets has been experimentally investigated in yaw and pitch orientations at 0° to 20° from horizontal aerosol flows using a wind tunnel facility. The model for overall sampling efficiency has been extended to include both yaw and pitch sampling based on the new data.
In this model, the difference between yaw and pitch is expressed through the effect of gravity on the impaction process inside the inlet, described by a newly developed gravity-effect angle. At yaw, the gravity-effect angle on the wall impaction process does not change with sampling angle. At pitch, the gravity effect on the impaction process increases particle losses for upward sampling and decreases them for downward sampling. Using the unified model, graphical representations have been developed for sampling at small angles. These can be used in the field to determine the overall sampling efficiency of inlets at several operating conditions, and to identify the operating conditions that result in an acceptable sampling error. Pitch and diameter factors have been introduced for relating the efficiency values over a wide range of conditions to those of a reference condition. The pitch factor determines the overall sampling efficiency at pitch from yaw values, and the diameter factor determines the overall sampling efficiency at different inlet diameters.
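As a point of reference for the aspiration-efficiency component discussed above, the classic Belyaev-Levin correlation for a thin-walled tubular inlet sampling isoaxially can be sketched as follows; this is the well-known precursor correlation, not Hangal's unified model itself.

```python
def aspiration_efficiency(stk, r):
    """Belyaev-Levin (1974) aspiration efficiency for a thin-walled
    tubular inlet sampling isoaxially.

    stk : Stokes number of the particle at the inlet
    r   : velocity ratio U0/U (free-stream over inlet velocity)
    Stated range of validity: roughly 0.18 < stk < 2.03, 0.17 < r < 5.6.
    """
    k = 2.0 + 0.617 / r
    return 1.0 + (r - 1.0) * (1.0 - 1.0 / (1.0 + k * stk))
```

Sub-isokinetic sampling (r > 1, inlet slower than the wind) over-samples inertial particles (efficiency above 1), while super-isokinetic sampling under-samples them; the unified model in this record extends this picture to nonisoaxial orientations and adds the transmission losses.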

  1. Tachyon cosmology with non-vanishing minimum potential: a unified model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Huiquan, E-mail: hqli@ustc.edu.cn

    2012-07-01

We investigate the tachyon condensation process in the effective theory with non-vanishing minimum potential and its implications for cosmology. It is shown that the tachyon condensation on an unstable three-brane described by this modified tachyon field theory leads to lower-dimensional branes (defects) forming within a stable three-brane. Thus, in the cosmological background, we can get well-behaved tachyon matter after tachyon inflation, (partially) avoiding difficulties encountered in the original tachyon cosmological models. This feature also implies that the tachyon inflated and reheated universe is followed by a Chaplygin gas dark matter and dark energy universe. Hence, such an unstable three-brane behaves quite like our universe, reproducing the key features of the whole evolutionary history of the universe and providing a unified description of inflaton, dark matter and dark energy in a very simple single-scalar field model.

  2. Space-Time Processing for Tactical Mobile Ad Hoc Networks

    DTIC Science & Technology

    2008-08-01

vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet... modelling framework of capacity-delay tradeoffs. We have introduced the first unified modelling framework for the computation of fundamental limits... modalities in wireless networks... multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT under

  3. A unified partial likelihood approach for X-chromosome association on time-to-event outcomes.

    PubMed

    Xu, Wei; Hao, Meiling

    2018-02-01

The expression of the X-chromosome undergoes three possible biological processes: X-chromosome inactivation (XCI), escape of X-chromosome inactivation (XCI-E), and skewed X-chromosome inactivation (XCI-S). Although these expressions are included in various predesigned genetic variation chip platforms, the X-chromosome has generally been excluded from the majority of genome-wide association study analyses, most likely due to the lack of a standardized method for handling X-chromosomal genotype data. To analyze X-linked genetic associations for time-to-event outcomes when the actual process is unknown, we propose a unified approach of maximizing the partial likelihood over all of the potential biological processes. The proposed method can be used to infer the true biological process and derive unbiased estimates of the genetic association parameters. A partial likelihood ratio test statistic, proved to be asymptotically chi-square distributed, can be used to assess the X-chromosome genetic association. Furthermore, if the X-chromosome expression pertains to the XCI-S process, we can infer the correct skewed direction and magnitude of inactivation, which can elucidate significant findings regarding the genetic mechanism. A population-level model and a more general subject-level model have been developed to model the XCI-S process. Finite sample performance of this novel method is examined via extensive simulation studies. An application is illustrated with implementation of the method on a cancer genetic study with a survival outcome. © 2017 WILEY PERIODICALS, INC.
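The idea of maximizing the partial likelihood over candidate biological processes can be sketched with a toy Cox model: each XCI hypothesis implies a different genotype coding for females, and the coding with the highest maximized partial likelihood wins. The two codings, the crude grid search on beta, and the synthetic cohort below are illustrative assumptions, not the authors' estimator.

```python
import math
import random

def cox_partial_loglik(beta, times, events, x):
    """Cox partial log-likelihood for a single covariate (no tied times)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for pos, i in enumerate(order):
        if events[i]:
            risk = sum(math.exp(beta * x[j]) for j in order[pos:])
            ll += beta * x[i] - math.log(risk)
    return ll

def best_xci_model(times, events, codings):
    """Maximize the partial likelihood over candidate X-inactivation
    codings (grid search on beta) and return the winning coding."""
    best_name, best_ll = None, -float("inf")
    for name, x in codings.items():
        ll = max(cox_partial_loglik(b / 10.0, times, events, x)
                 for b in range(-30, 31))
        if ll > best_ll:
            best_name, best_ll = name, ll
    return best_name, best_ll

# synthetic cohort: exponential survival with an additive genotype effect
rng = random.Random(11)
n = 120
genos = [rng.choice([0, 1, 2]) for _ in range(n)]
times = [-math.log(rng.random()) / math.exp(0.8 * g) for g in genos]
events = [1] * n
codings = {
    "XCI-E": [float(g) for g in genos],               # escape: additive 0/1/2
    "XCI": [0.0 if g == 0 else 2.0 for g in genos],   # inactivation-style 0/2
}
name, ll = best_xci_model(times, events, codings)
```

The actual method additionally estimates the skewness parameter under XCI-S and uses a partial likelihood ratio test rather than a raw comparison of maxima.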

  4. Unifying error structures in commonly used biotracer mixing models.

    PubMed

    Stock, Brian C; Semmens, Brice X

    2016-10-01

    Mixing models are statistical tools that use biotracers to probabilistically estimate the contribution of multiple sources to a mixture. These biotracers may include contaminants, fatty acids, or stable isotopes, the latter of which are widely used in trophic ecology to estimate the mixed diet of consumers. Bayesian implementations of mixing models using stable isotopes (e.g., MixSIR, SIAR) are regularly used by ecologists for this purpose, but basic questions remain about when each is most appropriate. In this study, we describe the structural differences between common mixing model error formulations in terms of their assumptions about the predation process. We then introduce a new parameterization that unifies these mixing model error structures, as well as implicitly estimates the rate at which consumers sample from source populations (i.e., consumption rate). Using simulations and previously published mixing model datasets, we demonstrate that the new error parameterization outperforms existing models and provides an estimate of consumption. Our results suggest that the error structure introduced here will improve future mixing model estimates of animal diet. © 2016 by the Ecological Society of America.
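A stripped-down version of the "residual error" formulation can be sketched as follows: consumer tracer values are modeled as normal around the p-weighted mixture of two known source means, and the proportion p is estimated by maximum likelihood. This is a two-source, one-tracer toy with made-up numbers, not MixSIR, SIAR, or the authors' unified parameterization.

```python
import math
import random

def estimate_proportion(mix, mu1, mu2, grid=1001):
    """Maximum-likelihood estimate of the diet proportion p under a
    'residual error' model: mix_i ~ Normal(p*mu1 + (1-p)*mu2, sigma),
    with sigma profiled out, by grid search over p in [0, 1]."""
    best_p, best_ll = 0.0, -float("inf")
    for k in range(grid):
        p = k / (grid - 1)
        mean = p * mu1 + (1.0 - p) * mu2
        mse = sum((v - mean) ** 2 for v in mix) / len(mix)
        ll = -0.5 * len(mix) * math.log(mse)  # profile log-likelihood
        if ll > best_ll:
            best_p, best_ll = p, ll
    return best_p

# two isotopic sources and a consumer whose diet is 70% source 1
rng = random.Random(7)
mu1, mu2, true_p = -20.0, -12.0, 0.7
mix = [true_p * mu1 + (1 - true_p) * mu2 + rng.gauss(0.0, 1.0)
       for _ in range(400)]
p_hat = estimate_proportion(mix, mu1, mu2)
```

The structural choices the paper compares amount to where the error enters: around the mixture mean as here (residual error) or around each consumer's own source draws (process error), which is what the consumption-rate parameter unifies.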

  5. Neural Network Processing of Natural Language: II. Towards a Unified Model of Corticostriatal Function in Learning Sentence Comprehension and Non-Linguistic Sequencing

    ERIC Educational Resources Information Center

    Dominey, Peter Ford; Inui, Toshio; Hoen, Michel

    2009-01-01

    A central issue in cognitive neuroscience today concerns how distributed neural networks in the brain that are used in language learning and processing can be involved in non-linguistic cognitive sequence learning. This issue is informed by a wealth of functional neurophysiology studies of sentence comprehension, along with a number of recent…

  6. Toward a Unified Componential Theory of Human Reasoning. Technical Report No. 4.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    The unified theory described in this paper characterizes human reasoning as an information processing system with a hierarchical sequence of components and subtheories that account for performance on successively narrower tasks. Both deductive and inductive theories are subsumed in the unified componential theory, including transitive chain theory…

  7. A Unified Model of Cloud-to-Ground Lightning Stroke

    NASA Astrophysics Data System (ADS)

    Nag, A.; Rakov, V. A.

    2014-12-01

    The first stroke in a cloud-to-ground lightning discharge is thought to follow (or be initiated by) the preliminary breakdown process which often produces a train of relatively large microsecond-scale electric field pulses. This process is poorly understood and rarely modeled. Each lightning stroke is composed of a downward leader process and an upward return-stroke process, which are usually modeled separately. We present a unified engineering model for computing the electric field produced by a sequence of preliminary breakdown, stepped leader, and return stroke processes, serving to transport negative charge to ground. We assume that a negatively-charged channel extends downward in a stepped fashion through the relatively-high-field region between the main negative and lower positive charge centers and then through the relatively-low-field region below the lower positive charge center. A relatively-high-field region is also assumed to exist near ground. The preliminary breakdown pulse train is assumed to be generated when the negatively-charged channel interacts with the lower positive charge region. At each step, an equivalent current source is activated at the lower extremity of the channel, resulting in a step current wave that propagates upward along the channel. The leader deposits net negative charge onto the channel. Once the stepped leader attaches to ground (upward connecting leader is presently neglected), an upward-propagating return stroke is initiated, which neutralizes the charge deposited by the leader along the channel. We examine the effect of various model parameters, such as step length and current propagation speed, on model-predicted electric fields. We also compare the computed fields with pertinent measurements available in the literature.

  8. Simulations of ecosystem hydrological processes using a unified multi-scale model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Liu, Chongxuan; Fang, Yilin

    2015-01-01

This paper presents a unified multi-scale model (UMSM) that we developed to simulate hydrological processes in an ecosystem containing both surface water and groundwater. The UMSM approach modifies the Navier-Stokes equation by adding a Darcy force term to formulate a single set of equations to describe fluid momentum and uses a generalized equation to describe fluid mass balance. The advantage of the approach is that the single set of equations can describe hydrological processes in both surface water and groundwater, where different models are traditionally required to simulate fluid flow. This feature of the UMSM significantly facilitates modelling of hydrological processes in ecosystems, especially at locations where soil/sediment may be frequently inundated and drained in response to precipitation, regional hydrological and climate changes. In this paper, the UMSM was benchmarked using WASH123D, a model commonly used for simulating coupled surface water and groundwater flow. The Disney Wilderness Preserve (DWP) site at Kissimmee, Florida, where active field monitoring and measurements are ongoing to understand hydrological and biogeochemical processes, was then used as an example to illustrate the UMSM modelling approach. The simulation results demonstrated that the DWP site is subject to frequent changes in soil saturation, the geometry and volume of surface water bodies, and groundwater and surface water exchange. All the hydrological phenomena in surface water and groundwater components, including inundation and draining, river bank flow, groundwater table change, soil saturation, hydrological interactions between groundwater and surface water, and the migration of surface water and groundwater interfaces, can be simultaneously simulated using the UMSM.
Overall, the UMSM offers a cross-scale approach that is particularly suitable for simulating coupled surface water and groundwater flow in ecosystems with strong surface water and groundwater interactions.
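The Darcy-force modification of the Navier-Stokes equation can be illustrated in one dimension: a steady momentum balance nu*u'' - (nu/kappa)*u = G solved across an open-water region (large permeability kappa) and a porous region (small kappa), so that a single equation recovers channel-like flow in one half and the Darcy limit u = -G*kappa/nu in the other. The grid resolution and material parameters below are illustrative assumptions, not the UMSM's actual configuration.

```python
import numpy as np

# steady 1-D momentum balance with a Darcy force term:
#   nu * u'' - (nu / kappa) * u = G,   u(0) = u(1) = 0
# kappa is huge in the open-water half and tiny in the porous half,
# so one equation covers both flow regimes.
n = 401
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
nu = 1.0e-6          # kinematic viscosity (m^2/s)
G = -1.0e-3          # (1/rho) * dp/dx, the driving term (m/s^2)
kappa = np.where(x < 0.5, 1.0e6, 1.0e-8)   # open | porous (m^2)

A = np.zeros((n, n))
b = np.full(n, G)
A[0, 0] = A[-1, -1] = 1.0                  # no-slip boundary rows
b[0] = b[-1] = 0.0
for i in range(1, n - 1):                  # central differences
    A[i, i - 1] = A[i, i + 1] = nu / h ** 2
    A[i, i] = -2.0 * nu / h ** 2 - nu / kappa[i]
u = np.linalg.solve(A, b)

u_open = u[: n // 2].max()               # channel-like flow, open half
u_porous = u[3 * n // 4]                 # deep inside the porous half
u_darcy = -G * kappa[3 * n // 4] / nu    # analytic Darcy-limit velocity
```

In the open half the Darcy term is negligible and the solution is the viscous channel profile; in the porous half the Darcy term dominates and the velocity collapses to the Darcy value, which is the single-equation behaviour the UMSM exploits at the surface water/groundwater interface.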

  9. Firming-Up Core: A Collaborative Approach.

    ERIC Educational Resources Information Center

    McInnis, Bernadette

    The Collaborative Probing Model (CPM) is a heuristic approach to writing across the disciplines that stresses discovery, process, and assessment. Faculty input will help the English department design an oral and written communication block that will be unified by a series of interdisciplinary videotaped presentations. CPM also uses flow charting…

  10. The Proletarianisation of Academic Labour in Australia

    ERIC Educational Resources Information Center

    McCarthy, Greg; Song, Xianlin; Jayasuriya, Kanishka

    2017-01-01

    Australian universities over the last 25 years have been unified, internationalised, corporatised and become mass educational providers. This process is replicated globally as a response to rapid mass enrolments and marketisation. In the light of these changes, a corporate and managerial model has been identified, which has been the subject of…

  11. A unified inversion scheme to process multifrequency measurements of various dispersive electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Han, Y.; Misra, S.

    2018-04-01

Multi-frequency measurements of a dispersive electromagnetic (EM) property, such as electrical conductivity, dielectric permittivity, or magnetic permeability, are commonly analyzed for purposes of material characterization. Such an analysis requires inversion of the multi-frequency measurement based on a specific relaxation model, such as the Cole-Cole model or Pelton's model. We develop a unified inversion scheme that can be coupled to various types of relaxation models to independently process multi-frequency measurements of varied EM properties for improved EM-based geomaterial characterization. The proposed inversion scheme is first tested on a few synthetic cases, in which different relaxation models are coupled into the scheme, and then applied to multi-frequency complex conductivity, complex resistivity, complex permittivity, and complex impedance measurements. The method estimates up to seven relaxation-model parameters, exhibiting convergence and accuracy for random initializations of the relaxation-model parameters within up to three orders of magnitude around the true parameter values. The proposed inversion method implements a bounded Levenberg algorithm whose initial damping parameter and iterative adjustment factor are tuned once and then held fixed in all the cases shown in this paper, irrespective of the type of measured EM property and the type of relaxation model. Notably, jump-out and jump-back-in steps are implemented as automated methods in the inversion scheme to prevent the inversion from getting trapped around local minima and to honor the physical bounds of the model parameters. The proposed inversion scheme can easily be used to process various types of EM measurements without major changes to the scheme.
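A minimal sketch of such a scheme, assuming the Cole-Cole relaxation model eps(w) = eps_inf + d_eps / (1 + (i*w*tau)^(1-alpha)) and a hand-rolled Levenberg-Marquardt loop with multiplicative damping adjustment and clipped (bounded) parameter updates. The synthetic frequencies, true parameters, bounds, and initial guess are assumptions for illustration, not the authors' tuned settings, and the jump-out/jump-back-in logic is omitted.

```python
import numpy as np

def cole_cole(p, w):
    """Cole-Cole spectrum; p = (eps_inf, d_eps, log10_tau, alpha)."""
    eps_inf, d_eps, log_tau, alpha = p
    return eps_inf + d_eps / (1.0 + (1j * w * 10.0 ** log_tau) ** (1.0 - alpha))

def residuals(p, w, data):
    diff = cole_cole(p, w) - data
    return np.concatenate([diff.real, diff.imag])

def bounded_lm(p0, w, data, lo, hi, lam=1e-2, factor=10.0, iters=80):
    """Levenberg-Marquardt with multiplicative damping adjustment and
    parameter clipping to honor physical bounds."""
    p = np.clip(np.asarray(p0, dtype=float), lo, hi)
    r = residuals(p, w, data)
    cost = r @ r
    for _ in range(iters):
        J = np.empty((r.size, p.size))     # forward-difference Jacobian
        for k in range(p.size):
            dp = np.zeros_like(p)
            dp[k] = 1e-6 * max(1.0, abs(p[k]))
            J[:, k] = (residuals(p + dp, w, data) - r) / dp[k]
        JtJ = J.T @ J
        step = np.linalg.solve(JtJ + lam * np.diag(np.diag(JtJ)), -J.T @ r)
        p_new = np.clip(p + step, lo, hi)  # honor physical bounds
        r_new = residuals(p_new, w, data)
        if r_new @ r_new < cost:           # accept: relax damping
            p, r, cost, lam = p_new, r_new, r_new @ r_new, lam / factor
        else:                              # reject: increase damping
            lam *= factor
    return p

w = np.logspace(6, 12, 40)                 # angular frequencies (rad/s)
true = np.array([5.0, 75.0, -9.0, 0.1])
data = cole_cole(true, w)                  # noise-free synthetic spectrum
lo = np.array([1.0, 1.0, -12.0, 0.0])
hi = np.array([20.0, 200.0, -6.0, 0.8])
fit = bounded_lm([3.0, 50.0, -8.5, 0.2], w, data, lo, hi)
```

Fitting real and imaginary parts jointly, and working in log10(tau), keeps the problem well scaled; swapping in Pelton's model only requires replacing the forward function.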

  12. Imaging of 2-D multichannel land seismic data using an iterative inversion-migration scheme, Naga Thrust and Fold Belt, Assam, India

    NASA Astrophysics Data System (ADS)

    Jaiswal, Priyank; Dasgupta, Rahul

    2010-05-01

We demonstrate that imaging of 2-D multichannel land seismic data can be effectively accomplished by a combination of reflection traveltime tomography and pre-stack depth migration (PSDM); we refer to the combined process as "the unified imaging". The unified imaging comprises cyclic runs of joint reflection and direct arrival inversion and pre-stack depth migration. From one cycle to another, both the inversion and the migration provide mutual feedback guided by the geological interpretation. The unified imaging is implemented in two broad stages. The first stage is similar to conventional imaging, except that it makes significant use of the velocity model from the inversion of the direct arrivals for both datuming and stacking velocity analysis. The first stage ends with an initial interval velocity model (from the stacking velocity analysis) and a corresponding depth-migrated image. The second stage updates the velocity model and the depth image from the first stage in a cyclic manner; a single cycle comprises a single run of reflection traveltime inversion followed by PSDM. Interfaces used in the inversion are interpretations of the PSDM image from the previous cycle, and the velocity model used in PSDM is from the joint inversion in the current cycle. Additionally, in every cycle, interpreted horizons in the stacked data are inverted as zero-offset reflections to constrain the interfaces; the velocity model is held stationary for the zero-offset inversion. A congruency factor, j, which measures the discrepancy between interfaces from the interpretation of the PSDM image and their corresponding counterparts from the inversion of the zero-offset reflections within assigned uncertainties, is computed in every cycle. A value of unity for j indicates that images from both the inversion and the migration are equivalent; at this point the unified imaging is said to have converged and is halted.
We apply the unified imaging to 2-D multichannel seismic data from the Naga Thrust and Fold Belt (NTFB), India, where several exploratory wells targeting sub-thrust leads in the footwall have failed over the last decade. This failure is speculatively due to incorrect depth images, which are in turn attributed to incorrect velocity models developed using conventional methods. The 2-D seismic data in this study were acquired perpendicular to the trend of the NTFB, where the outcropping hanging wall has a topographic culmination. The acquisition style is split-spread with 30 m shot and receiver spacing and a nominal fold of 90. The data were recorded with a sample interval of 2 ms. Overall, the data have a moderate signal-to-noise ratio and a broad frequency bandwidth of 8-80 Hz. The seismic line contains the failed exploratory well in its central part. The final results from the unified imaging (both the depth image and the corresponding velocity model) suggest the presence of a triangle zone, which was previously undiscovered. Conventional imaging had falsely portrayed the triangle zone as a structural high, which was interpreted as an anticline. As a result, the exploratory well, meant to target the anticline, met with pressure changes that were neither expected nor explained. The unified imaging results not only explain the observations in the well but also reveal new leads in the region. The velocity model from the unified imaging was also found to be adequate for frequency-domain full-waveform imaging of the hanging wall. Results from waveform inversion are further corroborated by the geological interpretation of the exploratory well.

  13. Sediment transport modeling in deposited bed sewers: unified form of May's equations using the particle swarm optimization algorithm.

    PubMed

    Safari, Mir Jafar Sadegh; Shirzad, Akbar; Mohammadi, Mirali

    2017-08-01

May proposed two dimensionless parameters, transport (η) and mobility (F_s), for the self-cleansing design of sewers with a deposited bed condition. The relationships between these two parameters were introduced in conditional form for specific ranges of F_s, which makes them difficult to use as a practical tool for sewer design. In this study, using the same experimental data used by May and employing the particle swarm optimization algorithm, a unified equation based on η and F_s is recommended. The developed model is compared with the original May relationships as well as corresponding models available in the literature. A large amount of data taken from the literature is used for the models' evaluation. The results demonstrate that the model developed in this study is superior to May's and to other existing models in the literature. Because May's dimensionless parameters incorporate more of the variables that govern sediment transport in sewers with a deposited bed condition, it is concluded that the revised May equation proposed in this study is a reliable model for sewer design.
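A minimal particle swarm optimizer of the kind used to fit a single unified relation can be sketched as follows. The power-law form eta = a * F_s^b and all coefficients below are hypothetical stand-ins for illustration, not May's actual parameter groups or the authors' fitted equation.

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, seed=3):
    """Minimal particle swarm optimizer (inertia 0.7, c1 = c2 = 1.5)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# hypothetical unified form eta = a * Fs**b, fitted to synthetic points
fs = [0.5, 0.8, 1.2, 1.8, 2.5, 3.5, 5.0]
eta = [0.32 * f ** 1.8 for f in fs]

def sse(p):
    return sum((p[0] * f ** p[1] - e) ** 2 for f, e in zip(fs, eta))

(a_hat, b_hat), best = pso(sse, [(0.0, 2.0), (0.0, 5.0)])
```

PSO needs only objective evaluations, no gradients, which is why it suits collapsing May's piecewise conditional relationships into a single smooth equation.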

  14. A remote sensing computer-assisted learning tool developed using the unified modeling language

    NASA Astrophysics Data System (ADS)

    Friedrich, J.; Karslioglu, M. O.

The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners, and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application when modern UML tools are used. After introducing the constructed UML model, its implementation is briefly described, followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.
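The image-arithmetic operations such a CAL tool would demonstrate, band differencing and a linear contrast stretch, can be sketched in a few lines. This is a generic NumPy illustration with made-up pixel values, not the tool's actual implementation.

```python
import numpy as np

# two toy 2x2 "bands" of digital numbers
red = np.array([[50.0, 60.0], [200.0, 210.0]])
nir = np.array([[150.0, 180.0], [90.0, 100.0]])

# band arithmetic: normalized difference, NDVI-style
nd = (nir - red) / (nir + red)

def stretch(band):
    """Linear contrast stretch of a band to the 0..255 display range."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo) * 255.0

red_stretched = stretch(red)
```

Pixels where the near-infrared response exceeds the red response yield positive normalized-difference values, which is exactly the kind of pixel-level reasoning an introductory exercise asks students to work through.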

  15. The Markov process admits a consistent steady-state thermodynamic formalism

    NASA Astrophysics Data System (ADS)

    Peng, Liangrong; Zhu, Yi; Hong, Liu

    2018-01-01

    The search for a unified formulation describing various non-equilibrium processes is a central task of modern non-equilibrium thermodynamics. In this paper, a novel steady-state thermodynamic formalism is established for general Markov processes described by the Chapman-Kolmogorov equation. Corresponding steady-state thermodynamic formalisms for the master equation and the Fokker-Planck equation can then be rigorously derived. Concretely, we prove that (1) in the limit of continuous time, the steady-state thermodynamic formalism for the Chapman-Kolmogorov equation fully agrees with that for the master equation; (2) a similar one-to-one correspondence can be established rigorously between the master equation and the Fokker-Planck equation in the limit of large system size; (3) when a Markov process is restricted to one-step jumps, the steady-state thermodynamic formalism for the Fokker-Planck equation with discrete state variables reduces to that for the master equation as the discretization step becomes smaller and smaller. Our analysis indicates that general Markov processes admit a unified and self-consistent non-equilibrium steady-state thermodynamic formalism, regardless of the underlying detailed models.
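The three levels of description linked in this record have standard textbook forms; the notation below is generic, not necessarily the paper's:

```latex
% Chapman-Kolmogorov equation for transition probabilities:
p(x_3, t_3 \mid x_1, t_1) = \int p(x_3, t_3 \mid x_2, t_2)\, p(x_2, t_2 \mid x_1, t_1)\, \mathrm{d}x_2

% Continuous-time limit: the master equation with jump rates W_{nm} (from m to n):
\frac{\partial p_n(t)}{\partial t} = \sum_{m} \left[ W_{nm}\, p_m(t) - W_{mn}\, p_n(t) \right]

% Large-system / small-jump limit: the Fokker-Planck equation
% with drift A(x) and diffusion B(x):
\frac{\partial p(x,t)}{\partial t}
  = -\frac{\partial}{\partial x}\!\left[ A(x)\, p(x,t) \right]
    + \frac{1}{2}\frac{\partial^2}{\partial x^2}\!\left[ B(x)\, p(x,t) \right]
```

The paper's results (1)-(3) assert that the steady-state thermodynamic quantities defined at each of these levels agree under the corresponding limits.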

  16. Anti-gravity with present technology - Implementation and theoretical foundation

    NASA Astrophysics Data System (ADS)

    Alzofon, F. E.

    1981-07-01

    This paper proposes a semi-empirical model of the processes leading to the gravitational field based on accepted features of subatomic processes. Through an analogy with methods of cryogenics, a method of decreasing (or increasing) the gravitational force on a vehicle, using presently-known technology, is suggested. Various ways of utilizing this effect in vehicle propulsion are described. A unified field theory is then detailed which provides a more formal foundation for the gravitational field model first introduced. In distinction to the general theory of relativity, it features physical processes which generate the gravitational field.

  17. An Unified Multiscale Framework for Planar, Surface, and Curve Skeletonization.

    PubMed

    Jalba, Andrei C; Sobiecki, Andre; Telea, Alexandru C

    2016-01-01

    Computing skeletons of 2D shapes, and medial surface and curve skeletons of 3D shapes, is a challenging task. In particular, there is no unified framework that detects all types of skeletons using a single model and also produces a multiscale representation which allows one to progressively simplify, or regularize, all skeleton types. In this paper, we present such a framework. We model skeleton detection and regularization by a conservative mass transport process from a shape's boundary to its surface skeleton, next to its curve skeleton, and finally to the shape center. The resulting density field can be thresholded to obtain a multiscale representation of progressively simplified surface, or curve, skeletons. We detail a numerical implementation of our framework which is demonstrably stable and has high computational efficiency. We demonstrate our framework on several complex 2D and 3D shapes.
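The multiscale property described above means that thresholding the transported-mass density at increasing values yields progressively simpler, nested skeletons. A toy sketch, with an entirely hypothetical density field (the paper's actual density comes from the mass transport process, not from hand-assigned values):

```python
# Toy illustration of multiscale skeleton simplification by thresholding:
# each skeleton point carries the boundary "mass" that collapsed onto it;
# a larger threshold keeps only the more important points, so the
# skeletons are nested across scales.

def threshold_skeleton(density, tau):
    """Return the set of grid points whose collapsed mass is at least tau."""
    return {p for p, mass in density.items() if mass >= tau}

# Hypothetical density field: point -> collapsed boundary mass.
density = {(0, 0): 9.0, (1, 0): 7.5, (2, 0): 3.2, (3, 0): 1.1, (3, 1): 0.4}

coarse = threshold_skeleton(density, 5.0)  # heavily simplified skeleton
fine = threshold_skeleton(density, 1.0)    # detailed skeleton

# Multiscale consistency: the coarse skeleton is contained in the fine one.
assert coarse <= fine
```

The nesting holds for any pair of thresholds, which is what makes a single density field a multiscale representation.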

  18. Revisiting the PLUMBER Experiments from a Process-Diagnostics Perspective

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.; Ruddell, B. L.; Clark, M. P.; Nijssen, B.; Peters-Lidard, C. D.

    2017-12-01

    The PLUMBER benchmarking experiments [1] showed that some of the most sophisticated land models (CABLE, CH-TESSEL, COLA-SSiB, ISBA-SURFEX, JULES, Mosaic, Noah, ORCHIDEE) were outperformed - in simulations of half-hourly surface energy fluxes - by instantaneous, out-of-sample, and globally-stationary regressions with no state memory. One criticism of PLUMBER is that the benchmarking methodology was not derived formally, so that applying a similar methodology with different performance metrics can result in qualitatively different results. Another common criticism of model intercomparison projects in general is that they offer little insight into process-level deficiencies in the models, and therefore are of marginal value for helping to improve the models. We address both of these issues by proposing a formal benchmarking methodology that also yields a formal and quantitative method for process-level diagnostics. We apply this to the PLUMBER experiments to show that (1) the PLUMBER conclusions were generally correct - the models use only a fraction of the information available to them from met forcing data (<50% by our analysis), and (2) all of the land models investigated by PLUMBER have similar process-level error structures, and therefore together do not represent a meaningful sample of structural or epistemic uncertainty. We conclude by suggesting two ways to improve the experimental design of model intercomparison and/or model benchmarking studies like PLUMBER. First, PLUMBER did not report model parameter values, and it is necessary to know these values to separate parameter uncertainty from structural uncertainty. This is a first order requirement if we want to use intercomparison studies to provide feedback to model development. Second, technical documentation of land models is inadequate. Future model intercomparison projects should begin with a collaborative effort by model developers to document specific differences between model structures. 
This could be done in a reproducible way using a unified, process-flexible system like SUMMA [2]. [1] Best, M.J. et al. (2015) 'The plumbing of land surface models: benchmarking model performance', J. Hydrometeor. [2] Clark, M.P. et al. (2015) 'A unified approach for process-based hydrologic modeling: 1. Modeling concept', Water Resour. Res.

  19. Next Generation Community Based Unified Global Modeling System Development and Operational Implementation Strategies at NCEP

    NASA Astrophysics Data System (ADS)

    Tallapragada, V.

    2017-12-01

    NOAA's Next Generation Global Prediction System (NGGPS) has provided the unique opportunity to develop and implement a non-hydrostatic global model based on Geophysical Fluid Dynamics Laboratory (GFDL) Finite Volume Cubed Sphere (FV3) Dynamic Core at National Centers for Environmental Prediction (NCEP), making a leap-step advancement in seamless prediction capabilities across all spatial and temporal scales. Model development efforts are centralized with unified model development in the NOAA Environmental Modeling System (NEMS) infrastructure based on Earth System Modeling Framework (ESMF). A more sophisticated coupling among various earth system components is being enabled within NEMS following National Unified Operational Prediction Capability (NUOPC) standards. The eventual goal of unifying global and regional models will enable operational global models operating at convective resolving scales. Apart from the advanced non-hydrostatic dynamic core and coupling to various earth system components, advanced physics and data assimilation techniques are essential for improved forecast skill. NGGPS is spearheading ambitious physics and data assimilation strategies, concentrating on creation of a Common Community Physics Package (CCPP) and Joint Effort for Data Assimilation Integration (JEDI). Both initiatives are expected to be community developed, with emphasis on research transitioning to operations (R2O). The unified modeling system is being built to support the needs of both operations and research. Different layers of community partners are also established with specific roles/responsibilities for researchers, core development partners, trusted super-users, and operations. Stakeholders are engaged at all stages to help drive the direction of development, resources allocations and prioritization. 
This talk presents the current and future plans of unified model development at NCEP for weather, sub-seasonal, and seasonal climate prediction applications with special emphasis on implementation of NCEP FV3 Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) into operations by 2019.

  20. A unified view of "how allostery works".

    PubMed

    Tsai, Chung-Jung; Nussinov, Ruth

    2014-02-01

    The question of how allostery works was posed almost 50 years ago. Since then it has been the focus of much effort. This is for two reasons: first, the intellectual curiosity of basic science and the desire to understand fundamental phenomena, and second, its vast practical importance. Allostery is at play in all processes in the living cell, and increasingly in drug discovery. Many models have been successfully formulated, and are able to describe allostery even in the absence of a detailed structural mechanism. However, conceptual schemes designed to qualitatively explain allosteric mechanisms usually lack a quantitative mathematical model, and are unable to link its thermodynamic and structural foundations. This hampers insight into oncogenic mutations in cancer progression and biased agonists' actions. Here, we describe how allostery works from three different standpoints: thermodynamics, free energy landscape of population shift, and structure; all with exactly the same allosteric descriptors. This results in a unified view which not only clarifies the elusive allosteric mechanism but also provides structural grasp of agonist-mediated signaling pathways, and guides allosteric drug discovery. Of note, the unified view reasons that allosteric coupling (or communication) does not determine the allosteric efficacy; however, a communication channel is what makes potential binding sites allosteric.

  1. Dirac relaxation of the Israel junction conditions: Unified Randall-Sundrum brane theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, Aharon; Gurwich, Ilya

    2006-08-15

    Following Dirac's brane variation prescription, the brane must not be deformed during the variation process, or else the linearity of the variation may be lost. Alternatively, the variation of the brane is done, in a special Dirac frame, by varying the bulk coordinate system itself. Imposing appropriate Dirac-style boundary conditions on the constrained 'sandwiched' gravitational action, we show how the Israel junction conditions get relaxed, but remarkably, all solutions of the original Israel equations are still respected. The Israel junction conditions are traded, in the Z_2-symmetric case, for a generalized Regge-Teitelboim type equation (plus a local conservation law), and in the generic Z_2-asymmetric case, for a pair of coupled Regge-Teitelboim equations. The Randall-Sundrum model and its derivatives, such as the Dvali-Gabadadze-Porrati and the Collins-Holdom models, get generalized accordingly. Furthermore, Randall-Sundrum and Regge-Teitelboim brane theories now appear to be two different faces of one and the same unified brane theory. Within the framework of unified brane cosmology, we examine the dark matter/energy interpretation of the effective energy/momentum deviations from general relativity.

  2. A unified approach to the analysis and design of elasto-plastic structures with mechanical contact

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Olhoff, Niels; Taylor, John E.

    1990-01-01

    With structural design in mind, a new unified variational model has been developed which represents the mechanics of deformation elasto-plasticity with unilateral contact conditions. For a design problem formulated as maximization of the load carrying capacity of a structure under certain constraints, the unified model allows for a simultaneous analysis and design synthesis for a whole range of mechanical behavior.

  3. A unified account of perceptual layering and surface appearance in terms of gamut relativity.

    PubMed

    Vladusich, Tony; McDonnell, Mark D

    2014-01-01

    When we look at the world--or a graphical depiction of the world--we perceive surface materials (e.g. a ceramic black and white checkerboard) independently of variations in illumination (e.g. shading or shadow) and atmospheric media (e.g. clouds or smoke). Such percepts are partly based on the way physical surfaces and media reflect and transmit light and partly on the way the human visual system processes the complex patterns of light reaching the eye. One way to understand how these percepts arise is to assume that the visual system parses patterns of light into layered perceptual representations of surfaces, illumination and atmospheric media, one seen through another. Despite a great deal of previous experimental and modelling work on layered representation, however, a unified computational model of key perceptual demonstrations is still lacking. Here we present the first general computational model of perceptual layering and surface appearance--based on a broader theoretical framework called gamut relativity--that is consistent with these demonstrations. The model (a) qualitatively explains striking effects of perceptual transparency, figure-ground separation and lightness, (b) quantitatively accounts for the role of stimulus- and task-driven constraints on perceptual matching performance, and (c) unifies two prominent theoretical frameworks for understanding surface appearance. The model thereby provides novel insights into the remarkable capacity of the human visual system to represent and identify surface materials, illumination and atmospheric media, which can be exploited in computer graphics applications.

  5. Unified Plant Growth Model (UPGM). 1. Background, objectives, and vision.

    USDA-ARS?s Scientific Manuscript database

    Since the development of the Environmental Policy Integrated Climate (EPIC) model in 1988, the EPIC-based plant growth code has been incorporated and modified into many agro-ecosystem models. The goals of the Unified Plant Growth Model (UPGM) project are: 1) integrating into one platform the enhance...

  6. Catastrophe Theory: A Unified Model for Educational Change.

    ERIC Educational Resources Information Center

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  7. Unified approach for extrapolation and bridging of adult information in early-phase dose-finding paediatric studies.

    PubMed

    Petit, Caroline; Samson, Adeline; Morita, Satoshi; Ursino, Moreno; Guedj, Jérémie; Jullien, Vincent; Comets, Emmanuelle; Zohar, Sarah

    2018-06-01

    The number of trials conducted and the number of patients per trial are typically small in paediatric clinical studies. This is due to ethical constraints and the complexity of the medical process for treating children. While incorporating prior knowledge from adults may be extremely valuable, this must be done carefully. In this paper, we propose a unified method for designing and analysing dose-finding trials in paediatrics while bridging information from adults. The dose range is calculated under three extrapolation options, linear, allometry and maturation adjustment, using adult pharmacokinetic data. To do this, it is assumed that target exposures are the same in both populations. The working model and prior distribution parameters of the dose-toxicity and dose-efficacy relationships are obtained using early-phase adult toxicity and efficacy data at several dose levels. Priors are integrated into the dose-finding process through Bayesian model selection or adaptive priors. This calibrates the model to adjust for misspecification if the adult and paediatric data are very different. We performed a simulation study which indicates that incorporating prior adult information in this way may improve dose selection in children.
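The three extrapolation options named in the abstract can be illustrated with standard scaling rules. This is a hedged sketch: the 0.75 allometric exponent is the conventional choice in pharmacokinetics, and the Hill-type maturation factor below (with illustrative `tm50` and `hill` values) is a generic placeholder, not the paper's actual maturation model.

```python
def linear_dose(adult_dose, weight_kg, adult_weight_kg=70.0):
    """Linear (per-kg) extrapolation of an adult dose to a child."""
    return adult_dose * weight_kg / adult_weight_kg

def allometric_dose(adult_dose, weight_kg, adult_weight_kg=70.0, exponent=0.75):
    """Allometric scaling with the conventional 3/4-power exponent."""
    return adult_dose * (weight_kg / adult_weight_kg) ** exponent

def maturation_dose(adult_dose, weight_kg, age_years,
                    adult_weight_kg=70.0, tm50=0.5, hill=3.0):
    """Allometric scaling times a hypothetical Hill-type maturation factor
    (clearance maturing with age); tm50 and hill are illustrative values."""
    mf = age_years ** hill / (tm50 ** hill + age_years ** hill)
    return allometric_dose(adult_dose, weight_kg, adult_weight_kg) * mf

# A 20 kg, 6-year-old child and a 100 mg adult dose:
lin = linear_dose(100.0, 20.0)       # per-kg scaling, ~28.6 mg
allo = allometric_dose(100.0, 20.0)  # allometric scaling, larger than linear
mat = maturation_dose(100.0, 20.0, 6.0)
```

For a child lighter than the reference adult, allometric scaling always gives a higher dose than linear per-kg scaling, and a maturation adjustment can only reduce the allometric dose; the paper uses such extrapolated doses to build priors for the Bayesian dose-finding design.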

  8. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  9. Generic-distributed framework for cloud services marketplace based on unified ontology.

    PubMed

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic, distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant-and-recessive-attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant-and-recessive-attributes approach reduced the execution time by 57% but showed a lower value for recall.
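The abstract does not specify the matching algorithms in detail, so the following is only a minimal sketch of ontology-based service matching: services and requests are represented as sets of unified-ontology terms and ranked by set-overlap (Jaccard) similarity. All service names and terms are hypothetical.

```python
def jaccard(a, b):
    """Set-overlap similarity between two bags of ontology terms."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_services(request_terms, services):
    """Rank advertised services by similarity to the consumer request."""
    scored = [(jaccard(request_terms, terms), name)
              for name, terms in services.items()]
    return [name for score, name in sorted(scored, reverse=True)]

# Hypothetical service descriptions annotated with unified-ontology terms.
services = {
    "provider_a_vm": {"compute", "vm", "linux", "ssd"},
    "provider_b_storage": {"storage", "object", "replicated"},
    "provider_c_vm": {"compute", "vm", "windows"},
}
ranking = rank_services({"compute", "vm", "ssd"}, services)
```

Ranking by a shared ontology is what lets the marketplace return a small, relevant result set where a general-purpose search engine cannot; the paper's semantic similarity measure is ontology-aware rather than plain set overlap.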

  10. Complex Networks in Psychological Models

    NASA Astrophysics Data System (ADS)

    Wedemann, R. S.; Carvalho, L. S. A. V. D.; Donangelo, R.

    We develop schematic, self-organizing, neural-network models to describe mechanisms associated with mental processes, by a neurocomputational substrate. These models are examples of real world complex networks with interesting general topological structures. Considering dopaminergic signal-to-noise neuronal modulation in the central nervous system, we propose neural network models to explain development of cortical map structure and dynamics of memory access, and unify different mental processes into a single neurocomputational substrate. Based on our neural network models, neurotic behavior may be understood as an associative memory process in the brain, and the linguistic, symbolic associative process involved in psychoanalytic working-through can be mapped onto a corresponding process of reconfiguration of the neural network. The models are illustrated through computer simulations, where we varied dopaminergic modulation and observed the self-organizing emergent patterns at the resulting semantic map, interpreting them as different manifestations of mental functioning, from psychotic through to normal and neurotic behavior, and creativity.

  11. Climbing the ladder: capability maturity model integration level 3

    NASA Astrophysics Data System (ADS)

    Day, Bryce; Lutteroth, Christof

    2011-02-01

    This article details the attempt to form a complete workflow model for an information and communication technologies (ICT) company in order to achieve a capability maturity model integration (CMMI) maturity rating of 3. During this project, business processes across the company's core and auxiliary sectors were documented and extended using modern enterprise modelling tools and The Open Group Architecture Framework (TOGAF) methodology. Different challenges were encountered with regard to process customisation and tool support for enterprise modelling. In particular, there were problems with the reuse of process models, the integration of different project management methodologies and the integration of the Rational Unified Process development process framework that had to be solved. We report on these challenges and the perceived effects of the project on the company. Finally, we point out research directions that could help to improve the situation in the future.

  12. A feedback model of visual attention.

    PubMed

    Spratling, M W; Johnson, M H

    2004-03-01

    Feedback connections are a prominent feature of cortical anatomy and are likely to have a significant functional role in neural information processing. We present a neural network model of cortical feedback that successfully simulates neurophysiological data associated with attention. In this domain, our model can be considered a more detailed, and biologically plausible, implementation of the biased competition model of attention. However, our model is more general as it can also explain a variety of other top-down processes in vision, such as figure/ground segmentation and contextual cueing. This model thus suggests that a common mechanism, involving cortical feedback pathways, is responsible for a range of phenomena and provides a unified account of currently disparate areas of research.
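The biased competition idea described above can be caricatured in a few lines: two units receive bottom-up input and suppress each other, and a top-down (feedback) bias added to one unit decides the winner. This is a generic sketch of the principle, not the authors' network; all parameter values are illustrative.

```python
def biased_competition(inputs, bias, steps=200, inhibition=0.8, gain=0.2):
    """Two rectified units suppress each other; a top-down attentional
    bias added to one unit's drive resolves the competition."""
    act = [0.0, 0.0]
    for _ in range(steps):
        new = []
        for i in (0, 1):
            drive = inputs[i] + bias[i] - inhibition * act[1 - i]
            new.append(max(0.0, act[i] + gain * (drive - act[i])))  # leaky update
        act = new
    return act

# Equal bottom-up inputs; attention (bias) on unit 0 resolves the competition,
# while without bias the units remain tied.
act_att = biased_competition(inputs=[1.0, 1.0], bias=[0.3, 0.0])
act_eq = biased_competition(inputs=[1.0, 1.0], bias=[0.0, 0.0])
```

With the bias, unit 0 suppresses unit 1 almost completely; without it, the symmetric network settles with both units equally active, mirroring the neurophysiological finding that attention biases, rather than creates, competitive interactions.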

  13. Towards a Global Unified Model of Europa's Tenuous Atmosphere

    NASA Astrophysics Data System (ADS)

    Plainaki, Christina; Cassidy, Tim A.; Shematovich, Valery I.; Milillo, Anna; Wurz, Peter; Vorburger, Audrey; Roth, Lorenz; Galli, André; Rubin, Martin; Blöcker, Aljona; Brandt, Pontus C.; Crary, Frank; Dandouras, Iannis; Jia, Xianzhe; Grassi, Davide; Hartogh, Paul; Lucchetti, Alice; McGrath, Melissa; Mangano, Valeria; Mura, Alessandro; Orsini, Stefano; Paranicas, Chris; Radioti, Aikaterini; Retherford, Kurt D.; Saur, Joachim; Teolis, Ben

    2018-02-01

    Despite the numerous modeling efforts of the past, our knowledge of the radiation-induced physical and chemical processes in Europa's tenuous atmosphere and of the exchange of material between the moon's surface and Jupiter's magnetosphere remains limited. In the absence of an adequate number of in situ observations, the existence of a wide variety of models based on different scenarios and considerations has resulted in a fragmentary understanding of the interactions of the magnetospheric ion population with both the moon's icy surface and its neutral gas envelope. Models show large discrepancies in the source and loss rates of the different constituents as well as in the determination of the spatial distribution of the atmosphere and its variation with time. The existence of several models based on very different approaches highlights the need for a detailed comparison among them, with the final goal of developing a unified model of Europa's tenuous atmosphere. The availability of such a model to the science community could be of particular interest in view of the planning of future mission observations (e.g., ESA's JUpiter ICy moons Explorer (JUICE) mission and NASA's Europa Clipper mission). We review the existing models of Europa's tenuous atmosphere and discuss each of their derived characteristics of the neutral environment. We also discuss discrepancies among different models and the assumptions about the plasma environment in the vicinity of Europa. A summary of the existing observations of both the neutral and the plasma environments at Europa is also presented. The characteristics of a global unified model of the tenuous atmosphere are then discussed. Finally, we identify needed future experimental work in laboratories and propose some suitable observation strategies for upcoming missions.

  14. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    PubMed

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature, and no generic approaches have been published for how to link heterogeneous health data. We conducted a literature review, followed by a consensus process, to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis method, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.

  15. Robust Brain-Machine Interface Design Using Optimal Feedback Control Modeling and Adaptive Point Process Filtering

    PubMed Central

    Carmena, Jose M.

    2016-01-01

    Much progress has been made in brain-machine interfaces (BMI) using decoders such as Kalman filters and finding their parameters with closed-loop decoder adaptation (CLDA). However, current decoders do not model the spikes directly, and hence may limit the processing time-scale of BMI control and adaptation. Moreover, while specialized CLDA techniques for intention estimation and assisted training exist, a unified and systematic CLDA framework that generalizes across different setups is lacking. Here we develop a novel closed-loop BMI training architecture that allows for processing, control, and adaptation using spike events, enables robust control and extends to various tasks. Moreover, we develop a unified control-theoretic CLDA framework within which intention estimation, assisted training, and adaptation are performed. The architecture incorporates an infinite-horizon optimal feedback-control (OFC) model of the brain’s behavior in closed-loop BMI control, and a point process model of spikes. The OFC model infers the user’s motor intention during CLDA—a process termed intention estimation. OFC is also used to design an autonomous and dynamic assisted training technique. The point process model allows for neural processing, control and decoder adaptation with every spike event and at a faster time-scale than current decoders; it also enables dynamic spike-event-based parameter adaptation unlike current CLDA methods that use batch-based adaptation on much slower adaptation time-scales. We conducted closed-loop experiments in a non-human primate over tens of days to dissociate the effects of these novel CLDA components. The OFC intention estimation improved BMI performance compared with current intention estimation techniques. OFC assisted training allowed the subject to consistently achieve proficient control. 
Spike-event-based adaptation resulted in faster and more consistent performance convergence compared with batch-based methods, and was robust to parameter initialization. Finally, the architecture extended control to tasks beyond those used for CLDA training. These results have significant implications towards the development of clinically-viable neuroprosthetics. PMID:27035820
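The spike-event-based processing described above rests on point process filtering, whose single-neuron Gaussian-approximation update has a standard form. The sketch below is that generic update for a random-walk state and a log-linear tuning curve λ = exp(α + βx); the parameter values are illustrative and this is not the paper's full multi-neuron decoder.

```python
import math

def pp_filter_step(x_prev, var_prev, spike, alpha, beta, dt, q):
    """One predict/update step of a point-process filter (Gaussian approx.)
    for a random-walk state x and a neuron with rate exp(alpha + beta*x)."""
    # Predict: random-walk state model adds process noise q.
    x_pred, var_pred = x_prev, var_prev + q
    lam = math.exp(alpha + beta * x_pred)  # conditional intensity (spikes/s)
    # Update: posterior precision grows with the expected spike count.
    var_post = 1.0 / (1.0 / var_pred + beta ** 2 * lam * dt)
    # Innovation: observed minus expected spikes in this time bin.
    x_post = x_pred + var_post * beta * (spike - lam * dt)
    return x_post, var_post

# A spike from a positively tuned neuron pulls the estimate upward,
# and silence pulls it slightly downward.
x_up, v_up = pp_filter_step(0.0, 1.0, spike=1, alpha=-2.0, beta=1.0, dt=0.01, q=0.01)
x_dn, v_dn = pp_filter_step(0.0, 1.0, spike=0, alpha=-2.0, beta=1.0, dt=0.01, q=0.01)
```

Because the update fires at every spike event and every time bin, the estimate (and, in the CLDA setting, the decoder parameters) can adapt at a much finer time-scale than batch-based Kalman-filter pipelines, which is the architectural point the abstract makes.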

  16. An integrated approach for high spatial resolution mapping of water and carbon fluxes using multi-sensor data

    USDA-ARS?s Scientific Manuscript database

    In the last few years, modeling of surface processes, such as water and carbon balances, vegetation growth and energy budgets, has focused on integrated approaches that combine aspects of hydrology, biology and meteorology into unified analyses. In this context, remotely sensed data often have a cor...

  17. The Dialectical Development of "Storytelling" Learning Organizations: A Case Study of a Public Research University

    ERIC Educational Resources Information Center

    Hillon, Yue Cai; Boje, David M.

    2017-01-01

    Purpose: Calls for dialectical learning process model development in learning organizations have largely gone unheeded, thereby limiting conceptual understanding and application in the field. This paper aims to unify learning organization theory with a new understanding of Hegelian dialectics to trace the development of the storytelling learning…

  18. An Integrated Approach for Preservice Teachers' Acceptance and Use of Technology: UTAUT-PST Scale

    ERIC Educational Resources Information Center

    Kabakçi-Yurdakul, Isil; Ursavas, Ömer Faruk; Becit-Isçitürk, Gökçe

    2014-01-01

    Problem Statement: In educational systems, teachers and preservice teachers are the keys to the effective use of technology in the teaching and learning processes. Predicting teachers' technology acceptance and use remains an important issue. Models and theories have been developed to explain and predict technology acceptance. The Unified Theory…

  19. Failing to Learn: Towards a Unified Design Approach for Failure-Based Learning

    ERIC Educational Resources Information Center

    Tawfik, Andrew A.; Rong, Hui; Choi, Ikseon

    2015-01-01

    To date, many instructional systems are designed to support learners as they progress through a problem-solving task. Often these systems are designed in accordance with instructional design models that progress the learner efficiently through the problem-solving process. However, theories from various fields have discussed failure as a strategic…

  20. Unifying Different Theories of Learning: Theoretical Framework and Empirical Evidence

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2008-01-01

    The main aim of this research study was to test out a conceptual model encompassing the theoretical frameworks of achievement goals, study processing strategies, effort, and reflective thinking practice. In particular, it was postulated that the causal influences of achievement goals on academic performance are direct and indirect through study…

  1. District-Wide Involvement: The Key to Successful School Improvement.

    ERIC Educational Resources Information Center

    Mundell, Scott; Babich, George

    1989-01-01

    Describes the self-study process used by the Marana Unified School District to meet accreditation requirements with minimal expense, to emphasize curriculum development, and to improve the school. Considers the key feature of the cyclical review model to be the personal involvement of nearly every faculty member in the 10-school district. (DMM)

  2. Unifying Psychology and Experiential Education: Toward an Integrated Understanding of "Why" It Works

    ERIC Educational Resources Information Center

    Houge Mackenzie, Susan; Son, Julie S.; Hollenhorst, Steve

    2014-01-01

    This article examines the significance of psychology to experiential education (EE) and critiques EE models that have developed in isolation from larger psychological theories and developments. Following a review of literature and current issues, select areas of psychology are explored with reference to experiential learning processes. The state…

  3. Using the Maximum Entropy Principle as a Unifying Theory Characterization and Sampling of Multi-Scaling Processes in Hydrometeorology

    DTIC Science & Technology

    2015-08-20

    evapotranspiration (ET) over oceans may be significantly lower than previously thought. The MEP model parameterized turbulent transfer coefficients...fluxes, ocean freshwater fluxes, regional crop yield among others. An on-going study suggests that the global annual evapotranspiration (ET) over...Bras, Jingfeng Wang. A model of evapotranspiration based on the theory of maximum entropy production, Water Resources Research, (03 2011): 0. doi

  4. Rosen's (M,R) system in Unified Modelling Language.

    PubMed

    Zhang, Ling; Williams, Richard A; Gatherer, Derek

    2016-01-01

Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly non-computable on a Turing machine. If (M,R) is truly non-computable, there are serious implications for the modelling of large biological networks in computer software. A body of work has now accumulated addressing Rosen's claim concerning (M,R) by attempting to instantiate it in various software systems. However, a conclusive refutation has remained elusive, principally since none of the attempts to date have unambiguously avoided the critique that they have altered the properties of (M,R) in the coding process, producing merely approximate simulations of (M,R) rather than true computational models. In this paper, we use the Unified Modelling Language (UML), a diagrammatic notation standard, to express (M,R) as a system of objects having attributes, functions and relations. We believe that this instantiates (M,R) in such a way that none of the original properties of the system are corrupted in the process. Crucially, we demonstrate that (M,R) as classically represented in the relational biology literature is implicitly a UML communication diagram. Furthermore, since UML is formally compatible with object-oriented computing languages, instantiation of (M,R) in UML strongly implies its computability in object-oriented coding languages. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. A unified bond theory, probabilistic meso-scale modeling, and experimental validation of deformed steel rebar in normal strength concrete

    NASA Astrophysics Data System (ADS)

    Wu, Chenglin

    Bond between deformed rebar and concrete is affected by rebar deformation pattern, concrete properties, concrete confinement, and rebar-concrete interfacial properties. Two distinct groups of bond models were traditionally developed based on the dominant effects of concrete splitting and near-interface shear-off failures. Their accuracy highly depended upon the test data sets selected in analysis and calibration. In this study, a unified bond model is proposed and developed based on an analogy to the indentation problem around the rib front of deformed rebar. This mechanics-based model can take into account the combined effect of concrete splitting and interface shear-off failures, resulting in average bond strengths for all practical scenarios. To understand the fracture process associated with bond failure, a probabilistic meso-scale model of concrete is proposed and its sensitivity to interface and confinement strengths are investigated. Both the mechanical and finite element models are validated with the available test data sets and are superior to existing models in prediction of average bond strength (< 6% error) and crack spacing (< 6% error). The validated bond model is applied to derive various interrelations among concrete crushing, concrete splitting, interfacial behavior, and the rib spacing-to-height ratio of deformed rebar. It can accurately predict the transition of failure modes from concrete splitting to rebar pullout and predict the effect of rebar surface characteristics as the rib spacing-to-height ratio increases. Based on the unified theory, a global bond model is proposed and developed by introducing bond-slip laws, and validated with testing of concrete beams with spliced reinforcement, achieving a load capacity prediction error of less than 26%. The optimal rebar parameters and concrete cover in structural designs can be derived from this study.

  6. Hybrid processing of laser scanning data

    NASA Astrophysics Data System (ADS)

    Badenko, Vladimir; Zotov, Dmitry; Fedotov, Alexander

    2018-03-01

This article analyzes gaps in the processing of raw laser scanning data and presents the results of bridging those gaps, based on the use of laser scanning data for historic building information modeling. The development of a unified hybrid technology for the processing, storage, access, and visualization of combined laser scanning and photographic data about historical buildings is analyzed. A first application of the technology to the historical building of St. Petersburg Polytechnic University shows the reliability of the proposed approaches.

  7. Research on key technologies of data processing in internet of things

    NASA Astrophysics Data System (ADS)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

The data of the Internet of Things (IoT) is characterized by polymorphism, heterogeneity, large volume, and real-time processing requirements. Traditional structured, static batch processing methods no longer meet the data processing requirements of the IoT. This paper studied a middleware that can integrate heterogeneous IoT data, converting different data formats into a unified format. We designed an IoT data processing model based on the Storm stream computing architecture and integrated existing Internet security technology to build a security system for IoT data processing, which provides a reference for the efficient transmission and processing of IoT data.
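The format-unification step described in this abstract can be sketched as a pair of adapters that map heterogeneous raw records onto one schema. All field names and record shapes below are hypothetical, chosen only to illustrate the pattern:

```python
def from_json_record(rec: dict) -> dict:
    """Map a JSON-style sensor reading onto the unified schema."""
    return {"sensor_id": rec["id"], "ts": rec["timestamp"],
            "value": float(rec["reading"])}

def from_csv_record(line: str) -> dict:
    """Map a CSV-style reading 'id,timestamp,value' onto the schema."""
    sensor_id, ts, value = line.strip().split(",")
    return {"sensor_id": sensor_id, "ts": int(ts), "value": float(value)}

def unify(records):
    """Dispatch each raw record to the matching adapter."""
    out = []
    for rec in records:
        if isinstance(rec, dict):
            out.append(from_json_record(rec))
        else:
            out.append(from_csv_record(rec))
    return out

unified = unify([{"id": "t1", "timestamp": 10, "reading": "21.5"},
                 "t2,11,22.0"])
```

In a streaming setting, such adapters would run inside the ingestion stage (e.g., a Storm bolt) so that downstream processing sees only the unified format.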

  8. Modeling microbial products in activated sludge under feast-famine conditions.

    PubMed

    Ni, Bing-Jie; Fang, Fang; Rittmann, Bruce E; Yu, Han-Qing

    2009-04-01

    We develop an expanded unified model that integrates production and consumption of internal storage products (X(STO)) into a unified model for extracellular polymeric substances (EPS), soluble microbial products (SMP), and active and inert biomass in activated sludge. We also conducted independent experiments to find needed parameter values and to test the ability of the expanded unified model to describe all the microbial products, along with original substrate and oxygen uptake. The model simulations match all experimental measurements and provide insights into the dynamics of soluble and solid components in activated sludge exposed to dynamic feast-and-famine conditions in two batch experiments and in one cycle of a sequencing batch reactor. In particular, the model illustrates how X(STO) cycles up and down rapidly during feast and famine periods, while EPS and biomass components are relatively stable despite feast and famine. The agreement between model outputs and experimental EPS, SMP, and X(STO) data from distinctly different experiments supports that the expanded unified model properly captures the relationships among the forms of microbial products.
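The feast-famine cycling of the storage pool X(STO) described above can be illustrated with a deliberately simplified one-pool sketch (the actual expanded unified model couples many more components; the rate constants here are illustrative, not calibrated values):

```python
def simulate_storage(substrate, k_sto=0.5, k_deg=0.2, dt=0.1):
    """Toy feast-famine dynamics for an internal storage pool X_STO:
        dX_STO/dt = k_sto * S - k_deg * X_STO
    Storage forms while external substrate S is present (feast) and is
    consumed first-order once S is exhausted (famine)."""
    x_sto, trace = 0.0, []
    for s in substrate:
        x_sto += dt * (k_sto * s - k_deg * x_sto)  # forward Euler step
        trace.append(x_sto)
    return trace

# One feast period (substrate present) followed by one famine period.
trace = simulate_storage([1.0] * 50 + [0.0] * 50)
```

Even this caricature reproduces the qualitative behavior the model simulations show: X_STO ramps up during feast and decays during famine, while slower pools (EPS, biomass) would remain comparatively stable.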

  9. Hard X-ray tests of the unified model for an ultraviolet-detected sample of Seyfert 2 galaxies

    NASA Technical Reports Server (NTRS)

Mulchaey, John S.; Mushotzky, Richard F.; Weaver, Kimberly A.

    1992-01-01

    An ultraviolet-detected sample of Seyfert 2 galaxies shows heavy photoelectric absorption in the hard X-ray band. The presence of UV emission combined with hard X-ray absorption argues strongly for a special geometry which must have the general properties of the Antonucci and Miller unified model. The observations of this sample are consistent with the picture in which the hard X-ray photons are viewed directly through the obscuring matter (molecular torus?) and the optical, UV, and soft X-ray continuum are seen in scattered light. The large range in X-ray column densities implies that there must be a large variation in intrinsic thicknesses of molecular tori, an assumption not found in the simplest of unified models. Furthermore, constraints based on the cosmic X-ray background suggest that some of the underlying assumptions of the unified model are wrong.

  10. Unified Modeling Language (UML) for hospital-based cancer registration processes.

    PubMed

    Shiki, Naomi; Ohno, Yuko; Fujii, Ayumi; Murata, Taizo; Matsumura, Yasushi

    2008-01-01

Hospital-based cancer registry involves complex processing steps that span multiple departments. In addition, management techniques and registration procedures differ depending on each medical facility. Establishing processes for hospital-based cancer registry requires clarifying the specific functions and labor needed. In recent years, the business modeling technique, in which management evaluation is done by clearly spelling out processes and functions, has been applied to business process analysis. However, there are few analytical reports describing the application of these concepts to medical-related work. In this study, we initially sought to model hospital-based cancer registration processes using the Unified Modeling Language (UML), to clarify functions. The object of this study was the cancer registry of Osaka University Hospital. We organized the hospital-based cancer registration processes based on interview and observational surveys, and produced an As-Is model using activity, use-case, and class diagrams. After drafting each UML model, it was fed back to practitioners to check its validity and then improved. We were able to define the workflow for each department using activity diagrams. In addition, by using use-case diagrams we were able to classify each department within the hospital as a system, and thereby specify the core processes and staff responsible for each department. The class diagrams were effective in systematically organizing the information to be used for hospital-based cancer registries. Using UML modeling, hospital-based cancer registration processes were broadly classified into three separate processes, namely, registration tasks, quality control, and filing data. An additional 14 functions were also extracted. Many tasks take place within the hospital-based cancer registry office, but the process of providing information spans multiple departments. Moreover, additional tasks were required compared with a standardized system, because the hospital-based cancer registration system was built on the pre-existing computer system of Osaka University Hospital. Difficulty in utilizing useful information for cancer registration processes was shown to increase the task workload. By using UML, we were able to clarify functions and extract the typical processes of a hospital-based cancer registry. Modeling can provide a basis for process analysis toward establishing efficient hospital-based cancer registration processes in each institute.

  11. A Unified Dynamic Model for Learning, Replay, and Sharp-Wave/Ripples.

    PubMed

    Jahnke, Sven; Timme, Marc; Memmesheimer, Raoul-Martin

    2015-12-09

    Hippocampal activity is fundamental for episodic memory formation and consolidation. During phases of rest and sleep, it exhibits sharp-wave/ripple (SPW/R) complexes, which are short episodes of increased activity with superimposed high-frequency oscillations. Simultaneously, spike sequences reflecting previous behavior, such as traversed trajectories in space, are replayed. Whereas these phenomena are thought to be crucial for the formation and consolidation of episodic memory, their neurophysiological mechanisms are not well understood. Here we present a unified model showing how experience may be stored and thereafter replayed in association with SPW/Rs. We propose that replay and SPW/Rs are tightly interconnected as they mutually generate and support each other. The underlying mechanism is based on the nonlinear dendritic computation attributable to dendritic sodium spikes that have been prominently found in the hippocampal regions CA1 and CA3, where SPW/Rs and replay are also generated. Besides assigning SPW/Rs a crucial role for replay and thus memory processing, the proposed mechanism also explains their characteristic features, such as the oscillation frequency and the overall wave form. The results shed a new light on the dynamical aspects of hippocampal circuit learning. During phases of rest and sleep, the hippocampus, the "memory center" of the brain, generates intermittent patterns of strongly increased overall activity with high-frequency oscillations, the so-called sharp-wave/ripples. We investigate their role in learning and memory processing. They occur together with replay of activity sequences reflecting previous behavior. Developing a unifying computational model, we propose that both phenomena are tightly linked, by mutually generating and supporting each other. The underlying mechanism depends on nonlinear amplification of synchronous inputs that has been prominently found in the hippocampus. 
Besides assigning sharp-wave/ripples a crucial role for replay generation and thus memory processing, the proposed mechanism also explains their characteristic features, such as the oscillation frequency and the overall wave form. Copyright © 2015 the authors 0270-6474/15/3516236-23$15.00/0.

  12. Preliminary Report Regarding State Allocation Board Funding of the Los Angeles Unified School District's Belmont Learning Complex.

    ERIC Educational Resources Information Center

    Armoudian, Maria; Carman, Georgann; Havan, Artineh; Heron, Frank

    A preliminary report of the California Legislature's Joint Legislative Audit Committee presents findings on the construction team selection process for the Los Angeles Unified School District's (LAUSD's) Belmont Learning Complex. Evidence reveals a seriously flawed process that directly conflicted with existing law and practice. The report…

  13. A unified framework for group independent component analysis for multi-subject fMRI data

    PubMed Central

    Guo, Ying; Pagnoni, Giuseppe

    2008-01-01

    Independent component analysis (ICA) is becoming increasingly popular for analyzing functional magnetic resonance imaging (fMRI) data. While ICA has been successfully applied to single-subject analysis, the extension of ICA to group inferences is not straightforward and remains an active topic of research. Current group ICA models, such as the GIFT (Calhoun et al., 2001) and tensor PICA (Beckmann and Smith, 2005), make different assumptions about the underlying structure of the group spatio-temporal processes and are thus estimated using algorithms tailored for the assumed structure, potentially leading to diverging results. To our knowledge, there are currently no methods for assessing the validity of different model structures in real fMRI data and selecting the most appropriate one among various choices. In this paper, we propose a unified framework for estimating and comparing group ICA models with varying spatio-temporal structures. We consider a class of group ICA models that can accommodate different group structures and include existing models, such as the GIFT and tensor PICA, as special cases. We propose a maximum likelihood (ML) approach with a modified Expectation-Maximization (EM) algorithm for the estimation of the proposed class of models. Likelihood ratio tests (LRT) are presented to compare between different group ICA models. The LRT can be used to perform model comparison and selection, to assess the goodness-of-fit of a model in a particular data set, and to test group differences in the fMRI signal time courses between subject subgroups. Simulation studies are conducted to evaluate the performance of the proposed method under varying structures of group spatio-temporal processes. We illustrate our group ICA method using data from an fMRI study that investigates changes in neural processing associated with the regular practice of Zen meditation. PMID:18650105
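The model-comparison machinery used above (nested models compared via a likelihood ratio test) can be illustrated in one dimension. This toy example is unrelated to the actual ICA likelihoods; it compares a fixed-mean Gaussian (H0: mu = 0) against a free-mean one, with variance treated as known for simplicity:

```python
import math

def gauss_loglik(xs, mu, sigma2=1.0):
    """Gaussian log likelihood with known variance."""
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * sigma2)
            - sum((x - mu) ** 2 for x in xs) / (2 * sigma2))

def lrt_statistic(xs):
    """Likelihood ratio statistic for the nested pair H0: mu = 0 vs.
    H1: mu free. Under H0 the statistic is asymptotically chi-square
    with 1 degree of freedom, so values above ~3.84 reject H0 at the
    5% level."""
    mu_hat = sum(xs) / len(xs)  # MLE of the mean under H1
    return 2.0 * (gauss_loglik(xs, mu_hat) - gauss_loglik(xs, 0.0))
```

The group ICA framework uses the same logic at a much larger scale: fit the restricted and unrestricted spatio-temporal structures by EM, then compare their maximized likelihoods.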

  14. On the possible cycles via the unified perspective of cryocoolers. Part A: The Joule-Thomson cryocooler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maytal, Ben-Zion; Pfotenhauer, John M.

    2014-01-29

Joule-Thomson (JT) cryocoolers possess a self-adjusting effect, which preserves the state of the returning stream from the evaporator as a saturated vapor. The heat load can be entirely absorbed at constant temperature by evaporation, even for different sized heat exchangers. It is not possible for the steady state flow resulting from a gradual cool down to penetrate 'deeper' into the two-phase dome and produce a two-phase return flow, even with a heat exchanger of unlimited size. Such behavior was implicitly taken for granted in the literature but never clearly stated nor questioned, and therefore never systematically proven. The discussion provided below provides such a proof via the unified model of cryocoolers. This model portrays all cryocoolers as magnifiers of their respective elementary temperature-reducing mechanism through the process of 'interchanging'.

  15. The anchoring bias reflects rational use of cognitive resources.

    PubMed

    Lieder, Falk; Griffiths, Thomas L; M Huys, Quentin J; Goodman, Noah D

    2018-02-01

    Cognitive biases, such as the anchoring bias, pose a serious challenge to rational accounts of human cognition. We investigate whether rational theories can meet this challenge by taking into account the mind's bounded cognitive resources. We asked what reasoning under uncertainty would look like if people made rational use of their finite time and limited cognitive resources. To answer this question, we applied a mathematical theory of bounded rationality to the problem of numerical estimation. Our analysis led to a rational process model that can be interpreted in terms of anchoring-and-adjustment. This model provided a unifying explanation for ten anchoring phenomena including the differential effect of accuracy motivation on the bias towards provided versus self-generated anchors. Our results illustrate the potential of resource-rational analysis to provide formal theories that can unify a wide range of empirical results and reconcile the impressive capacities of the human mind with its apparently irrational cognitive biases.
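The anchoring-and-adjustment interpretation of the rational process model can be sketched as a resource-bounded estimator: each adjustment step costs time, so stopping early leaves a residual bias toward the anchor. The step rule and parameters below are illustrative, not fitted values from the paper:

```python
def anchor_and_adjust(anchor, target, n_steps, step_frac=0.3):
    """Each adjustment step closes a fixed fraction of the remaining
    distance between the current estimate and the correct value.
    Fewer steps (less cognitive resource spent) leaves a larger
    anchoring bias."""
    est = anchor
    for _ in range(n_steps):
        est += step_frac * (target - est)
    return est

# More deliberation shrinks the bias toward the anchor.
few = anchor_and_adjust(anchor=100.0, target=40.0, n_steps=2)
many = anchor_and_adjust(anchor=100.0, target=40.0, n_steps=20)
```

This captures the qualitative prediction that accuracy motivation (buying more adjustment steps) reduces the bias for self-generated anchors while leaving little room for improvement when adjustment is already thorough.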

  16. Synergy of the Developed 6D BIM Framework and Conception of the nD BIM Framework and nD BIM Process Ontology

    ERIC Educational Resources Information Center

    O'Keeffe, Shawn Edward

    2013-01-01

    The author developed a unified nD framework and process ontology for Building Information Modeling (BIM). The research includes a framework developed for 6D BIM, nD BIM, and nD ontology that defines the domain and sub-domain constructs for future nD BIM dimensions. The nD ontology defines the relationships of kinds within any new proposed…

  17. Unified Tri-Services Cognitive Performance Assessment Battery: Review and Methodology

    DTIC Science & Technology

    1987-03-01

sections in this report. The present report provides extensive documentation for each test in the UTC-PAB to aid in the selection and...memory storage (e.g., Wanner and Shiner, 1976) and processing. Previous research (e.g., Perez, 1982) has shown that transitions from one operation to...information processing model. Two antidepressant drugs, amoxapine and amitriptyline, were given to depressed outpatients whose reaction times on the memory

  18. A model for calculating expected performance of the Apollo unified S-band (USB) communication system

    NASA Technical Reports Server (NTRS)

    Schroeder, N. W.

    1971-01-01

    A model for calculating the expected performance of the Apollo unified S-band (USB) communication system is presented. The general organization of the Apollo USB is described. The mathematical model is reviewed and the computer program for implementation of the calculations is included.

  19. Dynamic Cognitive Tracing: Towards Unified Discovery of Student and Cognitive Models

    ERIC Educational Resources Information Center

    Gonzalez-Brenes, Jose P.; Mostow, Jack

    2012-01-01

    This work describes a unified approach to two problems previously addressed separately in Intelligent Tutoring Systems: (i) Cognitive Modeling, which factorizes problem solving steps into the latent set of skills required to perform them; and (ii) Student Modeling, which infers students' learning by observing student performance. The practical…

  20. Revisiting the flocculation kinetics of destabilized asphaltenes.

    PubMed

    Vilas Bôas Fávero, Cláudio; Maqbool, Tabish; Hoepfner, Michael; Haji-Akbari, Nasim; Fogler, H Scott

    2017-06-01

A comprehensive review of the recently published work on asphaltene destabilization and flocculation kinetics is presented. Four different experimental techniques were used to study asphaltenes undergoing the flocculation process in crude oils and model oils. The asphaltenes were destabilized by different n-alkanes, and a geometric population balance with the Smoluchowski collision kernel was used to model the asphaltene aggregation process. Additionally, by postulating a relation between the aggregation collision efficiency and the solubility parameters of the asphaltenes and the solution, a unified model of asphaltene aggregation was developed. When the aggregation model is applied to the experimental data obtained from several different crude oils and model oils, the detection time curves collapse onto a single universal line, indicating that the model successfully captures the underlying physics of the observed process. Copyright © 2016 Elsevier B.V. All rights reserved.
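The Smoluchowski population balance referenced above has a closed form for the simplest case of a constant collision kernel, which is enough to show how a collision efficiency rescales detection times. The sketch below uses that textbook special case with illustrative numbers, not the paper's fitted kernel or data:

```python
def total_number(n0, beta_eff, t):
    """Total particle count under Smoluchowski coagulation with a
    constant kernel: dN/dt = -(beta_eff / 2) * N**2, which integrates
    to N(t) = n0 / (1 + beta_eff * n0 * t / 2)."""
    return n0 / (1.0 + 0.5 * beta_eff * n0 * t)

def detection_time(n0, beta, alpha, n_detect):
    """Time for the number concentration to fall to n_detect, with
    collision efficiency alpha scaling the bare kernel beta. Alpha is
    the quantity the unified model links to the solubility parameters
    of the asphaltenes and the solution."""
    return 2.0 * (n0 / n_detect - 1.0) / (alpha * beta * n0)

t_detect = detection_time(1e6, 1e-6, 0.5, 1e5)
```

Because alpha enters only as a multiplicative rescaling of time, plotting detection data against alpha-scaled time collapses the curves, which is the mechanism behind the "universal single line" result.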

  1. Nonlinear adaptive inverse control via the unified model neural network

    NASA Astrophysics Data System (ADS)

    Jeng, Jin-Tsong; Lee, Tsu-Tian

    1999-03-01

In this paper, we propose a new nonlinear adaptive inverse control scheme via a unified model neural network. In order to overcome nonsystematic design and long training times in nonlinear adaptive inverse control, we propose an approximate transformable technique to obtain a Chebyshev Polynomials Based Unified Model (CPBUM) neural network for feedforward/recurrent neural networks. It turns out that the proposed method requires less training time to obtain an inverse model. Finally, we apply the proposed method to control a magnetic bearing system. The experimental results show that the proposed nonlinear adaptive inverse control architecture provides greater flexibility and better performance in controlling magnetic bearing systems.
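The Chebyshev polynomial basis underlying a CPBUM-style network is generated by a simple recurrence; the usual appeal of such polynomial-basis networks is that the nonlinearity is fixed, so training reduces to fitting linear readout weights. The sketch below shows only the standard feature expansion, not the paper's full architecture:

```python
def chebyshev_features(x, degree):
    """Chebyshev polynomials of the first kind T_0(x)..T_degree(x)
    on [-1, 1], via the recurrence:
        T_{n+1}(x) = 2x * T_n(x) - T_{n-1}(x)
    A polynomial-basis network feeds these features to a trainable
    linear readout layer."""
    feats = [1.0, float(x)]
    for _ in range(max(degree - 1, 0)):
        feats.append(2.0 * x * feats[-1] - feats[-2])
    return feats[:degree + 1]
```

For example, `chebyshev_features(0.5, 3)` expands a single scalar input into the four features T_0..T_3 that a linear layer would then weight.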

  2. The mechanism and design of sequencing batch reactor systems for nutrient removal--the state of the art.

    PubMed

    Artan, N; Wilderer, P; Orhon, D; Morgenroth, E; Ozgür, N

    2001-01-01

    The Sequencing Batch Reactor (SBR) process for carbon and nutrient removal is subject to extensive research, and it is finding a wider application in full-scale installations. Despite the growing popularity, however, a widely accepted approach to process analysis and modeling, a unified design basis, and even a common terminology are still lacking; this situation is now regarded as the major obstacle hindering broader practical application of the SBR. In this paper a rational dimensioning approach is proposed for nutrient removal SBRs based on scientific information on process stoichiometry and modelling, also emphasizing practical constraints in design and operation.

  3. Unified phenology model with Bayesian calibration for several European species in Belgium

    NASA Astrophysics Data System (ADS)

    Fu, Y. S. H.; Demarée, G.; Hamdi, R.; Deckmyn, A.; Deckmyn, G.; Janssens, I. A.

    2009-04-01

Plant phenology is a good bio-indicator for climate change, and this has brought a significant increase of interest. Many kinds of phenology models have been developed to analyze and predict the phenological response to climate change, and those models have been summarized into one kind of unified model, which can be applied to different species and environments. In our study, we selected seven European woody plant species (Betula verrucosa, Quercus robur pedunculata, Fagus sylvatica, Fraxinus excelsior, Symphoricarpus racemosus, Aesculus hippocastanum, Robinia pseudoacacia) occurring in five sites distributed across Belgium. For those sites and tree species, phenological observations such as bud burst were available for the period 1956-2002. We also obtained regionally downscaled climatic data for each of these sites, and combined both data sets to test the unified model. We used a Bayesian approach to generate distributions of model parameters from the observation data. In this poster presentation, we compare parameter distributions between different species and between different sites for individual species. The results of the unified model show good agreement with the observations, except for Fagus sylvatica. The failure to reproduce the bud burst data for Fagus sylvatica suggests that factors not included in the unified model affect the phenology of this species. The parameter distributions show differences among species, as we expected. However, they also differed strongly for the same species among sites. Further work should elucidate the mechanism that explains why model parameters differ among species and sites.
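A common building block of such unified phenology models is the thermal-time (growing-degree-day) forcing rule, sketched below. In a Bayesian calibration, the base temperature and critical forcing sum would receive posterior distributions per species and site; the defaults here are illustrative only:

```python
def budburst_day(daily_mean_temps, t_base=5.0, f_crit=10.0):
    """Thermal-time forcing model: accumulate daily forcing
    max(T - t_base, 0) and predict bud burst on the first day the
    running sum reaches f_crit. Returns the 0-based day index, or
    None if the threshold is never reached."""
    forcing = 0.0
    for day, temp in enumerate(daily_mean_temps):
        forcing += max(temp - t_base, 0.0)
        if forcing >= f_crit:
            return day
    return None

cool = budburst_day([0, 2, 6, 8, 10, 12])   # cool spring series
warm = budburst_day([4, 6, 10, 12, 14, 16])  # warmer spring series
```

The warmer temperature series accumulates forcing faster and therefore predicts earlier bud burst, which is the basic mechanism by which such models translate climate warming into phenological shifts.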

  4. SUMMA and Model Mimicry: Understanding Differences Among Land Models

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.

    2016-12-01

    Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. 
SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.

  5. Modeling the Development of Audiovisual Cue Integration in Speech Perception

    PubMed Central

    Getz, Laura M.; Nordeen, Elke R.; Vrabic, Sarah C.; Toscano, Joseph C.

    2017-01-01

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues. PMID:28335558

  6. Modeling the Development of Audiovisual Cue Integration in Speech Perception.

    PubMed

    Getz, Laura M; Nordeen, Elke R; Vrabic, Sarah C; Toscano, Joseph C

    2017-03-21

    Adult speech perception is generally enhanced when information is provided from multiple modalities. In contrast, infants do not appear to benefit from combining auditory and visual speech information early in development. This is true despite the fact that both modalities are important to speech comprehension even at early stages of language acquisition. How then do listeners learn how to process auditory and visual information as part of a unified signal? In the auditory domain, statistical learning processes provide an excellent mechanism for acquiring phonological categories. Is this also true for the more complex problem of acquiring audiovisual correspondences, which require the learner to integrate information from multiple modalities? In this paper, we present simulations using Gaussian mixture models (GMMs) that learn cue weights and combine cues on the basis of their distributional statistics. First, we simulate the developmental process of acquiring phonological categories from auditory and visual cues, asking whether simple statistical learning approaches are sufficient for learning multi-modal representations. Second, we use this time course information to explain audiovisual speech perception in adult perceivers, including cases where auditory and visual input are mismatched. Overall, we find that domain-general statistical learning techniques allow us to model the developmental trajectory of audiovisual cue integration in speech, and in turn, allow us to better understand the mechanisms that give rise to unified percepts based on multiple cues.

  7. Incubation, Insight, and Creative Problem Solving: A Unified Theory and a Connectionist Model

    ERIC Educational Resources Information Center

    Helie, Sebastien; Sun, Ron

    2010-01-01

    This article proposes a unified framework for understanding creative problem solving, namely, the explicit-implicit interaction theory. This new theory of creative problem solving constitutes an attempt at providing a more unified explanation of relevant phenomena (in part by reinterpreting/integrating various fragmentary existing theories of…

  8. Kinetic Theory and Simulation of Single-Channel Water Transport

    NASA Astrophysics Data System (ADS)

    Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus

    Water translocation between various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in computer simulation of molecular processes at the atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model of trans-channel water translocation has turned out to be a nontrivial task.

  9. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to designing the conceptual information model, the main principles of developing relational databases are provided and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of presenting the results of analyzing users' information needs and the rationale for the use of classifiers.

  10. Creating a library holding group: an approach to large system integration.

    PubMed

    Huffman, Isaac R; Martin, Heather J; Delawska-Elliott, Basia

    2016-10-01

    Faced with resource constraints, many hospital libraries have considered joint operations. This case study describes how Providence Health & Services created a single group to provide library services. Using a holding group model, staff worked to unify more than 6,100 nonlibrary subscriptions and 14 internal library sites. Our library services grew by unifying 2,138 nonlibrary subscriptions and 11 library sites and hiring more library staff. We expanded access to 26,018 more patrons. A model with built-in flexibility allowed successful library expansion. Although challenges remain, this success points to a viable model of unified operations.

  11. Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model

    NASA Astrophysics Data System (ADS)

    Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman

    2016-08-01

    e-Tendering is the electronic processing of tender documents via the internet, allowing tenderers to publish, communicate, access, receive and submit all tender-related information and documentation online. This study aims to design an e-Tendering system using the Rational Unified Process (RUP) approach. RUP provides a disciplined approach to assigning tasks and responsibilities within the software development process. RUP has four phases that can assist researchers in adjusting the requirements of projects with different scopes, problems and sizes. RUP is characterized as a use-case-driven, architecture-centered, iterative and incremental process model. However, the scope of this study covers only the Inception and Elaboration phases as steps to develop the model, performing only three of the nine workflows (business modeling, requirements, and analysis and design). RUP has a strong focus on documents, and the activities in the Inception and Elaboration phases mainly concern the creation of diagrams and the writing of textual descriptions. UML notation and the StarUML software are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in the e-Tendering domain. In addition, this study shows that RUP is one of the best system development methodologies for use as a research methodology in the Software Engineering domain related to the secure design of any observed application. The methodology has been tested in various studies in certain domains, such as simulation-based decision support, security requirement engineering, and business modeling and secure system requirements. 
In conclusion, these studies showed that RUP is a sound research methodology that can be adapted in any Software Engineering (SE) research domain that requires a few artifacts to be generated, such as use case models, misuse case models, activity diagrams, and an initial class diagram derived from a list of requirements identified earlier by the SE researchers.

  12. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST is receiving increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource-model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net-based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Based on the resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. A unified resource representation and RESTful service descriptions are thus proposed for more effective cross-system integration. A case study illustrates the approach, and its desirable features are discussed.
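The Petri-net machinery underlying such a framework can be sketched in miniature (the class and the supply-chain fragment below are invented for exposition; the paper's XML-net additionally attaches XML documents to tokens to carry data flow):

```python
# Minimal place/transition Petri net: places hold tokens; a transition is
# enabled when every input place holds a token, and firing moves tokens from
# input places to output places.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Toy supply-chain fragment: order received -> goods shipped -> invoice settled
net = PetriNet({"order_received": 1})
net.add_transition("ship", ["order_received"], ["goods_shipped"])
net.add_transition("invoice", ["goods_shipped"], ["settled"])
net.fire("ship")
net.fire("invoice")
```

Each transition here would correspond to a RESTful service invocation in the paper's setting, with the marking tracking resource states across systems.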

  13. Modelling tidewater glacier calving: from detailed process models to simple calving laws

    NASA Astrophysics Data System (ADS)

    Benn, Doug; Åström, Jan; Zwinger, Thomas; Todd, Joe; Nick, Faezeh

    2017-04-01

    The simple calving laws currently used in ice sheet models do not adequately reflect the complexity and diversity of calving processes. To be effective, calving laws must be grounded in a sound understanding of how calving actually works. We have developed a new approach to formulating calving laws, using a) the Helsinki Discrete Element Model (HiDEM) to explicitly model fracture and calving processes, and b) the full-Stokes continuum model Elmer/Ice to identify critical stress states associated with HiDEM calving events. A range of observed calving processes emerges spontaneously from HiDEM in response to variations in ice-front buoyancy and the size of subaqueous undercuts, and we show that HiDEM calving events are associated with characteristic stress patterns simulated in Elmer/Ice. Our results open the way to developing calving laws that properly reflect the diversity of calving processes, and provide a framework for a unified theory of the calving process continuum.

  14. GFS-10/10/2007-12Z

    Science.gov Websites

    Mountainous Coasts: A change to the GFS post codes will remove a persistent, spurious high pressure system ENVIRONMENTAL PREDICTION /NCEP/ WILL UPGRADE THE GFS POST PROCESSOR. THE PRIMARY EFFORT BEHIND THIS UPGRADE WILL BE TO UNIFY THE POST PROCESSING CODE FOR THE NORTH AMERICAN MESO SCALE /NAM/ MODEL AND THE GFS INTO

  15. The Ned IIS project - forest ecosystem management

    Treesearch

    W. Potter; D. Nute; J. Wang; F. Maier; Michael Twery; H. Michael Rauscher; P. Knopp; S. Thomasma; M. Dass; H. Uchiyama

    2002-01-01

    For many years we have held to the notion that an Intelligent Information System (IIS) is composed of a unified knowledge base, database, and model base. The main idea behind this notion is the transparent processing of user queries. The system is responsible for "deciding" which information sources to access in order to fulfil a query regardless of whether...

  16. Delay of Gratification and Delay Discounting: A Unifying Feedback Model of Delay-Related Impulsive Behavior

    ERIC Educational Resources Information Center

    Reynolds, Brady; Schiffbauer, Ryan

    2005-01-01

    Delay of Gratification (DG) and Delay Discounting (DD) represent two indices of impulsive behavior often treated as though they represent equivalent or the same underlying processes. However, there are key differences between DG and DD procedures, and between certain research findings with each procedure, that suggest they are not equivalent. In…

  17. Competition and Cooperation among Similar Representations: Toward a Unified Account of Facilitative and Inhibitory Effects of Lexical Neighbors

    ERIC Educational Resources Information Center

    Chen, Qi; Mirman, Daniel

    2012-01-01

    One of the core principles of how the mind works is the graded, parallel activation of multiple related or similar representations. Parallel activation of multiple representations has been particularly important in the development of theories and models of language processing, where coactivated representations ("neighbors") have been shown to…

  18. Using the UTAUT Model to Examine the Acceptance Behavior of Synchronous Collaboration to Support Peer Translation

    ERIC Educational Resources Information Center

    Liu, Yi Chun; Huang, Yong-Ming

    2015-01-01

    The teaching of translation has received considerable attention in recent years. Research on translation in collaborative learning contexts, however, has been less studied. In this study, we use a tool of synchronous collaboration to assist students in experiencing a peer translation process. Afterward, the unified theory of acceptance and use of…

  19. A cloudy planetary boundary layer oscillation arising from the coupling of turbulence with precipitation in climate simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, X.; Klein, S. A.; Ma, H. -Y.

    The Community Atmosphere Model (CAM) adopts the Cloud Layers Unified By Binormals (CLUBB) scheme and an updated microphysics (MG2) scheme for a more unified treatment of cloud processes. This makes interactions between parameterizations tighter and more explicit. In this study, a cloudy planetary boundary layer (PBL) oscillation related to the interaction between CLUBB and MG2 is identified in CAM. This highlights the need for consistency between coupled subgrid processes in climate model development. The oscillation occurs most often in the marine cumulus cloud regime, and only if the modeled PBL is strongly decoupled and precipitation evaporates below the cloud. Two aspects of the parameterized coupling assumptions between the CLUBB and MG2 schemes cause the oscillation: (1) a parameterized relationship between rain evaporation and CLUBB's subgrid spatial variance of moisture and heat that induces extra cooling in the lower PBL, and (2) rain evaporation that happens at too low an altitude because of the precipitation fraction parameterization in MG2. Either of these two conditions can overly stabilize the PBL and reduce the upward moisture transport to the cloud layer so that the PBL collapses. Global simulations show that turning off the evaporation-variance coupling and improving the precipitation fraction parameterization effectively reduces the cloudy PBL oscillation in marine cumulus clouds. By evaluating the causes of the oscillation in CAM, we have identified the PBL processes that should be examined in models exhibiting similar oscillations. This study may draw the attention of the modeling and observational communities to the issue of coupling between parameterized physical processes.

  20. A cloudy planetary boundary layer oscillation arising from the coupling of turbulence with precipitation in climate simulations

    DOE PAGES

    Zheng, X.; Klein, S. A.; Ma, H. -Y.; ...

    2017-08-24

    The Community Atmosphere Model (CAM) adopts the Cloud Layers Unified By Binormals (CLUBB) scheme and an updated microphysics (MG2) scheme for a more unified treatment of cloud processes. This makes interactions between parameterizations tighter and more explicit. In this study, a cloudy planetary boundary layer (PBL) oscillation related to the interaction between CLUBB and MG2 is identified in CAM. This highlights the need for consistency between coupled subgrid processes in climate model development. The oscillation occurs most often in the marine cumulus cloud regime, and only if the modeled PBL is strongly decoupled and precipitation evaporates below the cloud. Two aspects of the parameterized coupling assumptions between the CLUBB and MG2 schemes cause the oscillation: (1) a parameterized relationship between rain evaporation and CLUBB's subgrid spatial variance of moisture and heat that induces extra cooling in the lower PBL, and (2) rain evaporation that happens at too low an altitude because of the precipitation fraction parameterization in MG2. Either of these two conditions can overly stabilize the PBL and reduce the upward moisture transport to the cloud layer so that the PBL collapses. Global simulations show that turning off the evaporation-variance coupling and improving the precipitation fraction parameterization effectively reduces the cloudy PBL oscillation in marine cumulus clouds. By evaluating the causes of the oscillation in CAM, we have identified the PBL processes that should be examined in models exhibiting similar oscillations. This study may draw the attention of the modeling and observational communities to the issue of coupling between parameterized physical processes.

  1. Initial English Language Teacher Education: Processes and Tensions towards a Unifying Curriculum in an Argentinian Province

    ERIC Educational Resources Information Center

    Banegas, Dario Luis

    2014-01-01

    In this reflective piece I discuss the process of developing a new unifying initial English language teacher education curriculum in the province of Chubut (Argentina). Trainers and trainees from different institutions were called to work on it with the aim of democratising curriculum development and enhancing involvement among agents. In the…

  2. Evaluating the Turkish Higher Education Law and Proposals in the Light of ERASMUS Goals

    ERIC Educational Resources Information Center

    Dolasir, Semiyha; Tuncel, Fehmi

    2006-01-01

    Education unity among European Community countries is very important in the process of unifying Europe. Hence, with the aim of strengthening a well-ordered and democratic society, the education ministries of 29 European countries started the unifying education process by signing the Bologna Declaration on June 19, 1999. SOCRATES and…

  3. Conservation of Life as a Unifying Theme for Process Safety in Chemical Engineering Education

    ERIC Educational Resources Information Center

    Klein, James A.; Davis, Richard A.

    2011-01-01

    This paper explores the use of "conservation of life" as a concept and unifying theme for increasing awareness, application, and integration of process safety in chemical engineering education. Students need to think of conservation of mass, conservation of energy, and conservation of life as equally important in engineering design and analysis.…

  4. Foundations to the unified psycho-cognitive engine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Bier, Asmeret Brooke; Backus, George A.

    This document outlines the key features of the SNL psychological engine. The engine is designed to be a generic representation of cognitive entities interacting among themselves and with the external world. It combines the most accepted theories of behavioral psychology with those of behavioral economics to produce a unified simulation of human response, from stimuli through executed behavior. The engine explicitly recognizes emotive and reasoned contributions to behavior and simulates the dynamics associated with cue processing, learning, and choice selection. Most importantly, the model parameterization can come from available media or survey information, as well as from subject-matter-expert information. The framework design allows the use of uncertainty quantification and sensitivity analysis to manage confidence in using the analysis results for intervention decisions.

  5. Nonequilibrium Energy Transfer at Nanoscale: A Unified Theory from Weak to Strong Coupling

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Ren, Jie; Cao, Jianshu

    2015-07-01

    Unraveling the microscopic mechanism of quantum energy transfer across two-level systems provides crucial insights to the optimal design and potential applications of low-dimensional nanodevices. Here, we study the non-equilibrium spin-boson model as a minimal prototype and develop a fluctuation-decoupled quantum master equation approach that is valid ranging from the weak to the strong system-bath coupling regime. The exact expression of energy flux is analytically established, which dissects the energy transfer as multiple boson processes with even and odd parity. Our analysis provides a unified interpretation of several observations, including coherence-enhanced heat flux and negative differential thermal conductance. The results will have broad implications for the fine control of energy transfer in nano-structural devices.
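For reference, the nonequilibrium spin-boson model named above is conventionally written as a two-level system coupled to two bosonic baths held at different temperatures (a standard textbook form; sign and factor conventions vary between papers, and the paper's own fluctuation-decoupled treatment is not reproduced here):

```latex
\hat{H} = \frac{\varepsilon_0}{2}\hat{\sigma}_z + \frac{\Delta}{2}\hat{\sigma}_x
        + \sum_{v=L,R}\sum_{k} \omega_{k,v}\,\hat{b}^{\dagger}_{k,v}\hat{b}_{k,v}
        + \frac{\hat{\sigma}_z}{2}\sum_{v=L,R}\sum_{k}
          \left(\lambda_{k,v}\,\hat{b}^{\dagger}_{k,v} + \lambda^{*}_{k,v}\,\hat{b}_{k,v}\right)
```

Here $\varepsilon_0$ is the two-level energy bias, $\Delta$ the tunneling amplitude, and the coupling strengths $\lambda_{k,v}$ to the left and right baths ($v = L, R$) span the weak-to-strong coupling regimes over which the paper's master equation approach is claimed to be valid; the temperature difference between the baths drives the steady-state energy flux.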

  6. Unified Model for Academic Competence, Social Adjustment, and Psychopathology.

    ERIC Educational Resources Information Center

    Schaefer, Earl S.; And Others

    A unified conceptual model is needed to integrate the extensive research on (1) social competence and adaptive behavior, (2) converging conceptualizations of social adjustment and psychopathology, and (3) emerging concepts and measures of academic competence. To develop such a model, a study was conducted in which teacher ratings were collected on…

  7. Short-Range Prediction of Monsoon Precipitation by NCMRWF Regional Unified Model with Explicit Convection

    NASA Astrophysics Data System (ADS)

    Mamgain, Ashu; Rajagopal, E. N.; Mitra, A. K.; Webster, S.

    2018-03-01

    There are increasing efforts towards the prediction of high-impact weather systems and the understanding of related dynamical and physical processes. High-resolution numerical model simulations can be used directly to model the impact in fine-scale detail, and improvements in forecast accuracy can help in disaster-management planning and execution. The National Centre for Medium Range Weather Forecasting (NCMRWF) has implemented a high-resolution regional unified modeling system with explicit convection, embedded within a coarser-resolution global model with parameterized convection. The model configurations are based on the UK Met Office unified seamless modeling system. Recent land use/land cover data (2012-2013) obtained from the Indian Space Research Organisation (ISRO) are also used in the model simulations. Results based on a month of short-range forecasts from both the global and regional models over India indicate that the convection-permitting simulations of the high-resolution regional model are able to reduce the dry bias over the southern parts of the West Coast and the monsoon trough zone, with more intense rainfall mainly towards the northern parts of the monsoon trough zone. The regional model with explicit convection has significantly improved the phase of the diurnal cycle of rainfall as compared to the global model. Results from two monsoon depression cases during the study period show substantial improvement in the details of the rainfall pattern. Many rainfall categories defined for operational forecasting purposes by Indian forecasters are also well represented in the convection-permitting high-resolution simulations. For the statistics of the number of days within each rain category between `No-Rain' and `Heavy Rain', the regional model outperforms the global model in all ranges. In the very heavy and extremely heavy categories, the regional simulations overestimate the number of rainfall days. 
The global model with parameterized convection tends to overestimate light rainfall days and underestimate heavy rain days compared to the observational data.

  8. Development of an integrated chemical weather prediction system for environmental applications at meso to global scales: NMMB/BSC-CHEM

    NASA Astrophysics Data System (ADS)

    Jorba, O.; Pérez, C.; Karsten, K.; Janjic, Z.; Dabdub, D.; Baldasano, J. M.

    2009-09-01

    This contribution presents the ongoing development of a new fully online chemical weather prediction system for meso- to global-scale applications. The modeling system consists of a mineral dust module and a gas-phase chemistry module coupled online to a unified global-regional atmospheric driver. This approach allows solving small-scale processes and their interactions at local to global scales, and its unified environment maintains the consistency of all the physico-chemical processes involved. The atmospheric driver is the NCEP/NMMB numerical weather prediction model (Janjic and Black, 2007) developed at the National Centers for Environmental Prediction (NCEP). It represents an evolution of the operational WRF-NMM model, extending from meso to global scales; its unified non-hydrostatic dynamical core supports regional and global simulations. The Barcelona Supercomputing Center is currently designing and implementing a chemistry transport model coupled online with the new global/regional NMMB. The new modeling system is intended to be a powerful tool for research and to provide efficient global and regional chemical weather forecasts at sub-synoptic and mesoscale resolutions. The online coupling of the chemistry follows an approach similar to that of the mineral dust module already coupled to the atmospheric driver, NMMB/BSC-DUST (Pérez et al., 2008). Chemical species are advected and mixed at the corresponding time steps of the meteorological tracers using the same numerical scheme; advection is Eulerian, positive definite and monotone. The chemical mechanism and chemistry solver are based on the Kinetic PreProcessor (KPP) package (Damian et al., 2002), with the main purpose of maintaining wide flexibility when configuring the model. Such an approach allows using a simplified chemical mechanism for global applications or a more complete mechanism for high-resolution local or regional studies. 
Moreover, it permits the implementation of a specific configuration for forecasting applications in regional or global domains. An emission module allows the coupling of different emission inventory sources, such as RETRO, EDGAR and GEIA for the global domain, EMEP for Europe and HERMES for Spain. The photolysis scheme is based on the Fast-J scheme, coupled with the physics of each model layer (e.g., aerosols, clouds, and absorbers such as ozone), and it considers grid-scale clouds from the atmospheric driver. The dry deposition scheme follows the deposition-velocity analogy for gases, enabling the calculation of deposition fluxes from airborne concentrations. No cloud-chemistry processes are included in the system yet (no wet deposition, scavenging or aqueous chemistry). The modeling system developments will be presented and first results of the gas-phase chemistry at global scale will be discussed. REFERENCES Janjic, Z.I., and Black, T.L., 2007. An ESMF unified model for a broad range of spatial and temporal scales. Geophysical Research Abstracts, 9, 05025. Pérez, C., Haustein, K., Janjic, Z.I., Jorba, O., Baldasano, J.M., Black, T.L., and Nickovic, S., 2008. An online dust model within the meso to global NMMB: current progress and plans. AGU Fall Meeting, San Francisco, A41K-03. Damian, V., Sandu, A., Damian, M., Potra, F., and Carmichael, G.R., 2002. The kinetic preprocessor KPP - A software environment for solving chemical kinetics. Comp. Chem. Eng., 26, 1567-1579. Sandu, A., and Sander, R., 2006. Technical note: Simulating chemical systems in Fortran90 and Matlab with the Kinetic PreProcessor KPP-2.1. Atmos. Chem. Phys., 6, 187-195.

  9. Classifying clinical decision making: a unifying approach.

    PubMed

    Buckingham, C D; Adams, A

    2000-10-01

    This is the first of two linked papers exploring decision making in nursing which integrate research evidence from different clinical and academic disciplines. Currently there are many decision-making theories, each with their own distinctive concepts and terminology, and there is a tendency for separate disciplines to view their own decision-making processes as unique. Identifying good nursing decisions and where improvements can be made is therefore problematic, and this can undermine clinical and organizational effectiveness, as well as nurses' professional status. Within the unifying framework of psychological classification, the overall aim of the two papers is to clarify and compare terms, concepts and processes identified in a diversity of decision-making theories, and to demonstrate their underlying similarities. It is argued that the range of explanations used across disciplines can usefully be re-conceptualized as classification behaviour. This paper explores problems arising from multiple theories of decision making being applied to separate clinical disciplines. Attention is given to detrimental effects on nursing practice within the context of multidisciplinary health-care organizations and the changing role of nurses. The different theories are outlined and difficulties in applying them to nursing decisions highlighted. An alternative approach based on a general model of classification is then presented in detail to introduce its terminology and the unifying framework for interpreting all types of decisions. The classification model is used to provide the context for relating alternative philosophical approaches and to define decision-making activities common to all clinical domains. This may benefit nurses by improving multidisciplinary collaboration and weakening clinical elitism.

  10. Allergic rhinitis and inflammatory airway disease: interactions within the unified airspace.

    PubMed

    Marple, Bradley F

    2010-01-01

    Allergic rhinitis (AR), the most common chronic allergic condition in outpatient medicine, is associated with immense health care costs and socioeconomic consequences. AR's impact may derive in part from its interaction with other respiratory conditions via allergic inflammation. This study was designed to review potential interactive mechanisms of AR and associated conditions and to consider the relevance of a bidirectional "unified airway" respiratory inflammation model to the diagnosis and treatment of inflammatory airway disease. MEDLINE was searched for pathophysiological and epidemiologic links between AR and diseases of the sinuses, lungs, middle ear, and nasopharynx. Allergy-related inflammatory responses, or neural and systemic processes fostering inflammatory changes distant from the initial allergen provocation, may link AR and its comorbidities. Treating AR may benefit associated respiratory tract comorbidities. Besides improving AR outcomes, treatment that inhibits eosinophil recruitment and migration, normalizes cytokine profiles, and reduces asthma-associated health care use in atopic subjects would likely ameliorate other upper airway diseases such as acute rhinosinusitis, chronic rhinosinusitis (CRS) with nasal polyposis (NP), adenoidal hypertrophy, and otitis media with effusion. The epidemiological concordance of AR with several airway diseases conforms to a bidirectional "unified airway" respiratory inflammation model based on anatomic and histological upper and lower airway connections. Epidemiology and current understanding of inflammatory, humoral, and neural processes make links between AR and disorders including asthma, otitis media, NP, and CRS plausible. Combining AR with associated conditions increases disease burden; worsened associated illness may accompany worsened AR. 
AR pharmacotherapies include antihistamines, leukotriene antagonists, intranasal corticosteroids, and immunotherapy; treatments attenuating proinflammatory responses may also benefit associated conditions.

  11. Agent based simulations in disease modeling Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Pennisi, Marzio

    2016-07-01

    Fibrosis represents a process in which excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve the understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent based models (ABM) can give a valuable contribution to the understanding and better management of fibrotic diseases.
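What an ABM contribution might look like in this setting can be sketched in miniature (entirely illustrative; the rules, grid, and parameters below are invented and come from neither paper): fibroblast-like agents perform a random walk on a grid and deposit extracellular matrix wherever a static inflammation field is active, so matrix accumulation emerges from local agent rules rather than a global equation.

```python
import random

random.seed(1)
SIZE = 20  # grid dimension

# Static inflammation field: ~10% of cells are inflamed (hypothetical stimulus)
inflammation = [[random.random() < 0.1 for _ in range(SIZE)] for _ in range(SIZE)]
# Deposited matrix ("collagen") per cell, initially zero everywhere
collagen = [[0.0] * SIZE for _ in range(SIZE)]
# 50 fibroblast-like agents at random positions
agents = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(50)]

for step in range(100):
    moved = []
    for (i, j) in agents:
        if inflammation[i][j]:
            collagen[i][j] += 1.0  # activated agent deposits matrix locally
        # Unbiased random walk on a torus (wrap-around boundaries)
        i = (i + random.choice([-1, 0, 1])) % SIZE
        j = (j + random.choice([-1, 0, 1])) % SIZE
        moved.append((i, j))
    agents = moved

total = sum(map(sum, collagen))  # emergent fibrotic burden
```

Even this toy version shows the ABM selling point noted in the comment: interventions (e.g., damping the inflammation field or the deposition rule) can be tested agent by agent, which is awkward to express in purely continuum models.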

  12. Collaborative Research: Reducing tropical precipitation biases in CESM — Tests of unified parameterizations with ARM observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Vincent; Gettelman, Andrew; Morrison, Hugh

    In state-of-the-art climate models, each cloud type is treated using its own separate cloud parameterization and its own separate microphysics parameterization. This use of separate schemes for separate cloud regimes is undesirable because it is theoretically unfounded, it hampers interpretation of results, and it leads to the temptation to overtune parameters. In this grant, we are creating a climate model that contains a unified cloud parameterization and a unified microphysics parameterization. This model will be used to address the problems of excessive frequency of drizzle in climate models and excessively early onset of deep convection in the Tropics over land. The resulting model will be compared with ARM observations.

  13. The brain, self and society: a social-neuroscience model of predictive processing.

    PubMed

    Kelly, Michael P; Kriznik, Natasha M; Kinmonth, Ann Louise; Fletcher, Paul C

    2018-05-10

    This paper presents a hypothesis about how social interactions shape and influence predictive processing in the brain. The paper integrates concepts from neuroscience and sociology, where a gulf presently exists between the ways each describes the same phenomenon: how thinking humans engage with the social world. We combine the concepts of predictive processing models (also called predictive coding models in the neuroscience literature) with ideal types, typifications, and social practice, concepts from the sociological literature. This generates a unified hypothetical framework integrating the social world and hypothesised brain processes. The hypothesis combines aspects of neuroscience and psychology with social theory to show how social behaviors may be "mapped" onto brain processes. It outlines a conceptual framework that connects the two disciplines and that may enable creative dialogue and potential future research.

  14. Sandia/Stanford Unified Creep Plasticity Damage Model for ANSYS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierce, David M.; Vianco, Paul T.; Fossum, Arlo F.

    2006-09-03

    A unified creep plasticity (UCP) model was developed based upon the time-dependent and time-independent deformation properties of the 95.5Sn-3.9Ag-0.6Cu (wt.%) solder measured at Sandia. A damage parameter, D, was then added to the equation to develop the unified creep plasticity damage (UCPD) model. The parameter D was parameterized using data obtained at Sandia from isothermal fatigue experiments on a double-lap shear test. The software was validated against a BGA solder joint exposed to thermal cycling. The UCPD model was implemented as a subroutine for the ANSYS 8.1 finite element code.

  15. Forecasting of monsoon heavy rains: challenges in NWP

    NASA Astrophysics Data System (ADS)

    Sharma, Kuldeep; Ashrit, Raghavendra; Iyengar, Gopal; Bhatla, R.; Rajagopal, E. N.

    2016-05-01

    The last decade has seen a tremendous improvement in the forecasting skill of numerical weather prediction (NWP) models. This is attributed to increased sophistication of NWP models, which resolve complex physical processes, advanced data assimilation, increased grid resolution, and satellite observations. However, prediction of heavy rains is still a challenge, since the models exhibit large errors in rainfall amounts as well as in spatial and temporal distribution. Two state-of-the-art NWP models have been investigated over the Indian monsoon region to assess their ability to predict heavy rainfall events: the unified model operational at the National Centre for Medium Range Weather Forecasting (NCUM) and the unified model operational at the Australian Bureau of Meteorology (Australian Community Climate and Earth-System Simulator -- Global (ACCESS-G)). The recent (JJAS 2015) Indian monsoon season witnessed 6 depressions and 2 cyclonic storms, which resulted in heavy rains and flooding. The CRA method of verification allows the decomposition of forecast errors into errors in rainfall volume, pattern, and location. The case-by-case study using the CRA technique shows that the contributions to the rainfall errors from pattern and displacement are large, while the contribution from error in predicted rainfall volume is smallest.
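    The CRA-style error decomposition mentioned in this record can be sketched as follows. This is a hedged illustration of the general idea (total MSE split into displacement, volume, and pattern components once the forecast is optimally shifted onto the observations); the 1-D fields, coefficient values, and brute-force shift search are assumptions for illustration, not the operational NCUM/ACCESS-G verification code.

    ```python
    # Hedged sketch of a CRA-style decomposition: the displacement term is the
    # MSE reduction gained by optimally shifting the forecast onto the
    # observations, the volume term is the squared mean bias, and the pattern
    # term is the remainder.

    def mse(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

    def roll(a, k):
        """Cyclically shift list a to the right by k positions."""
        k %= len(a)
        return a[-k:] + a[:-k] if k else list(a)

    def cra_decompose(fcst, obs, max_shift=3):
        mse_total = mse(fcst, obs)
        mse_shifted = min(mse(roll(fcst, dx), obs)
                          for dx in range(-max_shift, max_shift + 1))
        mse_displacement = mse_total - mse_shifted
        mse_volume = (sum(fcst) / len(fcst) - sum(obs) / len(obs)) ** 2
        mse_pattern = mse_shifted - mse_volume
        return mse_displacement, mse_volume, mse_pattern

    # Illustrative 1-D "rain band": the forecast is displaced and too wet.
    obs = [0.0, 0.0, 5.0, 10.0, 5.0, 0.0, 0.0, 0.0]
    fcst = [1.2 * x for x in roll(obs, 2)]
    d, v, p = cra_decompose(fcst, obs)
    ```

    For a displaced band like this, the displacement component dominates the total error, mirroring the record's finding that pattern and location errors outweigh volume errors.
    
    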

  16. The Physics of Earthquakes: In the Quest for a Unified Theory (or Model) That Quantitatively Describes the Entire Process of an Earthquake Rupture, From its Nucleation to the Dynamic Regime and to its Arrest

    NASA Astrophysics Data System (ADS)

    Ohnaka, M.

    2004-12-01

    For the past four decades, great progress has been made in understanding earthquake source processes. In particular, recent progress in the field of the physics of earthquakes has contributed substantially to unraveling the earthquake generation process in quantitative terms. Yet, a fundamental problem remains unresolved in this field. The constitutive law that governs the behavior of earthquake ruptures is the basis of earthquake physics, and the governing law plays a fundamental role in accounting for the entire process of an earthquake rupture, from its nucleation to the dynamic propagation to its arrest, quantitatively in a unified and consistent manner. Therefore, without establishing the rational constitutive law, the physics of earthquakes cannot be a quantitative science in a true sense, and hence it is urgent to establish the rational constitutive law. However, it has been controversial over the past two decades, and it is still controversial, what the constitutive law for earthquake ruptures ought to be, and how it should be formulated. To resolve the controversy is a necessary step towards a more complete, unified theory of earthquake physics, and now the time is ripe to do so. Because of its fundamental importance, we have to discuss thoroughly and rigorously what the constitutive law ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid evidence. There are prerequisites for the constitutive formulation. The brittle, seismogenic layer and individual faults therein are characterized by inhomogeneity, and fault inhomogeneity has profound implications for earthquake ruptures. In addition, rupture phenomena including earthquakes are inherently scale dependent; indeed, some of the physical quantities inherent in rupture exhibit scale dependence. 
To treat scale-dependent physical quantities inherent in the rupture over a broad scale range quantitatively in a unified and consistent manner, it is critical to formulate the governing law properly so as to incorporate the scaling property. Thus, the properties of fault inhomogeneity and physical scaling are indispensable prerequisites to be incorporated into the constitutive formulation. Thorough discussion in this context necessarily leads to the consistent conclusion that the constitutive law must be formulated in such a manner that the shear traction is a primary function of the slip displacement, with the secondary effect of slip rate or stationary contact time. This constitutive formulation makes it possible to account for the entire process of an earthquake rupture over a broad scale range quantitatively in a unified and consistent manner.

  17. A Framework for Distributed Problem Solving

    NASA Astrophysics Data System (ADS)

    Leone, Joseph; Shin, Don G.

    1989-03-01

    This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on the Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability of making associations between vast amounts of related concepts, sorting out the combined results, and promoting the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.

  18. The Perfect Storm: Preterm Birth, Neurodevelopmental Mechanisms, and Autism Causation.

    PubMed

    Erdei, Carmina; Dammann, Olaf

    2014-01-01

    A unifying model of autism causation remains elusive, and thus well-designed explanatory models are needed to develop appropriate therapeutic and preventive interventions. This essay argues that autism is not a static disorder, but rather an ongoing process. We discuss the link between preterm birth and autism and briefly review the evidence supporting the link between immune system characteristics and both prematurity and autism. We then propose a causation process model of autism etiology and pathogenesis, in which both neurodevelopment and ongoing/prolonged neuroinflammation are necessary pathogenetic component mechanisms. We suggest that an existing model of sufficient cause and component causes can be interpreted as a mechanistic view of etiology and pathogenesis and can serve as an explanatory model for autism causal pathways.

  19. Creation of a Unified Educational Space within a SLA University Classroom Using Cloud Storage and On-Line Applications

    ERIC Educational Resources Information Center

    Karabayeva, Kamilya Zhumartovna

    2016-01-01

    In the present article the author gives evidence of effective application of cloud storage and on-line applications in the educational process of the higher education institution, as well as considers the problems and prospects of using cloud technologies in the educational process, when creating a unified educational space in the foreign language…

  20. A general modeling framework for describing spatially structured population dynamics

    USGS Publications Warehouse

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. 
By working within a common framework, there is less chance that comparative analyses are colored by model details rather than general principles.
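    As a hedged sketch of the network-based framework this record describes (nodes with time-varying attributes, weighted directed edges, discrete time steps), the toy model below applies node-local growth followed by movement of fixed fractions along edges. The node names, rates, and update rule are illustrative assumptions, not the authors' published code.

    ```python
    # Minimal network-based spatially structured population model:
    # nodes hold abundances, directed weighted edges move fractions of
    # individuals each discrete time step, and growth is applied per node.

    def step(pop, edges, growth):
        """One discrete time step: growth first, then movement along edges.

        pop    -- dict node -> abundance
        edges  -- dict (src, dst) -> fraction of src's abundance that moves
        growth -- dict node -> per-step growth rate
        """
        grown = {n: pop[n] * (1.0 + growth[n]) for n in pop}
        nxt = dict(grown)
        for (src, dst), frac in edges.items():
            moved = grown[src] * frac
            nxt[src] -= moved
            nxt[dst] += moved
        return nxt

    # Two-node "metapopulation": breeding site A, nonbreeding site B.
    pop = {"A": 100.0, "B": 50.0}
    edges = {("A", "B"): 0.2, ("B", "A"): 0.1}
    growth = {"A": 0.05, "B": 0.0}

    for _ in range(10):
        pop = step(pop, edges, growth)
    total = pop["A"] + pop["B"]
    ```

    Metapopulations, migratory loops, and nomadism differ only in the edge set and how edge weights vary through time, which is the unifying point of the framework.
    
    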

  1. Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum

    ERIC Educational Resources Information Center

    Rubenstein, Lisa DaVia; Ridgley, Lisa M.

    2017-01-01

    A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…

  2. Unified Deep Learning Architecture for Modeling Biology Sequence.

    PubMed

    Wu, Hongjie; Cao, Chengyuan; Xia, Xiaoyan; Lu, Qiang

    2017-10-09

    Prediction of the spatial structure or function of biological macromolecules based on their sequence remains an important challenge in bioinformatics. When modeling biological sequences using traditional sequence models, characteristics such as long-range interactions between basic units, the complicated and variable output of labeled structures, and the variable length of biological sequences usually lead to different solutions on a case-by-case basis. This study proposed the use of bidirectional recurrent neural networks based on long short-term memory or a gated recurrent unit to capture long-range interactions, designing an optional reshape operator to adapt to the diversity of the output labels and implementing a training algorithm that supports sequence models capable of processing variable-length sequences. Additionally, the merge and pooling operators enhanced the ability to capture short-range interactions between basic units of biological sequences. The proposed deep-learning model and its training algorithm might be capable of solving currently known biological sequence-modeling problems through the use of a unified framework. We validated our model on one of the most difficult biological sequence-modeling problems currently known, with our results indicating the ability of the model to obtain predictions of protein residue interactions that exceeded the accuracy of current popular approaches by 10% based on multiple benchmarks.

  3. Fusion of Imperfect Information in the Unified Framework of Random Sets Theory: Application to Target Identification

    DTIC Science & Technology

    2007-11-01

    Florea, Anne-Laure Jousselme, Éloi Bossé; DRDC Valcartier TR 2003-319; Defence R&D Canada – Valcartier; November 2007. Excerpts from the table of contents: imprecise information; uncertain and imprecise information; the model of information proposed by Philippe Smets; the process of information modelling.

  4. An Intelligent Information System for forest management: NED/FVS integration

    Treesearch

    J. Wang; W.D. Potter; D. Nute; F. Maier; H. Michael Rauscher; M.J. Twery; S. Thomasma; P. Knopp

    2002-01-01

    An Intelligent Information System (IIS) is viewed as composed of a unified knowledge base, database, and model base. This allows an IIS to provide responses to user queries regardless of whether the query process involves a data retrieval, an inference, a computational method, a problem solving module, or some combination of these. NED-2 is a full-featured intelligent...

  5. School Resegregation: Residential and School Process Study. A Collaborative Leadership Planning/Training Project, Third Year: 1978-79. Final Report.

    ERIC Educational Resources Information Center

    Williams, Georgia

    This report summarizes the work undertaken by the Berkeley Unified School District's (BUSD) project to define a collaborative leadership planning/training model to combat school resegregation. In 1972, four years after full desegregation, the BUSD experienced a marked shift in the school population and its distribution. In 1976, the BUSD committed…

  6. Students Perception towards the Implementation of Computer Graphics Technology in Class via Unified Theory of Acceptance and Use of Technology (UTAUT) Model

    NASA Astrophysics Data System (ADS)

    Binti Shamsuddin, Norsila

    Technology advancement and development in a higher learning institution is a chance for students to be motivated to learn the information technology areas in depth. Students should seize the opportunity to blend their skills with these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to see how deep the students' involvement is, as well as their acceptance of the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four (4) of the eight (8) independent factors in UTAUT will be studied against the dependent factor.

  7. Unified anomaly suppression and boundary extraction in laser radar range imagery based on a joint curve-evolution and expectation-maximization algorithm.

    PubMed

    Feng, Haihua; Karl, William Clem; Castañon, David A

    2008-05-01

    In this paper, we develop a new unified approach for laser radar range anomaly suppression, range profiling, and segmentation. This approach combines an object-based hybrid scene model for representing the range distribution of the field and a statistical mixture model for the range data measurement noise. The image segmentation problem is formulated as a minimization problem which jointly estimates the target boundary together with the target region range variation and background range variation directly from the noisy and anomaly-filled range data. This formulation allows direct incorporation of prior information concerning the target boundary, target ranges, and background ranges into an optimal reconstruction process. Curve evolution techniques and a generalized expectation-maximization algorithm are jointly employed as an efficient solver for minimizing the objective energy, resulting in a coupled pair of object and intensity optimization tasks. The method directly and optimally extracts the target boundary, avoiding a suboptimal two-step process involving image smoothing followed by boundary extraction. Experiments are presented demonstrating that the proposed approach is robust to anomalous pixels (missing data) and capable of producing accurate estimation of the target boundary and range values from noisy data.

  8. Neurobiological roots of language in primate audition: common computational properties.

    PubMed

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L; Rauschecker, Josef P

    2015-03-01

    Here, we present a new perspective on an old question: how does the neurobiology of human language relate to brain systems in nonhuman primates? We argue that higher-order language combinatorics, including sentence and discourse processing, can be situated in a unified, cross-species dorsal-ventral streams architecture for higher auditory processing, and that the functions of the dorsal and ventral streams in higher-order language processing can be grounded in their respective computational properties in primate audition. This view challenges an assumption, common in the cognitive sciences, that a nonhuman primate model forms an inherently inadequate basis for modeling higher-level language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Unified Approach to Modeling and Simulation of Space Communication Networks and Systems

    NASA Technical Reports Server (NTRS)

    Barritt, Brian; Bhasin, Kul; Eddy, Wesley; Matthews, Seth

    2010-01-01

    Network simulator software tools are often used to model the behaviors and interactions of applications, protocols, packets, and data links in terrestrial communication networks. Other software tools that model the physics, orbital dynamics, and RF characteristics of space systems have matured to allow for rapid, detailed analysis of space communication links. However, the absence of a unified toolset that integrates the two modeling approaches has encumbered the systems engineers tasked with the design, architecture, and analysis of complex space communication networks and systems. This paper presents the unified approach and describes the motivation, challenges, and our solution: the customization of the network simulator to integrate with astronautical analysis software tools for high-fidelity end-to-end simulation. Keywords: space; communication; systems; networking; simulation; modeling; QualNet; STK; integration; space networks.

  10. Systemic risk in a unifying framework for cascading processes on networks

    NASA Astrophysics Data System (ADS)

    Lorenz, J.; Battiston, S.; Schweitzer, F.

    2009-10-01

    We introduce a general framework for models of cascade and contagion processes on networks, to identify their commonalities and differences. In particular, models of social and financial cascades, as well as the fiber bundle model, the voter model, and models of epidemic spreading are recovered as special cases. To unify their description, we define the net fragility of a node, which is the difference between its fragility and the threshold that determines its failure. Nodes fail if their net fragility grows above zero, and their failure increases the fragility of neighbouring nodes, thus possibly triggering a cascade. In this framework, we identify three classes depending on the way the fragility of a node is increased by the failure of a neighbour. At the microscopic level, we illustrate with specific examples how the failure spreading pattern varies with the node triggering the cascade, depending on its position in the network and its degree. At the macroscopic level, systemic risk is measured as the final fraction of failed nodes, X*, and for each of the three classes we derive a recursive equation to compute its value. The phase diagram of X* as a function of the initial conditions thus allows for a prediction of the systemic risk as well as a comparison of the three different model classes. We could identify which model class leads to a first-order phase transition in systemic risk, i.e. situations where small changes in the initial conditions determine a global failure. Eventually, we generalize our framework to encompass stochastic contagion models. This indicates the potential for further generalizations.
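    The net-fragility bookkeeping described in this record can be made concrete with a toy threshold cascade. This is a hedged sketch: the graph, thresholds, and constant load-per-failure rule are illustrative assumptions and do not reproduce the paper's three model classes or its recursive equations for X*.

    ```python
    # Toy cascade: a node fails once its net fragility (fragility minus
    # threshold) exceeds zero; each failure adds a fixed load to the failed
    # node's neighbours, possibly triggering further failures. The final
    # fraction of failed nodes plays the role of the systemic risk X*.

    def cascade(neighbors, threshold, initial_fragility, load=1.0):
        fragility = dict(initial_fragility)
        failed = set()
        frontier = [n for n in fragility if fragility[n] - threshold[n] > 0]
        while frontier:
            node = frontier.pop()
            if node in failed:
                continue
            failed.add(node)
            for nb in neighbors[node]:
                if nb in failed:
                    continue
                fragility[nb] += load  # failure raises neighbours' fragility
                if fragility[nb] - threshold[nb] > 0:
                    frontier.append(nb)
        return len(failed) / len(neighbors)  # X*: final fraction failed

    # A ring of 5 nodes; node 0 starts above its threshold, nodes 2 and 3
    # are robust enough to stop the cascade.
    neighbors = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
    threshold = {0: 0.5, 1: 0.8, 2: 2.0, 3: 2.0, 4: 0.8}
    fragility = {i: 0.0 for i in range(5)}
    fragility[0] = 1.0

    x_star = cascade(neighbors, threshold, fragility)
    ```

    Lowering the thresholds of nodes 2 and 3 below 1.0 makes the whole ring fail, a small-scale analogue of the sharp global-failure transitions the record discusses.
    
    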

  11. A Global 3D P-Velocity Model of the Earth’s Crust and Mantle for Improved Event Location

    DTIC Science & Technology

    2011-09-01

    For our starting model, we use a simplified layer crustal model derived from the NNSA Unified model in Eurasia and the Crust 2.0 model everywhere else, over a ... geographic and radial dimensions ... a tessellation with 4° triangles to the transition zone and upper mantle, and a third tessellation with variable resolution to all crustal layers.

  12. A unifying framework for quantifying the nature of animal interactions.

    PubMed

    Potts, Jonathan R; Mokross, Karl; Lewis, Mark A

    2014-07-06

    Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  13. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
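    The LSA side of the integration this record describes rests, at its core, on cosine similarity between vectors in a reduced semantic space. The sketch below shows only that basic operation; the 3-dimensional vectors and word choices are made up for illustration, since real LSA vectors come from a singular value decomposition of a large term-document matrix.

    ```python
    import math

    # Cosine similarity between semantic vectors: the basic LSA measure of
    # semantic relatedness. Vectors here are illustrative, not trained.

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm_u = math.sqrt(sum(a * a for a in u))
        norm_v = math.sqrt(sum(b * b for b in v))
        return dot / (norm_u * norm_v)

    vec = {
        "doctor": [0.9, 0.1, 0.2],
        "nurse":  [0.8, 0.2, 0.3],
        "guitar": [0.1, 0.9, 0.1],
    }
    sim_related = cosine(vec["doctor"], vec["nurse"])
    sim_unrelated = cosine(vec["doctor"], vec["guitar"])
    ```

    In the integrated model, similarities of this kind would weight knowledge-based activations alongside the landscape model's text-driven activation dynamics.
    
    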

  14. Challenges and insights for situated language processing: Comment on "Towards a computational comparative neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    NASA Astrophysics Data System (ADS)

    Knoeferle, Pia

    2016-03-01

    In his review article [19], Arbib outlines an ambitious research agenda: to accommodate within a unified framework the evolution, the development, and the processing of language in natural settings (implicating other systems such as vision). He does so with neuro-computationally explicit modeling in mind [1,2] and inspired by research on the mirror neuron system in primates. Similar research questions have received substantial attention also among other scientists [3,4,12].

  15. Psychophysical Laws and the Superorganism.

    PubMed

    Reina, Andreagiovanni; Bose, Thomas; Trianni, Vito; Marshall, James A R

    2018-03-12

    Through theoretical analysis, we show how a superorganism may react to stimulus variations according to psychophysical laws observed in humans and other animals. We investigate an empirically-motivated honeybee house-hunting model, which describes a value-sensitive decision process over potential nest-sites, at the level of the colony. In this study, we show how colony decision time increases with the number of available nests, in agreement with the Hick-Hyman law of psychophysics, and decreases with mean nest quality, in agreement with Piéron's law. We also show that colony error rate depends on mean nest quality, and difference in quality, in agreement with Weber's law. Psychophysical laws, particularly Weber's law, have been found in diverse species, including unicellular organisms. Our theoretical results predict that superorganisms may also exhibit such behaviour, suggesting that these laws arise from fundamental mechanisms of information processing and decision-making. Finally, we propose a combined psychophysical law which unifies Hick-Hyman's law and Piéron's law, traditionally studied independently; this unified law makes predictions that can be empirically tested.
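    The three laws this record invokes have standard textbook functional forms: Hick-Hyman (decision time grows logarithmically with the number of alternatives), Piéron (response time falls as a power law of stimulus intensity), and Weber (the just-noticeable difference is a constant fraction of intensity). The sketch below encodes those forms; all coefficient values are arbitrary illustrative assumptions, not fitted to the honeybee model.

    ```python
    import math

    # Textbook forms of three psychophysical laws. Coefficients (a, b, r0,
    # k, beta, weber_fraction) are arbitrary illustrative values.

    def hick_hyman(n, a=0.2, b=0.15):
        """Decision time grows with log2 of the number of alternatives n."""
        return a + b * math.log2(n)

    def pieron(intensity, r0=0.1, k=0.5, beta=0.4):
        """Response time decreases as a power law of stimulus intensity."""
        return r0 + k * intensity ** (-beta)

    def weber_jnd(intensity, weber_fraction=0.1):
        """Just-noticeable difference is a constant fraction of intensity."""
        return weber_fraction * intensity
    ```

    The record's colony-level predictions follow the same qualitative shapes: decision time rising with the number of nest-sites, falling with mean nest quality, and error rates governed by relative rather than absolute quality differences.
    
    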

  16. The P-chain: relating sentence production and its disorders to comprehension and acquisition

    PubMed Central

    Dell, Gary S.; Chang, Franklin

    2014-01-01

    This article introduces the P-chain, an emerging framework for theory in psycholinguistics that unifies research on comprehension, production and acquisition. The framework proposes that language processing involves incremental prediction, which is carried out by the production system. Prediction necessarily leads to prediction error, which drives learning, including both adaptive adjustment to the mature language processing system as well as language acquisition. To illustrate the P-chain, we review the Dual-path model of sentence production, a connectionist model that explains structural priming in production and a number of facts about language acquisition. The potential of this and related models for explaining acquired and developmental disorders of sentence production is discussed. PMID:24324238

  17. The P-chain: relating sentence production and its disorders to comprehension and acquisition.

    PubMed

    Dell, Gary S; Chang, Franklin

    2014-01-01

    This article introduces the P-chain, an emerging framework for theory in psycholinguistics that unifies research on comprehension, production and acquisition. The framework proposes that language processing involves incremental prediction, which is carried out by the production system. Prediction necessarily leads to prediction error, which drives learning, including both adaptive adjustment to the mature language processing system as well as language acquisition. To illustrate the P-chain, we review the Dual-path model of sentence production, a connectionist model that explains structural priming in production and a number of facts about language acquisition. The potential of this and related models for explaining acquired and developmental disorders of sentence production is discussed.

  18. An object-oriented approach for harmonization of multimedia markup languages

    NASA Astrophysics Data System (ADS)

    Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay

    2003-12-01

    An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and process of harmonization between eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.

  19. Nonequilibrium Energy Transfer at Nanoscale: A Unified Theory from Weak to Strong Coupling

    PubMed Central

    Wang, Chen; Ren, Jie; Cao, Jianshu

    2015-01-01

    Unraveling the microscopic mechanism of quantum energy transfer across two-level systems provides crucial insights to the optimal design and potential applications of low-dimensional nanodevices. Here, we study the non-equilibrium spin-boson model as a minimal prototype and develop a fluctuation-decoupled quantum master equation approach that is valid ranging from the weak to the strong system-bath coupling regime. The exact expression of energy flux is analytically established, which dissects the energy transfer as multiple boson processes with even and odd parity. Our analysis provides a unified interpretation of several observations, including coherence-enhanced heat flux and negative differential thermal conductance. The results will have broad implications for the fine control of energy transfer in nano-structural devices. PMID:26152705

  20. Phase noise suppression for coherent optical block transmission systems: a unified framework.

    PubMed

    Yang, Chuanchuan; Yang, Feng; Wang, Ziyu

    2011-08-29

    A unified framework for phase noise suppression is proposed in this paper, which could be applied in any coherent optical block transmission systems, including coherent optical orthogonal frequency-division multiplexing (CO-OFDM), coherent optical single-carrier frequency-domain equalization block transmission (CO-SCFDE), etc. Based on adaptive modeling of phase noise, unified observation equations for different coherent optical block transmission systems are constructed, which lead to unified phase noise estimation and suppression. Numerical results demonstrate that the proposal is powerful in mitigating laser phase noise.
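    The underlying problem can be sketched in a few lines (a toy illustration; the linewidth-symbol-period product, block size, and pilot layout are assumed values, not the paper's adaptive observation model): laser phase noise is commonly modeled as a Wiener process, and over one short block the bulk of it appears as a common phase error that pilot symbols can estimate and remove.

```python
import numpy as np

# Toy sketch of block-wise common phase error (CPE) removal. The linewidth-
# symbol-period product, block size, and pilot layout are assumed values for
# illustration, not parameters from the paper.
rng = np.random.default_rng(1)
data = rng.integers(0, 4, 64)
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * data))   # unit-energy QPSK block

# Wiener (random-walk) laser phase noise: increment variance 2*pi*dv*T.
dv_T = 1e-4
phase = np.cumsum(rng.normal(0.0, np.sqrt(2 * np.pi * dv_T), 64))
received = qpsk * np.exp(1j * phase)

pilots = slice(0, 8)                                  # first 8 symbols known
cpe = np.angle(np.mean(received[pilots] / qpsk[pilots]))
corrected = received * np.exp(-1j * cpe)

residual = np.max(np.abs(np.angle(corrected / qpsk)))
print(residual)  # residual phase error after CPE removal
```

    The residual drift within and between blocks is what the more elaborate per-system estimation schemes in the paper target.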

  1. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  2. Effective equilibrium states in the colored-noise model for active matter I. Pairwise forces in the Fox and unified colored noise approximations

    NASA Astrophysics Data System (ADS)

    Wittmann, René; Maggi, C.; Sharma, A.; Scacchi, A.; Brader, J. M.; Marini Bettolo Marconi, U.

    2017-11-01

    The equations of motion of active systems can be modeled in terms of Ornstein-Uhlenbeck processes (OUPs) with appropriate correlators. For further theoretical studies, these should be approximated to yield a Markovian picture for the dynamics and a simplified steady-state condition. We perform a comparative study of the unified colored noise approximation (UCNA) and the approximation scheme by Fox recently employed within this context. We review the approximations necessary to define effective interaction potentials in the low-density limit and study the conditions for which these represent the behavior observed in two-body simulations for the OUPs model and active Brownian particles. The demonstrated limitations of the theory for potentials with a negative slope or curvature can be qualitatively corrected by a new empirical modification. In general, we find that in the presence of translational white noise the Fox approach is more accurate. Finally, we examine an alternative way to define a force-balance condition in the limit of small activity.
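    As a minimal sketch of the starting point of such studies (our illustration, not the UCNA or Fox approximation itself), the colored noise can be generated by Euler-Maruyama integration of an OUP; with the parameterization assumed here, the stationary variance is D/tau and the correlation time is tau.

```python
import numpy as np

# Euler-Maruyama integration of an Ornstein-Uhlenbeck process eta(t), the
# colored noise driving the active particles. Parameterization assumed here:
#   d(eta) = -(eta/tau) dt + (sqrt(2*D)/tau) dW
# which gives <eta(t) eta(s)> = (D/tau) * exp(-|t - s|/tau) in steady state.
def simulate_oup(tau=1.0, D=1.0, dt=1e-2, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    eta = np.empty(n_steps)
    eta[0] = 0.0
    dW = rng.normal(0.0, np.sqrt(dt), n_steps - 1)
    for i in range(n_steps - 1):
        eta[i + 1] = eta[i] - (eta[i] / tau) * dt + (np.sqrt(2 * D) / tau) * dW[i]
    return eta

eta = simulate_oup()
print(np.var(eta))  # close to the stationary variance D/tau = 1
```

    Replacing this non-Markovian noise by an effective Markovian description is precisely what the UCNA and Fox schemes compared in the paper do.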

  3. Bloom syndrome helicase in meiosis: Pro-crossover functions of an anti-crossover protein.

    PubMed

    Hatkevich, Talia; Sekelsky, Jeff

    2017-09-01

    The functions of the Bloom syndrome helicase (BLM) and its orthologs are well characterized in mitotic DNA damage repair, but their roles within the context of meiotic recombination are less clear. In meiotic recombination, multiple repair pathways are used to repair meiotic DSBs, and current studies suggest that BLM may regulate the use of these pathways. Based on literature from Saccharomyces cerevisiae, Arabidopsis thaliana, Mus musculus, Drosophila melanogaster, and Caenorhabditis elegans, we present a unified model for a critical meiotic role of BLM and its orthologs. In this model, BLM and its orthologs utilize helicase activity to regulate the use of various pathways in meiotic recombination by continuously disassembling recombination intermediates. This unwinding activity provides the meiotic program with a steady pool of early recombination substrates, increasing the probability for a DSB to be processed by the appropriate pathway. As a result of BLM activity, crossovers are properly placed throughout the genome, promoting proper chromosomal disjunction at the end of meiosis. This unified model can be used to further refine the complex role of BLM and its orthologs in meiotic recombination. © 2017 WILEY Periodicals, Inc.

  4. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  5. A Unified Model Exploring Parenting Practices as Mediators of Marital Conflict and Children's Adjustment

    ERIC Educational Resources Information Center

    Coln, Kristen L.; Jordan, Sara S.; Mercer, Sterett H.

    2013-01-01

    We examined positive and negative parenting practices and psychological control as mediators of the relations between constructive and destructive marital conflict and children's internalizing and externalizing problems in a unified model. Married mothers of 121 children between the ages of 6 and 12 completed questionnaires measuring marital…

  6. Can (should) theories of crowding be unified?

    PubMed Central

    Agaoglu, Mehmet N.; Chung, Susana T. L.

    2016-01-01

    Objects in clutter are difficult to recognize, a phenomenon known as crowding. There is little consensus on the underlying mechanisms of crowding, and a large number of models have been proposed. There have also been attempts at unifying the explanations of crowding under a single model, such as the weighted feature model of Harrison and Bex (2015) and the texture synthesis model of Rosenholtz and colleagues (Balas, Nakano, & Rosenholtz, 2009; Keshvari & Rosenholtz, 2016). The goal of this work was to test various models of crowding and to assess whether a unifying account can be developed. Adopting Harrison and Bex's (2015) experimental paradigm, we asked observers to report the orientation of two concentric C-stimuli. Contrary to the predictions of their model, observers' recognition accuracy was worse for the inner C-stimulus. In addition, we demonstrated that the stimulus paradigm used by Harrison and Bex has a crucial confounding factor, eccentricity, which limits its usage to a very narrow range of stimulus parameters. Nevertheless, reporting the orientations of both C-stimuli in this paradigm proved very useful in pitting different crowding models against each other. Specifically, we tested deterministic and probabilistic versions of averaging, substitution, and attentional resolution models as well as the texture synthesis model. None of the models alone was able to explain the entire set of data. Based on these findings, we discuss whether the explanations of crowding can (should) be unified. PMID:27936273

  7. An OpenACC-Based Unified Programming Model for Multi-accelerator Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.

  8. On the reachable cycles via the unified perspective of cryocoolers. Part B: Cryocoolers with isentropic expanders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maytal, Ben-Zion; Pfotenhauer, John M.

    2014-01-29

    Solvay, Stirling and Gifford-McMahon types of cryocoolers employ an isentropic expander which is their elementary mechanism for temperature reduction (following the unified model of cryocoolers as described in a previous paper, Part A). Solvay and Stirling cryocoolers are driven by a larger temperature reduction than that of the Gifford-McMahon cycle, for a similar compression ratio. These cryocoolers are compared from the view of the unified model, in terms of the lowest attainable temperature, compression ratio, the size of the interchanger and the applied heat load.

  9. Unified viscoelasticity: Applying discrete element models to soft tissues with two characteristic times.

    PubMed

    Anssari-Benam, Afshin; Bucchi, Andrea; Bader, Dan L

    2015-09-18

    Discrete element models have often been the primary tool in investigating and characterising the viscoelastic behaviour of soft tissues. However, studies have employed varied configurations of these models, based on the choice of the number of elements and the utilised formation, for different subject tissues. This approach has yielded a diverse array of viscoelastic models in the literature, each seemingly resulting in different descriptions of viscoelastic constitutive behaviour and/or stress-relaxation and creep functions. Moreover, most studies do not apply a single discrete element model to characterise both stress-relaxation and creep behaviours of tissues. The underlying assumption for this disparity is the implicit perception that the viscoelasticity of soft tissues cannot be described by a universal behaviour or law, resulting in the lack of a unified approach in the literature based on discrete element representations. This paper derives the constitutive equation for different viscoelastic models applicable to soft tissues with two characteristic times. It demonstrates that all possible configurations exhibit a unified and universal behaviour, captured by a single constitutive relationship between stress, strain and time as: σ + Aσ̇ + Bσ̈ = Pε̇ + Qε̈. The ensuing stress-relaxation G(t) and creep J(t) functions are also unified and universal, derived as [Formula: see text] and J(t) = c₂ + (ε₀ − c₂)e^(−(P/Q)t) + (σ₀/P)t, respectively. Application of these relationships to experimental data is illustrated for various tissues including the aortic valve, ligament and cerebral artery. The unified model presented in this paper may be applied to all tissues with two characteristic times, obviating the need for employing varied configurations of discrete element models in preliminary investigation of the viscoelastic behaviour of soft tissues. Copyright © 2015 Elsevier Ltd. All rights reserved.
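    The creep function quoted above can be checked symbolically (the exponent and the 1/P factor in the form used below are our assumed reading of the expression). Under a constant stress σ₀, the stress derivatives vanish and the constitutive law reduces to σ₀ = Pε̇ + Qε̈, which the creep form satisfies identically:

```python
import sympy as sp

# Symbolic check of the creep response. Assumed reading of the creep function:
#   J(t) = c2 + (eps0 - c2)*exp(-(P/Q)*t) + (sigma0/P)*t
# Under a constant stress sigma0 the constitutive law
#   sigma + A*sigma' + B*sigma'' = P*eps' + Q*eps''
# has sigma' = sigma'' = 0, so the strain must satisfy P*eps' + Q*eps'' = sigma0.
t, P, Q, sigma0, eps0, c2 = sp.symbols("t P Q sigma0 eps0 c2", positive=True)

eps = c2 + (eps0 - c2) * sp.exp(-(P / Q) * t) + (sigma0 / P) * t
rhs = P * sp.diff(eps, t) + Q * sp.diff(eps, t, 2)
print(sp.simplify(rhs - sigma0))  # 0: the creep form satisfies the law exactly
```

    The exponential and linear terms contribute cancelling and constant parts respectively, so the identity holds for any values of the constants.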

  10. [Recombinant granulocyte-colony stimulating factor (filgrastim): optimization of conditions of isolation and purification from inclusion body].

    PubMed

    Kononova, N V; Iakovlev, A V; Zhuravko, A M; Pankeev, N N; Minaev, S V; Bobruskin, A I; Mart'ianov, V A

    2014-01-01

    We developed a unified process platform for two recombinant human G-CSF medicines: one with non-prolonged and the other with prolonged action. This unified technology made production simpler and cheaper, while the introduction of an additional pegylation stage into the production line made it easier to obtain medicines with different durations of action and allowed the documentation of the technological process to be standardized according to GMP requirements.

  11. Control of Distributed Parameter Systems

    DTIC Science & Technology

    1990-08-01

    variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a...A unified approximation framework for parameter estimation in general linear PDE models has been completed...This framework has provided the theoretical basis for a number of

  12. (In)dependence of 𝜃 in the Higgs regime without axions

    NASA Astrophysics Data System (ADS)

    Shifman, Mikhail; Vainshtein, Arkady

    2017-05-01

    We revisit the issue of the vacuum angle 𝜃 dependence in weakly coupled (Higgsed) Yang-Mills theories. The two most popular mechanisms for eliminating physical 𝜃 dependence are massless quarks and axions. Anselm and Johansen noted that the vacuum angle 𝜃EW, associated with the electroweak SU(2) in the Glashow-Weinberg-Salam model (Standard Model, SM), is unobservable although all fermion fields obtain masses through Higgsing and there is no axion. We generalize this idea to a broad class of Higgsed Yang-Mills theories. In the second part, we consider the consequences of Grand Unification. We start from a unifying group, e.g. SU(5), at a high ultraviolet scale and evolve the theory down within the Wilson procedure. If on the way to the infrared the unifying group is broken down into a few factors, all factor groups inherit one and the same 𝜃 angle, that of the unifying group. We show that embedding the SM in SU(5) drastically changes the Anselm-Johansen conclusion: the electroweak vacuum angle 𝜃EW, equal to 𝜃QCD, becomes in principle observable in ΔB = ΔL = ±1 processes. We also note in passing that if the axion mechanism is set up above the unification scale, we have one and the same axion in the electroweak theory and QCD, and their impacts are interdependent.

  13. A unified computational model of the development of object unity, object permanence, and occluded object trajectory perception.

    PubMed

    Franz, A; Triesch, J

    2010-12-01

    The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research a unified model of the development of these abilities is still missing. Here we make an attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops, the role of the width of the occluders, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Advances in heat conduction models and approaches for the prediction of lattice thermal conductivity of dielectric materials

    NASA Astrophysics Data System (ADS)

    Saikia, Banashree

    2017-03-01

    An overview of the predominant theoretical models used for predicting the thermal conductivities of dielectric materials is given, and the criteria underlying the different models are explained. The overview highlights a unified theory of temperature-dependent thermal conductivity in which drift of the equilibrium phonon distribution function, caused by normal three-phonon scattering processes, transfers phonon momentum (a) within the same phonon modes (KK-S model) and (b) across phonon modes (KK-H model). Estimates of the lattice thermal conductivities of LiF and Mg2Sn for the KK-H model are presented graphically.

  15. Investigation of Turbulent Entrainment-Mixing Processes With a New Particle-Resolved Direct Numerical Simulation Model

    DOE PAGES

    Gao, Zheng; Liu, Yangang; Li, Xiaolin; ...

    2018-02-19

    Here, a new particle-resolved three-dimensional direct numerical simulation (DNS) model is developed that combines Lagrangian droplet tracking with the Eulerian field representation of turbulence near the Kolmogorov microscale. Six numerical experiments are performed to investigate the processes of entrainment of clear air and subsequent mixing with cloudy air and their interactions with cloud microphysics. The experiments are designed to represent different combinations of three configurations of initial cloudy area and two turbulence modes (decaying and forced turbulence). Five existing measures of microphysical homogeneous mixing degree are examined, modified, and compared in terms of their ability as a unifying measure to represent the effect of various entrainment-mixing mechanisms on cloud microphysics. Also examined and compared are the conventional Damköhler number and transition scale number as dynamical measures of different mixing mechanisms. Relationships between the various microphysical measures and dynamical measures are investigated in search of a unified parameterization of entrainment-mixing processes. The results show that even with the same cloud water fraction, the thermodynamic and microphysical properties are different, especially for the decaying cases. Further analysis confirms that despite the detailed differences in cloud properties among the six simulation scenarios, the variety of turbulent entrainment-mixing mechanisms can be reasonably represented with power-law relationships between the microphysical homogeneous mixing degrees and the dynamical measures.

  17. Unified space-time trigonometry and its applications to relativistic kinematics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaccarini, A.

    1973-06-15

    A geometrical approach to relativistic kinematics is presented. Owing to a unified space-time trigonometry, the spherical trigonometry formalism may be used to describe and study the kinematics of any collision process. Lorentz transformations may thus be treated as purely geometrical problems. A different way to define a unified trigonometry is also proposed, which is based on the spinor representation of the Lorentz group. This leads to a different and more general formalism than the former one.

  18. Unified dead-time compensation structure for SISO processes with multiple dead times.

    PubMed

    Normey-Rico, Julio E; Flesch, Rodolfo C C; Santos, Tito L M

    2014-11-01

    This paper proposes a dead-time compensation structure for processes with multiple dead times. The controller is based on the filtered Smith predictor (FSP) dead-time compensator structure and it is able to control stable, integrating, and unstable processes with multiple input/output dead times. An equivalent model of the process is first computed in order to define the predictor structure. Using this equivalent model, the primary controller and the predictor filter are tuned to obtain an internally stable closed-loop system which also attempts some closed-loop specifications in terms of set-point tracking, disturbance rejection, and robustness. Some simulation case studies are used to illustrate the good properties of the proposed approach. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
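    The core predictor idea can be sketched for the single-delay case (a minimal classic Smith predictor with a PI primary controller; the discrete first-order plant and the gains below are illustrative assumptions, not the paper's FSP tuning or its robustness filter):

```python
import numpy as np

# Minimal classic Smith predictor sketch: a PI controller acts on the
# delay-free model prediction plus a model-mismatch correction term.
# Plant and gains are illustrative assumptions, not the paper's FSP design.
d = 10                     # plant dead time, in samples
a, b = 0.9, 0.1            # first-order plant: y[k+1] = a*y[k] + b*u[k-d]
Kp, Ki = 1.0, 0.1          # PI gains for the delay-free loop
n = 400
r = 1.0                    # set-point

y = np.zeros(n)            # plant output
ym = np.zeros(n)           # delay-free model output
u = np.zeros(n)
I = 0.0                    # integrator state
for k in range(n - 1):
    y_del = ym[k - d] if k >= d else 0.0   # delayed model output
    # predictor feedback: delay-free prediction + (plant - delayed model)
    fb = ym[k] + (y[k] - y_del)
    e = r - fb
    I += e
    u[k] = Kp * e + Ki * I
    ym[k + 1] = a * ym[k] + b * u[k]
    u_del = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_del

print(y[-1])  # settles at the set-point despite the dead time
```

    With a perfect model the delayed-model output cancels the measured output, so the controller effectively sees a delay-free loop; roughly speaking, the FSP of the paper adds a filter on the correction term to shape disturbance rejection and robustness, and extends the construction to multiple dead times.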

  19. The jABC Approach to Rigorous Collaborative Development of SCM Applications

    NASA Astrophysics Data System (ADS)

    Hörmann, Martina; Margaria, Tiziana; Mender, Thomas; Nagel, Ralf; Steffen, Bernhard; Trinh, Hong

    Our approach to the model-driven collaborative design of IKEA's P3 Delivery Management Process uses the jABC [9] for model driven mediation and choreography to complement a RUP-based (Rational Unified Process) development process. jABC is a framework for service development based on Lightweight Process Coordination. Users (product developers and system/software designers) easily develop services and applications by composing reusable building-blocks into (flow-) graph structures that can be animated, analyzed, simulated, verified, executed, and compiled. This way of handling the collaborative design of complex embedded systems has proven to be effective and adequate for the cooperation of non-programmers and non-technical people, which is the focus of this contribution, and it is now being rolled out in the operative practice.

  20. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    NASA Astrophysics Data System (ADS)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  1. Towards a unified understanding of event-related changes in the EEG: the firefly model of synchronization through cross-frequency phase modulation.

    PubMed

    Burgess, Adrian P

    2012-01-01

    Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for i) the shape and timing of ERPs at different scalp sites, ii) the event-related desynchronization in alpha and synchronization in theta, and iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing.

  3. A Global System for Transportation Simulation and Visualization in Emergency Evacuation Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Wei; Liu, Cheng; Thomas, Neil

    2015-01-01

    Simulation-based studies are frequently used for evacuation planning and decision-making processes. Given the complexity of transportation systems and data availability, most evacuation simulation models focus on certain geographic areas. With routine improvement of OpenStreetMap road networks and LandScan™ global population distribution data, we present WWEE, a uniform system for world-wide emergency evacuation simulations. WWEE uses a unified data structure for simulation inputs. It also integrates a super-node trip distribution model as the default simulation parameter to improve the system's computational performance. Two levels of visualization tools are implemented for evacuation performance analysis, including link-based macroscopic visualization and vehicle-based microscopic visualization. For left-hand and right-hand traffic patterns in different countries, the authors propose a mirror technique to experiment with both scenarios without significantly changing traffic simulation models. Ten cities in the US, Europe, the Middle East, and Asia are modeled for demonstration. With default traffic simulation models for fast and easy-to-use evacuation estimation and visualization, WWEE also retains the capability of interactive operation for users to adopt customized traffic simulation models. For the first time, WWEE provides a unified platform for global evacuation researchers to estimate and visualize the performance of their strategies for transportation systems under evacuation scenarios.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Yinan; Shi Handuo; Xiong Zhaoxi

    We present a unified universal quantum cloning machine that combines several existing universal cloning machines, including the asymmetric case. In this unified framework, the identical pure input states are projected equally into each copy, initially constituted by the input and one half of the maximally entangled states. We show explicitly that the output states of those universal cloning machines are the same. One important feature of this unified cloning machine is that the cloning process is always the symmetric projection, which dramatically reduces the difficulty of implementation. We also find that this unified cloning machine can be directly modified to the general asymmetric case. Besides the global fidelity and the single-copy fidelity, we also present all possible arbitrary-copy fidelities.
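For context (a standard benchmark from the cloning literature, not stated in this abstract): the optimal single-copy fidelity for symmetric universal N → M qubit cloning, which any unified universal machine must reproduce in its symmetric limit, is

```latex
% Gisin-Massar optimal single-copy fidelity for symmetric universal
% N -> M qubit cloning; the familiar 1 -> 2 case gives 5/6.
F_{N \to M} = \frac{MN + M + N}{M(N + 2)}, \qquad F_{1 \to 2} = \frac{5}{6}.
```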

  5. [Review of current classification and terminology of vulvar disorders].

    PubMed

    Sláma, J

    2012-08-01

    To summarize current terminology and classification of vulvar disorders. Review article. Gynecologic oncology center, Department of Gynecology and Obstetrics, General Faculty Hospital and 1st Medical School of Charles University, Prague. Vulvar disorders include a wide spectrum of diagnoses, and multidisciplinary collaboration is frequently needed in the diagnostic and therapeutic process. For comprehensible communication between different medical professions, it is essential to use unified terminology based on standard dermatological terms, together with a unified classification. The current classification, based on clinical-pathological criteria, was established by the International Society for the Study of Vulvovaginal Disease. Recently, a clinical classification was introduced that groups disorders according to the main morphological finding. Adequate and unified classification and terminology are necessary for effective communication during the diagnostic process.

  6. PIM Pedagogy: Toward a Loosely Unified Model for Teaching and Studying Comics and Graphic Novels

    ERIC Educational Resources Information Center

    Carter, James B.

    2015-01-01

    The article debuts and explains "PIM" pedagogy, a construct for teaching comics at the secondary- and post-secondary levels and for deep reading/studying comics. The PIM model for considering comics is actually based in major precepts of education studies, namely constructivist foundations of learning, and loosely unifies constructs…

  7. Integration Defended: Berkeley Unified's Strategy to Maintain School Diversity

    ERIC Educational Resources Information Center

    Chavez, Lisa; Frankenberg, Erica

    2009-01-01

    In June 2007, the Supreme Court limited the tools that school districts could use to voluntarily integrate schools. In the aftermath of the decision, educators around the country have sought models of successful plans that would also be legal. One such model may be Berkeley Unified School District's (BUSD) plan. Earlier this year, the California…

  8. Theoretical Foundation of Copernicus: A Unified System for Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Ocampo, Cesar; Senent, Juan S.; Williams, Jacob

    2010-01-01

    The fundamental methods are described for the general spacecraft trajectory design and optimization software system called Copernicus. The methods rely on a unified framework that is used to model, design, and optimize spacecraft trajectories that may operate in complex gravitational force fields, use multiple propulsion systems, and involve multiple spacecraft. The trajectory model, with its associated equations of motion and maneuver models, is discussed.

  9. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between these methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This method does not contradict traditional knowledge representation methods: it can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and it has a strong problem-solving capability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
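The input → processing → output structure can be illustrated with a minimal sketch; the class and names below are hypothetical, not taken from the paper.

```python
# Minimal illustrative sketch (class and names are hypothetical, not from the
# paper) of the proposed input -> processing -> output structure.
from dataclasses import dataclass
from typing import Callable

@dataclass
class KnowledgeUnit:
    name: str
    process: Callable[[dict], dict]  # the transformation applied to inputs

    def apply(self, inputs: dict) -> dict:
        return self.process(inputs)

# Process knowledge "convert Celsius to Fahrenheit" expressed as a unit
unit = KnowledgeUnit("c_to_f", lambda i: {"fahrenheit": i["celsius"] * 9 / 5 + 32})
out = unit.apply({"celsius": 100})  # {"fahrenheit": 212.0}
```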

  10. Electron heating in a Monte Carlo model of a high Mach number, supercritical, collisionless shock

    NASA Technical Reports Server (NTRS)

    Ellison, Donald C.; Jones, Frank C.

    1987-01-01

    Preliminary work in the investigation of electron injection and acceleration at parallel shocks is presented. A simple model of electron heating that is derived from a unified shock model which includes the effects of an electrostatic potential jump is described. The unified shock model provides a kinetic description of the injection and acceleration of ions and a fluid description of electron heating at high Mach number, supercritical, and parallel shocks.

  11. Toward a Unified Theory of Human Reasoning.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1986-01-01

    The goal of this unified theory of human reasoning is to specify what constitutes reasoning and to characterize the psychological distinction between inductive and deductive reasoning. The theory views reasoning as the controlled and mediated application of three processes (encoding, comparison and selective combination) to inferential rules. (JAZ)

  12. Models and methods in delay discounting.

    PubMed

    Tesch, Aaron D; Sanfey, Alan G

    2008-04-01

    Delay discounting (DD) is a term typically used to describe the devaluation of rewards over time, and much research across a wide variety of domains has illustrated that people in general prefer a smaller reward delivered soon as opposed to a larger reward delivered at a later stage. Despite numerous attempts, a single unified model of DD that accounts for the varied pattern of results typically observed has been elusive. One of the difficulties in deriving a unified model is the presence of many framing and context effects, situations in which changing, apparently irrelevant, aspects of the choice scenarios lead to different selections. Additionally, different paradigms of DD research use quite different methodology, which poses challenges for a unified model. This chapter describes some of the difficulties in creating a single DD model and suggests some experiments that would help integrate different paradigms to create a clearer picture of DD.
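Two of the standard single-process candidates that the DD literature compares (well-known forms from that literature, not the unified model the chapter seeks) are exponential and hyperbolic discounting. A minimal sketch:

```python
import math

# Two standard single-process discounting models from the DD literature
# (well-known forms, not the unified model the chapter seeks).

def exponential_discount(amount, delay, k):
    """Constant-rate devaluation: V = A * exp(-k * D)."""
    return amount * math.exp(-k * delay)

def hyperbolic_discount(amount, delay, k):
    """Mazur-style hyperbolic devaluation: V = A / (1 + k * D)."""
    return amount / (1.0 + k * delay)

# $100 delayed 30 days with k = 0.05/day: the hyperbolic form discounts
# long delays less steeply than the exponential form.
v_exp = exponential_discount(100.0, 30.0, 0.05)  # ~22.31
v_hyp = hyperbolic_discount(100.0, 30.0, 0.05)   # 40.0
```

The divergence between the two forms at long delays is one reason framing and methodological differences across paradigms make a single unified model hard to pin down.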

  13. Unified constitutive material models for nonlinear finite-element structural analysis. [gas turbine engine blades and vanes

    NASA Technical Reports Server (NTRS)

    Kaufman, A.; Laflen, J. H.; Lindholm, U. S.

    1985-01-01

    Unified constitutive material models were developed for structural analyses of aircraft gas turbine engine components with particular application to isotropic materials used for high-pressure stage turbine blades and vanes. Forms or combinations of models independently proposed by Bodner and Walker were considered. These theories combine time-dependent and time-independent aspects of inelasticity into a continuous spectrum of behavior. This is in sharp contrast to previous classical approaches that partition inelastic strain into uncoupled plastic and creep components. Predicted stress-strain responses from these models were evaluated against monotonic and cyclic test results for uniaxial specimens of two cast nickel-base alloys, B1900+Hf and Rene' 80. Previously obtained tension-torsion test results for Hastelloy X alloy were used to evaluate multiaxial stress-strain cycle predictions. The unified models, as well as appropriate algorithms for integrating the constitutive equations, were implemented in finite-element computer codes.
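The contrast between unified and partitioned inelasticity can be illustrated with a toy calculation. The power-law flow rule and every parameter value below are hypothetical stand-ins, not Bodner's or Walker's actual equations: the point is only that a single continuous inelastic strain rate replaces separate plastic and creep terms.

```python
# Toy illustration of "unified" viscoplasticity (hypothetical power-law flow
# rule and parameters, not Bodner's or Walker's actual equations): a single
# continuous inelastic strain rate replaces separate plastic and creep terms.
E = 200e3            # elastic modulus [MPa]
A, n = 3.2e-18, 5.0  # flow-rule coefficient and stress exponent
total_rate = 1e-4    # applied total strain rate [1/s]
dt, steps = 0.01, 5000

stress, eps_in = 0.0, 0.0
for _ in range(steps):
    # Inelastic flow depends continuously on stress (no yield-surface switch).
    eps_in_rate = A * abs(stress) ** n * (1.0 if stress >= 0 else -1.0)
    stress += dt * E * (total_rate - eps_in_rate)
    eps_in += dt * eps_in_rate
# Stress rises elastically, then saturates (near 500 MPa with these numbers)
# once inelastic flow balances the applied rate: A * stress**n == total_rate.
```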

  14. Large ensemble and large-domain hydrologic modeling: Insights from SUMMA applications in the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ou, G.; Nijssen, B.; Nearing, G. S.; Newman, A. J.; Mizukami, N.; Clark, M. P.

    2016-12-01

    The Structure for Unifying Multiple Modeling Alternatives (SUMMA) provides a unifying framework for process-based hydrologic modeling by defining a general set of conservation equations for mass and energy, with the capability to incorporate multiple choices for spatial discretizations and flux parameterizations. In this study, we provide a first demonstration of large-scale hydrologic simulations using SUMMA through an application to the Columbia River Basin (CRB) in the northwestern United States and Canada over a multi-decadal simulation period. The CRB is discretized into 11,723 hydrologic response units (HRUs) according to the United States Geological Survey Geospatial Fabric. Soil parameters are derived from the Natural Resources Conservation Service Soil Survey Geographic (SSURGO) Database. Land cover parameters are based on the 2001 National Land Cover Database created by the Multi-Resolution Land Characteristics (MRLC) Consortium. The forcing data, including hourly air pressure, temperature, specific humidity, wind speed, precipitation, and shortwave and longwave radiation, are based on Phase 2 of the North American Land Data Assimilation System (NLDAS-2) and averaged for each HRU. The simulation results are compared to simulations with the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS). We are particularly interested in SUMMA's capability to mimic the behavior of the other two models through the selection of appropriate model parameterizations in SUMMA.

  15. Implicit gas-kinetic unified algorithm based on multi-block docking grid for multi-body reentry flows covering all flow regimes

    NASA Astrophysics Data System (ADS)

    Peng, Ao-Ping; Li, Zhi-Hui; Wu, Jun-Lin; Jiang, Xin-Yu

    2016-12-01

    Based on previous research on the Gas-Kinetic Unified Algorithm (GKUA) for flows ranging from the highly rarefied free-molecule and transition regimes to the continuum, a new implicit cell-centered finite volume scheme is presented for directly solving the unified Boltzmann model equation covering all flow regimes. In view of the difficulty of generating a high-quality single-block grid for complex irregular bodies, a multi-block docking grid generation method is designed on the basis of data transmission between blocks, and a data structure is constructed to process arbitrary connection relations between blocks with high efficiency and reliability. As a result, the gas-kinetic unified algorithm with the implicit scheme and multi-block docking grid is established for the first time and used to solve reentry flow problems around multiple bodies over the whole range of Knudsen numbers from 10 to 3.7E-6. The implicit and explicit schemes are applied to computing and analyzing supersonic flows in the near-continuum and continuum regimes around a circular cylinder, with careful comparison between the two. It is shown that the present algorithm and modelling possess much higher computational efficiency and faster convergence. Flow problems involving two and three side-by-side cylinders are simulated from highly rarefied to near-continuum regimes, and the computed results agree well with related DSMC simulations and theoretical analysis, verifying the accuracy and reliability of the present method. It is observed that the smaller the spacing between the bodies, the greater the obstruction at the cylindrical throat, the more asymmetric the flow field around each single body, and the larger the normal force coefficient. In the near-continuum transitional regime of near-space flight conditions, when the spacing between the bodies increases to six times the single-body diameter, the interference effects between the bodies become negligible. Computing practice has confirmed that the present method is feasible for computing the aerodynamics and revealing the flow mechanisms around complex multi-body vehicles covering all flow regimes, from the gas-kinetic point of view of solving the unified Boltzmann model velocity distribution function equation.

  16. How High Pressure Unifies Solvation Processes in Liquid Chromatography.

    PubMed

    Bocian, Szymon; Škrinjar, Tea; Bolanca, Tomislav; Buszewski, Bogusław

    2017-11-01

    A series of core-shell-based stationary phases of varying surface chemistry were subjected to solvent adsorption investigation under ultra-HPLC conditions. Acetonitrile and water excess isotherms were measured using a minor disturbance method. It was observed that adsorption of organic solvent is unified under high pressure. Preferential solvation due to specific interactions between the stationary phases and solvent molecules was limited. The obtained results showed that the solvation process is almost independent of surface chemistry, in contrast to HPLC conditions in which specific interactions differentiate solvation processes.

  17. Recent Theoretical Studies On Excitation and Recombination

    NASA Technical Reports Server (NTRS)

    Pradhan, Anil K.

    2000-01-01

    New advances in the theoretical treatment of atomic processes in plasmas are described. These enable not only an integrated, unified, and self-consistent treatment of important radiative and collisional processes, but also large-scale computation of atomic data with high accuracy. An extension of the R-matrix work, from excitation and photoionization to electron-ion recombination, includes a unified method that subsumes both the radiative and the dielectronic recombination processes in an ab initio manner. The extensive collisional calculations for iron and iron-peak elements under the Iron Project are also discussed.

  18. Kinetics of heavy metal adsorption and desorption in soil: Developing a unified model based on chemical speciation

    NASA Astrophysics Data System (ADS)

    Peng, Lanfang; Liu, Paiyu; Feng, Xionghan; Wang, Zimeng; Cheng, Tao; Liang, Yuzhen; Lin, Zhang; Shi, Zhenqing

    2018-03-01

    Predicting the kinetics of heavy metal adsorption and desorption in soil requires consideration of multiple heterogeneous soil binding sites and variations of reaction chemistry conditions. Although chemical speciation models have been developed for predicting the equilibrium of metal adsorption on soil organic matter (SOM) and important mineral phases (e.g. Fe and Al (hydr)oxides), there is still a lack of modeling tools for predicting the kinetics of metal adsorption and desorption reactions in soil. In this study, we developed a unified model for the kinetics of heavy metal adsorption and desorption in soil based on the equilibrium models WHAM 7 and CD-MUSIC, which specifically consider metal kinetic reactions with multiple binding sites of SOM and soil minerals simultaneously. For each specific binding site, metal adsorption and desorption rate coefficients were constrained by the local equilibrium partition coefficients predicted by WHAM 7 or CD-MUSIC, and, for each metal, the desorption rate coefficients of various binding sites were constrained by their metal binding constants with those sites. The model had only one fitting parameter for each soil binding phase, and all other parameters were derived from WHAM 7 and CD-MUSIC. A stirred-flow method was used to study the kinetics of Cd, Cu, Ni, Pb, and Zn adsorption and desorption in multiple soils under various pH and metal concentrations, and the model successfully reproduced most of the kinetic data. We quantitatively elucidated the significance of different soil components and important soil binding sites during the adsorption and desorption kinetic processes. Our model has provided a theoretical framework to predict metal adsorption and desorption kinetics, which can be further used to predict the dynamic behavior of heavy metals in soil under various natural conditions by coupling other important soil processes.
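The central constraint described above, desorption rate coefficients tied to local equilibrium partition coefficients, can be sketched for a single binding site. The parameter values below are hypothetical illustrations, not the authors' WHAM 7/CD-MUSIC-derived calibration.

```python
# Illustrative single-site sketch (hypothetical parameter values, not the
# authors' calibrated model): first-order adsorption/desorption kinetics in
# which the desorption rate constant is constrained by the site's equilibrium
# partition coefficient, k_des = k_ads / K, as the unified model prescribes.

def simulate_site(c, K, k_ads, q0=0.0, dt=0.01, steps=1000):
    """Explicit Euler for dq/dt = k_ads * c - k_des * q, with k_des = k_ads / K."""
    k_des = k_ads / K
    q = q0
    for _ in range(steps):
        q += dt * (k_ads * c - k_des * q)
    return q

# With dissolved metal c = 1.0 and K = 5.0, adsorbed q relaxes toward the
# equilibrium value K * c = 5.0; K sets the endpoint, k_ads sets the speed.
q_final = simulate_site(c=1.0, K=5.0, k_ads=2.0)
```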

  19. Towards a unified theory of neocortex: laminar cortical circuits for vision and cognition.

    PubMed

    Grossberg, Stephen

    2007-01-01

    A key goal of computational neuroscience is to link brain mechanisms to behavioral functions. The present article describes recent progress towards explaining how laminar neocortical circuits give rise to biological intelligence. These circuits embody two new and revolutionary computational paradigms: Complementary Computing and Laminar Computing. Circuit properties include a novel synthesis of feedforward and feedback processing, of digital and analog processing, and of preattentive and attentive processing. This synthesis clarifies the appeal of Bayesian approaches but has a far greater predictive range that naturally extends to self-organizing processes. Examples from vision and cognition are summarized. A LAMINART architecture unifies properties of visual development, learning, perceptual grouping, attention, and 3D vision. A key modeling theme is that the mechanisms which enable development and learning to occur in a stable way imply properties of adult behavior. It is noted how higher-order attentional constraints can influence multiple cortical regions, and how spatial and object attention work together to learn view-invariant object categories. In particular, a form-fitting spatial attentional shroud can allow an emerging view-invariant object category to remain active while multiple view categories are associated with it during sequences of saccadic eye movements. Finally, the chapter summarizes recent work on the LIST PARSE model of cognitive information processing by the laminar circuits of prefrontal cortex. LIST PARSE models the short-term storage of event sequences in working memory, their unitization through learning into sequence, or list, chunks, and their read-out in planned sequential performance that is under volitional control. LIST PARSE provides a laminar embodiment of Item and Order working memories, also called Competitive Queuing models, that have been supported by both psychophysical and neurobiological data. 
These examples show how variations of a common laminar cortical design can embody properties of visual and cognitive intelligence that seem, at least on the surface, to be mechanistically unrelated.

  20. Two-stage unified stretched-exponential model for time-dependence of threshold voltage shift under positive-bias-stresses in amorphous indium-gallium-zinc oxide thin-film transistors

    NASA Astrophysics Data System (ADS)

    Jeong, Chan-Yong; Kim, Hee-Joong; Hong, Sae-Young; Song, Sang-Hun; Kwon, Hyuck-In

    2017-08-01

    In this study, we show that the two-stage unified stretched-exponential model can more exactly describe the time-dependence of threshold voltage shift (ΔV TH) under long-term positive-bias-stresses compared to the traditional stretched-exponential model in amorphous indium-gallium-zinc oxide (a-IGZO) thin-film transistors (TFTs). ΔV TH is mainly dominated by electron trapping at short stress times, and the contribution of trap state generation becomes significant with an increase in the stress time. The two-stage unified stretched-exponential model can provide useful information not only for evaluating the long-term electrical stability and lifetime of the a-IGZO TFT but also for understanding the stress-induced degradation mechanism in a-IGZO TFTs.
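The classic stretched-exponential form for bias-stress-induced ΔV TH is well established; a two-stage variant can be sketched as the sum of a fast trapping term and a slower trap-generation term. The additive form and all parameter values here are illustrative assumptions, not necessarily the authors' exact formulation.

```python
import math

# Stretched-exponential threshold-voltage-shift sketch. The single-stage form
# is the classic bias-stress model; the additive two-stage form and all
# parameter values are illustrative assumptions, not necessarily the authors'
# exact formulation.

def stretched_exp(t, dV0, tau, beta):
    """dV_TH(t) = dV0 * (1 - exp(-(t / tau)**beta))."""
    return dV0 * (1.0 - math.exp(-((t / tau) ** beta)))

def two_stage_shift(t, trap, gen):
    """Fast electron-trapping stage plus a slower trap-generation stage."""
    return stretched_exp(t, *trap) + stretched_exp(t, *gen)

trap = (1.0, 1e2, 0.5)  # (dV0 [V], tau [s], beta) -- fast trapping
gen = (2.0, 1e5, 0.4)   # slower trap-state generation
shift_short = two_stage_shift(1e1, trap, gen)  # trapping dominates early
shift_long = two_stage_shift(1e6, trap, gen)   # generation adds at long stress
```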

  1. A Unified Mathematical Definition of Classical Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2000-01-01

    Presents a unified mathematical definition for the classical models of information retrieval and identifies a mathematical structure behind relevance feedback. Highlights include vector information retrieval; probabilistic information retrieval; and similarity information retrieval. (Contains 118 references.) (Author/LRW)

  2. Concept of Draft International Standard for a Unified Approach to Space Program Quality Assurance

    NASA Astrophysics Data System (ADS)

    Stryzhak, Y.; Vasilina, V.; Kurbatov, V.

    2002-01-01

    For want of a unified approach to guaranteeing space project and product quality, implementation of many international space programs has become a challenge. Globalization of the aerospace industry, and the participation of international ventures with diverse quality assurance requirements in large international space programs, calls for the urgent development of unified international standards in this field. To ensure successful fulfillment of space missions, aerospace companies should design and produce reliable and safe products with properties that comply with or exceed the user's (or customer's) requirements. The quality of products designed or produced by subcontractors (or other suppliers) should also comply with the main user's (customer's) requirements. Implementation of this involved set of unified requirements will be made possible by creating and approving a system (series) of international standards under the generic title Space Product Quality Assurance, based on a consensus principle. 
Conceptual features of the baseline standard in this system (series) should comprise: - Procedures for adapting ISO 9000, CEN, and ECSS requirements and introducing them into space product creation, design, manufacture, testing, and operation; - Procedures for quality assurance at the initial (design) phases of space programs, with decisions on the end product made on the principle of independence; - Procedures for arranging incoming inspection of products delivered by subcontractors (including testing, audits of supplier procedures, and review of supplier documentation), and for space product certification; - Procedures for identifying the materials and primary products applied; - Procedures for quality system audits at the facilities of suppliers of component parts, primary products, and materials; - Unified procedures for forming a list of basic performances to be placed under configuration management; - Unified procedures for forming a list of critical space product components, and unified procedures for defining the risks related to specific component applications and evaluating safety for the entire program. In the authors' view, these features, together with a number of other conceptual proposals, should constitute a unified standard and technical basis for implementing international space programs.

  3. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  4. Creativity and Quantum Physics: a New World View Unifying Current Theories of Creativity and Pointing Toward New Research Methodologies.

    NASA Astrophysics Data System (ADS)

    McCarthy, Kimberly Ann

    1990-01-01

    Divisions in definitions of creativity have centered primarily on the working definition of discontinuity and the inclusion of intrinsic features such as unconscious processing and intrinsic motivation and reinforcement. These differences generally result from Cohen's two world views underlying theories of creativity: organismic, oriented toward holism; or mechanistic, oriented toward cause-effect reductionism. A quantum world view is proposed that theoretically and empirically unifies the organismic and mechanistic elements of creativity. Based on Goswami's Idealistic Interpretation of quantum physics, the quantum view postulates the mind-brain as consisting of both classical and quantum structures and functions. The quantum domain accesses the transcendent order through coherent superpositions (a state of potentialities), while the classical domain performs the function of a measuring apparatus by amplifying and recording the result of the collapse of the pure mental state. A theoretical experiment, based on the 1980 Marcel study of conscious and unconscious word-sense disambiguation, is conducted to compare the predictions of the quantum model with those of the 1975 Posner and Snyder Facilitation and Inhibition model. Both models agree that while conscious access to information is limited, unconscious access is unlimited; however, they define the connection between these states differently: the Posner model postulates a central processing mechanism, while the quantum model postulates a self-referential consciousness. Consequently, the two models make different predictions. The strength of the quantum model lies in its ability to distinguish between classical and quantum definitions of discontinuity, and to clarify the function of consciousness, without added assumptions or ad hoc analysis: consciousness is an essential, valid feature of quantum mechanics independent of the field of cognitive psychology. 
According to the quantum model, through a cycle of conscious and unconscious processing, various contexts are accessed, specifically, coherent superposition states and the removal of the subject-object dichotomy in unconscious processing. Coupled with a high tolerance for ambiguity, the individual has access not only to an increased quantity of information, but is exposed to this information in the absence of a self-referential or biased context, the result of which is an increase in creative behavior.

  5. A unified model of the hierarchical and stochastic theories of gastric cancer

    PubMed Central

    Song, Yanjing; Wang, Yao; Tong, Chuan; Xi, Hongqing; Zhao, Xudong; Wang, Yi; Chen, Lin

    2017-01-01

    Gastric cancer (GC) is a life-threatening disease worldwide. Despite remarkable advances in treatments for GC, it is still fatal to many patients due to cancer progression, recurrence and metastasis. Regarding the development of novel therapeutic techniques, many studies have focused on the biological mechanisms that initiate tumours and cause treatment resistance. Tumours have traditionally been considered to result from somatic mutations, either via clonal evolution or through a stochastic model. However, emerging evidence has characterised tumours using a hierarchical organisational structure, with cancer stem cells (CSCs) at the apex. Both stochastic and hierarchical models are reasonable systems that have been hypothesised to describe tumour heterogeneity. Although each model alone inadequately explains tumour diversity, the two models can be integrated to provide a more comprehensive explanation. In this review, we discuss existing evidence supporting a unified model of gastric CSCs, including the regulatory mechanisms of this unified model in addition to the current status of stemness-related targeted therapy in GC patients. PMID:28301871

  6. Temporal cognition: Connecting subjective time to perception, attention, and memory.

    PubMed

    Matthews, William J; Meck, Warren H

    2016-08-01

    Time is a universal psychological dimension, but time perception has often been studied and discussed in relative isolation. Increasingly, researchers are searching for unifying principles and integrated models that link time perception to other domains. In this review, we survey the links between temporal cognition and other psychological processes. Specifically, we describe how subjective duration is affected by nontemporal stimulus properties (perception), the allocation of processing resources (attention), and past experience with the stimulus (memory). We show that many of these connections instantiate a "processing principle," according to which perceived time is positively related to perceptual vividity and the ease of extracting information from the stimulus. This empirical generalization generates testable predictions and provides a starting-point for integrated theoretical frameworks. By outlining some of the links between temporal cognition and other domains, and by providing a unifying principle for understanding these effects, we hope to encourage time-perception researchers to situate their work within broader theoretical frameworks, and that researchers from other fields will be inspired to apply their insights, techniques, and theorizing to improve our understanding of the representation and judgment of time. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Khramushin, Vasily

    2016-02-01

    The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  8. Retooling Institutional Support Infrastructure for Clinical Research

    PubMed Central

    Snyder, Denise C.; Brouwer, Rebecca N.; Ennis, Cory L.; Spangler, Lindsey L.; Ainsworth, Terry L.; Budinger, Susan; Mullen, Catherine; Hawley, Jeffrey; Uhlenbrauck, Gina; Stacy, Mark

    2016-01-01

    Clinical research activities at academic medical centers are challenging to oversee. Without effective research administration, a continually evolving set of regulatory and institutional requirements can detract investigator and study team attention away from a focus on scientific gain, study conduct, and patient safety. However, even when the need for research administration is recognized, there can be struggles over what form it should take. Central research administration may be viewed negatively, with individual groups preferring to maintain autonomy over processes. Conversely, a proliferation of individualized approaches across an institution can create inefficiencies or invite risk. This article describes experiences establishing a unified research support office at the Duke University School of Medicine based on a framework of customer support. The Duke Office of Clinical Research was formed in 2012 with a vision that research administration at academic medical centers should help clinical investigators navigate the complex research environment and operationalize research ideas. The office provides an array of services that have received high satisfaction ratings. The authors describe the ongoing culture change necessary for success of the unified research support office. Lessons learned from implementation of the Duke Office of Clinical Research may serve as a model for other institutions undergoing a transition to unified research support. PMID:27125563

  9. A Novel Grid SINS/DVL Integrated Navigation Algorithm for Marine Application

    PubMed Central

    Kang, Yingyao; Zhao, Lin; Cheng, Jianhua; Fan, Xiaoliang

    2018-01-01

    Integrated navigation algorithms under the grid frame have been proposed based on the Kalman filter (KF) to solve the problem of navigation in some special regions. However, in the existing study of grid strapdown inertial navigation system (SINS)/Doppler velocity log (DVL) integrated navigation algorithms, the Earth models of the filter dynamic model and the SINS mechanization are not unified. Besides, traditional integrated systems with the KF based correction scheme are susceptible to measurement errors, which would decrease the accuracy and robustness of the system. In this paper, an adaptive robust Kalman filter (ARKF) based hybrid-correction grid SINS/DVL integrated navigation algorithm is designed with the unified reference ellipsoid Earth model to improve the navigation accuracy in middle-high latitude regions for marine application. Firstly, to unify the Earth models, the mechanization of grid SINS is introduced and the error equations are derived based on the same reference ellipsoid Earth model. Then, a more accurate grid SINS/DVL filter model is designed according to the new error equations. Finally, a hybrid-correction scheme based on the ARKF is proposed to resist the effect of measurement errors. Simulation and experiment results show that, compared with the traditional algorithms, the proposed navigation algorithm can effectively improve the navigation performance in middle-high latitude regions by the unified Earth models and the ARKF based hybrid-correction scheme. PMID:29373549
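    The "robust" ingredient of an ARKF can be illustrated with a scalar measurement update that inflates the measurement noise whenever the innovation fails a chi-square consistency check. This is a minimal sketch of the general idea, not the paper's actual grid SINS/DVL filter; the state model, threshold, and all numbers are illustrative assumptions.

```python
def arkf_update(x, P, z, H, R, chi2_thresh=3.841):
    """One scalar adaptive robust Kalman filter measurement update.

    If the normalized innovation squared fails a chi-square test (95%
    threshold for 1 degree of freedom), the measurement noise variance R
    is inflated so the suspect measurement is down-weighted instead of
    corrupting the state estimate.
    """
    y = z - H * x                       # innovation
    S = H * P * H + R                   # innovation variance
    nis = y * y / S                     # normalized innovation squared
    if nis > chi2_thresh:               # measurement looks inconsistent:
        R *= nis / chi2_thresh          # inflate R adaptively
        S = H * P * H + R
    K = P * H / S                       # Kalman gain
    return x + K * y, (1.0 - K * H) * P

# A consistent DVL-like measurement is used almost at face value, while a
# gross outlier is heavily down-weighted rather than hard-rejected.
x_ok, _ = arkf_update(x=0.0, P=0.01, z=0.1, H=1.0, R=0.01)
x_bad, _ = arkf_update(x=0.0, P=0.01, z=10.0, H=1.0, R=0.01)
print(x_ok, x_bad)
```

    A plain Kalman update with the same prior would have pulled the state halfway to the outlier; inflating R keeps the update close to the prior while still accepting consistent measurements.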

  10. A Framework for Modeling and Simulation of the Artificial

    DTIC Science & Technology

    2012-01-01

    y or n) >> y Name: petra Simple Aspects: face_shape/thin, nose/small, skintone/light, hair_color/black, hair_type/curly Integrated Aspects...Multiconference. Orlando, FL (2012) 23. Mittal, S., Risco-Martin, J.: Netcentric System of Systems Engineering with DEVS Unified Process. CRC Press (2012) 24...Mittal, S., Risco-Martin, J., Zeigler, B.: DEVS-based simulation web services for net-centric T&E. In: Proceedings of the 2007 summer computer

  11. Aerosols and Aerosol-related haze forecasting in China Meteorological Adminstration

    NASA Astrophysics Data System (ADS)

    Zhou, Chunhong; Zhang, Xiaoye; Gong, Sunling; Liu, Hongli; Xue, Min

    2017-04-01

    CMA Unified Atmospheric Chemistry Environmental Forecasting System (CUACE) is a unified numerical chemical weather forecasting system treating BC, OC, sulfate, nitrate, ammonia, dust and sea-salt aerosols together with their sources, gas-to-particle processes, SOA, microphysics and transformation. With an open interface, CUACE has been online-coupled to the mesoscale model MM5 and to the new NWP system GRAPES (Global/Regional Assimilation and Prediction Enhanced System) in CMA. With Chinese emissions from Cao and Zhang (2012, 2013), a forecasting system called CUACE/Haze-fog has been running in real time at CMA, issuing 5-day PM10, O3 and visibility forecasts. A comprehensive ACI scheme has also been developed in CUACE. Calculated by a sectional aerosol activation scheme from the size and mass information in CUACE and the thermodynamic and humidity states of the weather model at each time step, the cloud condensation nuclei (CCN) are fed online and interactively into a two-moment cloud scheme (WDM6) and a convective parameterization to drive the cloud physics and precipitation formation processes. The results show that interactive aerosols with WDM6 in CUACE clearly improve the cloud properties and precipitation, with 24% to 48% enhancements of the TS score for 6-h precipitation.

  12. New Model of Mobile Learning for the High School Students Preparing for the Unified State Exam

    ERIC Educational Resources Information Center

    Khasianov, Airat; Shakhova, Irina

    2017-01-01

    In this paper we study a new model of mobile learning for Unified State Exam ("USE") preparation in the Russian Federation. The "USE" is the test that school graduates need to pass in order to obtain the Russian matura. In recent years, the effort teachers put into preparing their students for the "USE" diminishes how well the…

  13. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm.

    PubMed

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration.

  14. A unified structural/terminological interoperability framework based on LexEVS: application to TRANSFoRm

    PubMed Central

    Ethier, Jean-François; Dameron, Olivier; Curcin, Vasa; McGilchrist, Mark M; Verheij, Robert A; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C; Burgun, Anita

    2013-01-01

    Objective Biomedical research increasingly relies on the integration of information from multiple heterogeneous data sources. Despite the fact that structural and terminological aspects of interoperability are interdependent and rely on a common set of requirements, current efforts typically address them in isolation. We propose a unified ontology-based knowledge framework to facilitate interoperability between heterogeneous sources, and investigate if using the LexEVS terminology server is a viable implementation method. Materials and methods We developed a framework based on an ontology, the general information model (GIM), to unify structural models and terminologies, together with relevant mapping sets. This allowed a uniform access to these resources within LexEVS to facilitate interoperability by various components and data sources from implementing architectures. Results Our unified framework has been tested in the context of the EU Framework Program 7 TRANSFoRm project, where it was used to achieve data integration in a retrospective diabetes cohort study. The GIM was successfully instantiated in TRANSFoRm as the clinical data integration model, and necessary mappings were created to support effective information retrieval for software tools in the project. Conclusions We present a novel, unifying approach to address interoperability challenges in heterogeneous data sources, by representing structural and semantic models in one framework. Systems using this architecture can rely solely on the GIM that abstracts over both the structure and coding. Information models, terminologies and mappings are all stored in LexEVS and can be accessed in a uniform manner (implementing the HL7 CTS2 service functional model). The system is flexible and should reduce the effort needed from data sources personnel for implementing and managing the integration. PMID:23571850

  15. A unified parameterization of clouds and turbulence using CLUBB and subcolumns in the Community Atmosphere Model

    DOE PAGES

    Thayer-Calder, K.; Gettelman, A.; Craig, C.; ...

    2015-06-30

    Most global climate models parameterize separate cloud types using separate parameterizations. This approach has several disadvantages, including obscure interactions between parameterizations and inaccurate triggering of cumulus parameterizations. Alternatively, a unified cloud parameterization uses one equation set to represent all cloud types. Such cloud types include stratiform liquid and ice cloud, shallow convective cloud, and deep convective cloud. Vital to the success of a unified parameterization is a general interface between clouds and microphysics. One such interface involves drawing Monte Carlo samples of subgrid variability of temperature, water vapor, cloud liquid, and cloud ice, and feeding the sample points into a microphysics scheme. This study evaluates a unified cloud parameterization and a Monte Carlo microphysics interface that has been implemented in the Community Atmosphere Model (CAM) version 5.3. Results describing the mean climate and tropical variability from global simulations are presented. The new model shows a degradation in precipitation skill but improvements in short-wave cloud forcing, liquid water path, long-wave cloud forcing, precipitable water, and tropical wave simulation. Also presented are estimations of computational expense and investigation of sensitivity to the number of subcolumns.
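    The reason a sampling interface helps is that microphysical process rates are nonlinear in the sampled variables, so the rate evaluated at the grid-box mean can differ sharply from the mean of the rates over the subgrid distribution. A toy sketch of the idea — the threshold autoconversion form, the constants, and the assumed Gaussian subgrid distribution are all invented for illustration:

```python
import random

def autoconversion(ql, ql_crit=2e-4):
    """Toy nonlinear process rate: zero below a cloud-liquid threshold,
    quadratic above it (hypothetical form, not a real scheme)."""
    return 0.0 if ql < ql_crit else 1e3 * (ql - ql_crit) ** 2

def subcolumn_rate(ql_mean, ql_std, n_sub=1000, seed=0):
    """Average the nonlinear process over Monte Carlo subcolumn samples of
    subgrid cloud-liquid variability, instead of evaluating it once at the
    grid-box mean state."""
    rng = random.Random(seed)
    samples = [max(0.0, rng.gauss(ql_mean, ql_std)) for _ in range(n_sub)]
    return sum(autoconversion(q) for q in samples) / n_sub

mean_state_rate = autoconversion(1.5e-4)      # grid mean: below threshold
sampled_rate = subcolumn_rate(1.5e-4, 1e-4)   # subgrid tail is active
print(mean_state_rate, sampled_rate)
```

    Evaluated at the grid mean, the toy process is inactive, while averaging over the sampled subgrid tail yields a nonzero rate; capturing this effect is precisely what a subcolumn interface offers a microphysics scheme.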

  16. A unified parameterization of clouds and turbulence using CLUBB and subcolumns in the Community Atmosphere Model

    DOE PAGES

    Thayer-Calder, Katherine; Gettelman, A.; Craig, Cheryl; ...

    2015-12-01

    Most global climate models parameterize separate cloud types using separate parameterizations. This approach has several disadvantages, including obscure interactions between parameterizations and inaccurate triggering of cumulus parameterizations. Alternatively, a unified cloud parameterization uses one equation set to represent all cloud types. Such cloud types include stratiform liquid and ice cloud, shallow convective cloud, and deep convective cloud. Vital to the success of a unified parameterization is a general interface between clouds and microphysics. One such interface involves drawing Monte Carlo samples of subgrid variability of temperature, water vapor, cloud liquid, and cloud ice, and feeding the sample points into a microphysics scheme. This study evaluates a unified cloud parameterization and a Monte Carlo microphysics interface that has been implemented in the Community Atmosphere Model (CAM) version 5.3. Results describing the mean climate and tropical variability from global simulations are presented. In conclusion, the new model shows a degradation in precipitation skill but improvements in short-wave cloud forcing, liquid water path, long-wave cloud forcing, precipitable water, and tropical wave simulation. Also presented are estimations of computational expense and investigation of sensitivity to the number of subcolumns.

  17. A constitutive material model for nonlinear finite element structural analysis using an iterative matrix approach

    NASA Technical Reports Server (NTRS)

    Koenig, Herbert A.; Chan, Kwai S.; Cassenti, Brice N.; Weber, Richard

    1988-01-01

    A unified numerical method for the integration of stiff time dependent constitutive equations is presented. The solution process is directly applied to a constitutive model proposed by Bodner. The theory confronts time dependent inelastic behavior coupled with both isotropic hardening and directional hardening behaviors. Predicted stress-strain responses from this model are compared to experimental data from cyclic tests on uniaxial specimens. An algorithm is developed for the efficient integration of the Bodner flow equation. A comparison is made with the Euler integration method. An analysis of computational time is presented for the three algorithms.
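    The motivation for a specialized integrator can be seen on the classic stiff linear test equation, where explicit Euler blows up at step sizes an implicit scheme handles easily. This is a generic stiffness illustration under assumed parameters, not the paper's Bodner-model algorithm:

```python
def explicit_euler(k, h, steps, s0=1.0):
    """Explicit Euler on the stiff test equation ds/dt = -k*s, a stand-in
    for a stiff constitutive rate equation. Unstable when h*k > 2."""
    s = s0
    for _ in range(steps):
        s = s + h * (-k * s)
    return s

def implicit_euler(k, h, steps, s0=1.0):
    """Backward Euler: unconditionally stable on this problem."""
    s = s0
    for _ in range(steps):
        s = s / (1.0 + h * k)   # solves s_new = s + h*(-k*s_new)
    return s

k, h, steps = 1000.0, 0.01, 10   # step far above the explicit stability limit
print(abs(explicit_euler(k, h, steps)), implicit_euler(k, h, steps))
```

    With h*k = 10, the explicit iterate multiplies by -9 each step and diverges, while the implicit iterate decays monotonically toward the true solution; this is why stiff flow equations reward dedicated integration algorithms.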

  18. Building social cognitive models of language change.

    PubMed

    Hruschka, Daniel J; Christiansen, Morten H; Blythe, Richard A; Croft, William; Heggarty, Paul; Mufwene, Salikoko S; Pierrehumbert, Janet B; Poplack, Shana

    2009-11-01

    Studies of language change have begun to contribute to answering several pressing questions in cognitive sciences, including the origins of human language capacity, the social construction of cognition and the mechanisms underlying culture change in general. Here, we describe recent advances within a new emerging framework for the study of language change, one that models such change as an evolutionary process among competing linguistic variants. We argue that a crucial and unifying element of this framework is the use of probabilistic, data-driven models both to infer change and to compare competing claims about social and cognitive influences on language change.
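    Modeling language change as an evolutionary process among competing variants can be sketched with a Wright-Fisher-style simulation, where a social bias toward one variant plays the role of selection. This is a minimal caricature of the probabilistic, data-driven models the authors describe; population size, bias, and all other parameters are arbitrary:

```python
import random

def wright_fisher(p0, n_speakers, s, generations, seed=1):
    """Competition between two linguistic variants A and B: each
    generation, every speaker adopts A with probability given by A's
    current frequency weighted by a social bias factor s."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        w = p * (1.0 + s)                 # biased adoption weight for A
        prob = w / (w + (1.0 - p))
        adopters = sum(rng.random() < prob for _ in range(n_speakers))
        p = adopters / n_speakers
        if p in (0.0, 1.0):               # variant fixed or lost
            break
    return p

# With a modest bias toward A, A usually spreads through the population.
final = wright_fisher(p0=0.2, n_speakers=500, s=0.1, generations=200)
print(final)
```

    Setting s = 0 recovers pure drift, which lets competing claims about social versus purely stochastic influences on change be compared against corpus frequency trajectories.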

  19. Visuomotor control, eye movements, and steering: A unified approach for incorporating feedback, feedforward, and internal models.

    PubMed

    Lappi, Otto; Mole, Callum

    2018-06-11

    The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks. It is based on recent empirical research, in particular, on driver eye movements, that poses challenges for existing accounts of how we visually steer a course. They first analyze how the ideas of feedback and feedforward processes and internal models are treated in control theoretical steering models within vision science and engineering, which share an underlying architecture but have historically developed in very separate ways. The authors then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience, to better understand the skill and gaze strategies involved. They then propose a conceptual model that (a) gives a unified account to the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of these (a-c) has been considered before (also in the context of driving), integrating them into a single framework and the authors' multiple waypoint identification hypothesis within that framework are novel. The proposed hypothesis is relevant to all forms of visually guided locomotion. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Exposure Time Distributions reveal Denitrification Rates along Groundwater Flow Path of an Agricultural Unconfined Aquifer

    NASA Astrophysics Data System (ADS)

    Kolbe, T.; Abbott, B. W.; Thomas, Z.; Labasque, T.; Aquilina, L.; Laverman, A.; Babey, T.; Marçais, J.; Fleckenstein, J. H.; Peiffer, S.; De Dreuzy, J. R.; Pinay, G.

    2016-12-01

    Groundwater contamination by nitrate is nearly ubiquitous in agricultural regions. Nitrate is highly mobile in groundwater, and though it can be denitrified in the aquifer (reduced to inert N2 gas), this process requires the simultaneous occurrence of anoxia, an electron donor (e.g. organic carbon, pyrite), nitrate, and microorganisms capable of denitrification. In addition, the ratio of the time groundwater spends in a denitrifying environment (the exposure time) to the characteristic denitrification reaction time plays an important role, because denitrification can occur only if the exposure time is longer than the characteristic reaction time. Despite a long history of field studies and numerical models, it remains exceedingly difficult to measure or model exposure times in the subsurface at the catchment scale. To approach this problem, we developed a unified modelling approach combining measured environmental proxies with an exposure-time-based reactive transport model. We measured groundwater age, nitrogen and sulfur isotopes, and water chemistry from agricultural wells in an unconfined aquifer in Brittany, France, to quantify changes in nitrate concentration due to dilution and denitrification. Field data showed large differences in nitrate concentrations among wells, associated with differences in the exposure time distributions. By constraining a catchment-scale characteristic reaction time for denitrification with water chemistry proxies and exposure times, we were able to assess rates of denitrification along groundwater flow paths. This unified modeling approach is transferable to other catchments and could be further used to investigate how catchment structure and flow dynamics interact with biogeochemical processes such as denitrification.
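    The core bookkeeping of such an approach, weighting first-order denitrification by a distribution of exposure times, can be sketched as follows. The smooth exponential decay is a stand-in for the exposure-time versus reaction-time comparison in the abstract, and the two-well comparison with its times is entirely invented:

```python
import math

def remaining_fraction(exposure_times, tau_c):
    """Fraction of nitrate surviving first-order denitrification, averaged
    over a sampled distribution of exposure times in the reducing zone:
    each flow path decays as exp(-tau / tau_c)."""
    decayed = [math.exp(-t / tau_c) for t in exposure_times]
    return sum(decayed) / len(decayed)

# Two hypothetical wells with short vs long exposure times (years),
# sharing one catchment-scale characteristic reaction time tau_c.
shallow = remaining_fraction([0.5, 1.0, 1.5], tau_c=5.0)
deep = remaining_fraction([10.0, 20.0, 30.0], tau_c=5.0)
print(round(shallow, 3), round(deep, 3))
```

    Wells whose water barely touches the reducing zone retain most of their nitrate, while long-exposure flow paths are strongly denitrified, reproducing the kind of between-well contrast the field data showed.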

  1. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model.

    PubMed

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S; Breen, Lauren J; Witt, Regina R; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework, titled the Health Services Workplace Environmental Resilience Model (HSWERM), is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with the key constructs of psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking them to the nurse and then to impacts on personal resilience and workplace outcomes; its use has the potential to increase staff retention and the quality of patient care.

  2. Dark Matter from SUGRA GUTs: mSUGRA, NUSUGRA and Yukawa-unified SUGRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baer, Howard

    2009-09-08

    Gravity-mediated SUSY breaking models with R-parity conservation give rise to dark matter in the universe. I review neutralino dark matter in the minimal supergravity model (mSUGRA), models with non-universal soft SUSY breaking terms (NUSUGRA) which yield a well-tempered neutralino, and models with unified Yukawa couplings at the GUT scale (as may occur in an SO(10) SUSY GUT theory). These latter models have difficulty accommodating neutralino dark matter, but work very well if the dark matter particles are axions and axinos.

  3. Development of a unified constitutive model for an isotropic nickel base superalloy Rene 80

    NASA Technical Reports Server (NTRS)

    Ramaswamy, V. G.; Vanstone, R. H.; Laflen, J. H.; Stouffer, D. C.

    1988-01-01

    Accurate analysis of stress-strain behavior is of critical importance in the evaluation of life capabilities of hot section turbine engine components such as turbine blades and vanes. The constitutive equations used in the finite element analysis of such components must be capable of modeling a variety of complex behavior exhibited at high temperatures by cast superalloys. The classical separation of plasticity and creep employed in most of the finite element codes in use today is known to be deficient in modeling elevated temperature time dependent phenomena. Rate dependent, unified constitutive theories can overcome many of these difficulties. A new unified constitutive theory was developed to model the high temperature, time dependent behavior of Rene' 80 which is a cast turbine blade and vane nickel base superalloy. Considerations in model development included the cyclic softening behavior of Rene' 80, rate independence at lower temperatures and the development of a new model for static recovery.

  4. Epidemic Percolation Networks, Epidemic Outcomes, and Interventions

    DOE PAGES

    Kenah, Eben; Miller, Joel C.

    2011-01-01

    Epidemic percolation networks (EPNs) are directed random networks that can be used to analyze stochastic “Susceptible-Infectious-Removed” (SIR) and “Susceptible-Exposed-Infectious-Removed” (SEIR) epidemic models, unifying and generalizing previous uses of networks and branching processes to analyze mass-action and network-based S(E)IR models. This paper explains the fundamental concepts underlying the definition and use of EPNs, using them to build intuition about the final outcomes of epidemics. We then show how EPNs provide a novel and useful perspective on the design of vaccination strategies.
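    The EPN construction can be sketched for a simple mass-action SIR model: decide once, for each ordered pair of individuals, whether infection *would* be transmitted, then read off the final epidemic size as a reachability problem on the resulting directed graph. The population size and probabilities below are illustrative assumptions:

```python
import random

def epidemic_percolation_network(n, p_contact, p_transmit, seed=2):
    """Directed EPN: draw an edge i -> j whenever contact would occur and
    transmission would succeed, independently for each ordered pair."""
    rng = random.Random(seed)
    edges = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(n):
            if i != j and rng.random() < p_contact * p_transmit:
                edges[i].append(j)
    return edges

def outbreak_size(edges, index_case):
    """Final epidemic size = set of nodes reachable from the index case."""
    seen, stack = {index_case}, [index_case]
    while stack:
        u = stack.pop()
        for v in edges[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen)

net = epidemic_percolation_network(n=200, p_contact=0.05, p_transmit=0.5)
size = outbreak_size(net, index_case=0)
print(size)
```

    Interventions can then be compared on the same graph, e.g. vaccination corresponds to deleting nodes (or their in-edges) before computing reachability, which is what makes the percolation view convenient for strategy design.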

  5. Epidemic Percolation Networks, Epidemic Outcomes, and Interventions

    PubMed Central

    Kenah, Eben; Miller, Joel C.

    2011-01-01

    Epidemic percolation networks (EPNs) are directed random networks that can be used to analyze stochastic “Susceptible-Infectious-Removed” (SIR) and “Susceptible-Exposed-Infectious-Removed” (SEIR) epidemic models, unifying and generalizing previous uses of networks and branching processes to analyze mass-action and network-based S(E)IR models. This paper explains the fundamental concepts underlying the definition and use of EPNs, using them to build intuition about the final outcomes of epidemics. We then show how EPNs provide a novel and useful perspective on the design of vaccination strategies. PMID:21437002

  6. The evolution of the federal funding policies for the public health surveillance component of Brazil's Unified Health System (SUS).

    PubMed

    Pinto, Vitor Laerte; Cerbino Neto, José; Penna, Gerson Oliveira

    2014-12-01

    Health surveillance (HS) is one of the key components of the Brazilian Unified Health System (SUS). This article describes recent changes in health surveillance funding models and the role these changes have had in the reorganization and decentralization of health actions. Federal law no. 8.080 of 1990 defined health surveillance as a fundamental pillar of the SUS, and an exclusive fund with equitable distribution criteria was created in the Basic Operational Norm of 1996 to pay for health surveillance actions. This step facilitated the decentralization of health care at the municipal level, giving local authorities autonomy to plan and provide services. The Health Pact of 2006 and its regulation under federal decree No. 3252 in 2009 bolstered the processes of decentralization, regionalization and integration of health care. Further changes in the basic concepts of health surveillance around the world and in the funding policies negotiated by different spheres of government in Brazil have been catalysts for the process of HS institutionalization in recent years.

  7. A Unified Framework for Analyzing and Designing for Stationary Arterial Networks

    DOT National Transportation Integrated Search

    2017-05-17

    This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...

  8. A combined model for pseudo-rapidity distributions in Cu-Cu collisions at BNL-RHIC energies

    NASA Astrophysics Data System (ADS)

    Jiang, Z. J.; Wang, J.; Huang, Y.

    2016-04-01

    The charged particles produced in nucleus-nucleus collisions come from leading particles and from particles frozen out from the hot and dense matter created in the collisions. The leading particles are conventionally supposed to have Gaussian rapidity distributions normalized to the number of participants. The hot and dense matter is assumed to expand according to unified hydrodynamics, a hydro model which unifies the features of the Landau and Hwa-Bjorken models, and to freeze out into charged particles from a time-like hypersurface with a proper time of τFO. The rapidity distribution of this part of the charged particles can be derived analytically. The combined contribution from both leading particles and unified hydrodynamics is then compared against the experimental data taken by the BNL-RHIC-PHOBOS Collaboration in different centrality Cu-Cu collisions at √sNN = 200 and 62.4 GeV, respectively. The model predictions are consistent with the experimental measurements.
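    The combination can be sketched as two Gaussian leading-particle peaks normalized to the participant number plus a central term standing in for the analytic unified-hydrodynamics contribution, whose exact form the abstract does not give. Every parameter value below is illustrative, not fitted:

```python
import math

def combined_dndy(y, n_part, y0=2.9, sigma_lead=0.8,
                  a_hydro=300.0, sigma_hydro=2.3):
    """Combined rapidity density dN/dy: leading-particle Gaussians at
    +/- y0 whose integral equals n_part, plus a hypothetical central
    contribution replacing the analytic freeze-out term."""
    norm = n_part / (2.0 * math.sqrt(2.0 * math.pi) * sigma_lead)
    lead = norm * (math.exp(-(y - y0) ** 2 / (2 * sigma_lead ** 2)) +
                   math.exp(-(y + y0) ** 2 / (2 * sigma_lead ** 2)))
    hydro = a_hydro * math.exp(-y ** 2 / (2 * sigma_hydro ** 2))
    return lead + hydro

# For a symmetric collision system the distribution must be even in y.
print(combined_dndy(1.0, n_part=100), combined_dndy(-1.0, n_part=100))
```

    The leading-particle term dominates near beam rapidity and the hydrodynamic term dominates at mid-rapidity, which is the qualitative structure the combined model exploits when fitting PHOBOS data.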

  9. Framework for Design of Traceability System on Organic Rice Certification

    NASA Astrophysics Data System (ADS)

    Purwandoko, P. B.; Seminar, K. B.; Sutrisno; Sugiyanta

    2018-05-01

    Nowadays, preference for organic products such as organic rice has increased, because people's awareness of consuming healthy and eco-friendly food products has grown. It is therefore very important to ensure the organic quality of the product to be produced. Certification is a series of processes carried out to ensure that the quality of a product meets all criteria of the organic standards. Currently, the problem is that a traceability information system for organic rice certification is not yet available; the current process is still conducted manually, which causes loss of information during storage. This paper aims to develop a traceability framework for the organic rice certification process. First, the main issue discussed is the organic certification process. Second, the unified modeling language (UML) is used to model the user requirements in order to develop a traceability system for all actors in the certification process. Furthermore, the model of the information captured along the certification process is explained; it shows the information flow that has to be recorded by each actor. Finally, the challenges in implementing the system are discussed.
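    The per-actor information capture such a framework calls for can be sketched as a simple record type plus a query over the chain. The actors, field names, and evidence keys below are illustrative, not the paper's actual schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TraceRecord:
    """One recorded step in the organic-rice certification chain."""
    actor: str           # e.g. farmer, inspector, certification body
    activity: str
    recorded_on: date
    evidence: dict = field(default_factory=dict)

def trace(records, actor):
    """Retrieve every recorded activity for one actor along the chain."""
    return [r for r in records if r.actor == actor]

chain = [
    TraceRecord("farmer", "seed sourcing", date(2018, 1, 10),
                {"seed_lot": "ORG-001"}),
    TraceRecord("inspector", "field inspection", date(2018, 3, 2),
                {"result": "compliant"}),
    TraceRecord("farmer", "harvest", date(2018, 6, 20),
                {"yield_kg": 4200}),
]
print([r.activity for r in trace(chain, "farmer")])
```

    Persisting such records as they are created is what prevents the loss of information that the manual, paper-based process suffers during storage.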

  10. Unified Models of Turbulence and Nonlinear Wave Evolution in the Extended Solar Corona and Solar Wind

    NASA Technical Reports Server (NTRS)

    Cranmer, Steven R.; Wagner, William (Technical Monitor)

    2003-01-01

    The PI (Cranmer) and Co-I (A. van Ballegooijen) made significant progress toward the goal of building a "unified model" of the dominant physical processes responsible for the acceleration of the solar wind. The approach outlined in the original proposal comprised two complementary pieces: (1) to further investigate individual physical processes under realistic coronal and solar wind conditions, and (2) to extract the dominant physical effects from simulations and apply them to a one-dimensional and time-independent model of plasma heating and acceleration. The accomplishments in the report period are thus divided into these two categories: 1a. Focused Study of Kinetic MHD Turbulence. We have developed a model of magnetohydrodynamic (MHD) turbulence in the extended solar corona that contains the effects of collisionless dissipation and anisotropic particle heating. A turbulent cascade is one possible way of generating small-scale fluctuations (easy to dissipate/heat) from a pre-existing population of low-frequency Alfven waves (difficult to dissipate/heat). We modeled the cascade as a combination of advection and diffusion in wavenumber space. The dominant spectral transfer occurs in the direction perpendicular to the background magnetic field. As expected from earlier models, this leads to a highly anisotropic fluctuation spectrum with a rapidly decaying tail in the parallel wavenumber direction. The wave power that decays to high enough frequencies to become ion cyclotron resonant depends on the relative strengths of advection and diffusion in the cascade. For the most realistic values of these parameters, though, there is insufficient power to heat protons and heavy ions. The dominant oblique waves undergo Landau damping, which implies strong parallel electron heating. 
We thus investigated the nonlinear evolution of the electron velocity distributions (VDFs) into parallel beams and discrete phase-space holes (similar to those seen in the terrestrial magnetosphere), which are an alternative means of heating protons via stochastic interactions similar to particle-particle collisions. 1b. Focused Study of the Multi-Mode Detailed Balance Formalism. The PI began to explore the feasibility of using the "weak turbulence," or detailed-balance, theory of Tsytovich, Melrose, and others to encompass the relevant physics of the solar wind. This study did not go far, however, because if the "strong" MHD turbulence discussed above is a dominant player in the wind's acceleration region, this formalism is inherently not applicable to the corona. We will continue to study the various published approaches to the weak turbulence formalism, especially with an eye on ways to parameterize nonlinear wave reflection rates. 2. Building the Unified Model Code Architecture. We have begun developing the computational model of a time-steady open flux tube in the extended corona. The model will be "unified" in the sense that it will include (simultaneously for the first time) as many of the various proposed physical processes as possible, all on equal footing. To retain this generality, we have formulated the problem in two interconnected parts: a completely kinetic model for the particles, using the Monte Carlo approach, and a finite-difference approach for the self-consistent fluctuation spectra. The two codes are run sequentially and iteratively until complete consistency is achieved. The current version of the Monte Carlo code incorporates gravity, the zero-current electric field, magnetic mirroring, and collisions. The fluctuation code incorporates WKB wave action conservation and the cascade/dissipation processes discussed above. The codes are being run for various test problems with known solutions.
Planned additions to the codes include prescriptions for nonlinear wave steepening, kinetic velocity-space diffusion, and multi-mode coupling (including reflection and refraction).

  11. A Unified Data Assimilation Strategy for Regional Coupled Atmosphere-Ocean Prediction Systems

    NASA Astrophysics Data System (ADS)

    Xie, Lian; Liu, Bin; Zhang, Fuqing; Weng, Yonghui

    2014-05-01

    Improving tropical cyclone (TC) forecasts is a top priority in weather forecasting. Assimilating various observational data to produce better initial conditions for numerical models using advanced data assimilation techniques has been shown to benefit TC intensity forecasts, whereas assimilating large-scale environmental circulation into regional models by spectral nudging or Scale-Selective Data Assimilation (SSDA) has been demonstrated to improve TC track forecasts. Meanwhile, taking into account various air-sea interaction processes with high-resolution coupled air-sea modelling systems has also been shown to improve TC intensity forecasts. Despite the advances in data assimilation and air-sea coupled models, large errors in TC intensity and track forecasting remain. For example, Hurricane Nate (2011) posed a considerable challenge to the TC operational forecasting community, with very large intensity forecast errors (27, 25, and 40 kts at 48, 72, and 96 h, respectively) for the official forecasts. Considering the slow-moving nature of Hurricane Nate, it is reasonable to hypothesize that air-sea interaction processes played a critical role in the intensity change of the storm, and that accurate representation of the upper-ocean dynamics and thermodynamics is necessary to quantitatively describe the air-sea interaction processes. Currently, data assimilation techniques are generally applied to hurricane forecasting only in stand-alone atmospheric or oceanic models; most regional hurricane forecasting models include data assimilation only for improving the initial condition of the atmospheric model. In such a situation, the benefit of adjustments in one model (atmospheric or oceanic) from assimilating observational data can be compromised by errors from the other model.
Thus, unified data assimilation techniques for coupled air-sea modeling systems, which not only assimilate atmospheric and oceanic observations simultaneously into the coupled system but also nudge the large-scale environmental flow in the regional model towards global model forecasts, are increasingly necessary. In this presentation, we will outline a strategy for an integrated approach to coupled air-sea data assimilation and discuss its benefits and feasibility based on preliminary results for selected historical hurricane cases.
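
The benefit of assimilating into the coupled state, rather than into each model alone, can be sketched with a toy stochastic ensemble Kalman filter (EnKF) update on a joint [wind, SST] state. The function name, variables, and numbers below are illustrative, not from the authors' system; the point is that a gain built from the joint ensemble covariance lets an SST observation also correct the wind component through their cross-covariance.

```python
import numpy as np

def enkf_update(ens, obs, obs_var, H, seed=1):
    """One stochastic EnKF analysis step on a joint state ensemble.

    Because the Kalman gain uses the full joint ensemble covariance,
    observing one component also corrects the others through their
    cross-covariances. ens: (n_members, n_state); H: (n_obs, n_state).
    """
    rng = np.random.default_rng(seed)
    n_obs = H.shape[0]
    X = ens - ens.mean(axis=0)
    P = X.T @ X / (len(ens) - 1)                 # joint covariance
    S = H @ P @ H.T + obs_var * np.eye(n_obs)    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    perturbed = obs + np.sqrt(obs_var) * rng.standard_normal((len(ens), n_obs))
    return ens + (perturbed - ens @ H.T) @ K.T

# Joint state [wind, SST] with a correlated prior ensemble; observe SST only.
rng = np.random.default_rng(0)
z = rng.standard_normal((200, 1))
ens = np.hstack([2.0 * z + 0.1 * rng.standard_normal((200, 1)),   # "wind"
                 1.0 * z + 0.1 * rng.standard_normal((200, 1))])  # "SST"
H = np.array([[0.0, 1.0]])
post = enkf_update(ens, obs=np.array([1.5]), obs_var=0.05, H=H)
```

In this sketch the SST observation pulls the ensemble-mean SST toward the observed value, and the unobserved wind component is adjusted as well, which is the mechanism a stand-alone atmospheric or oceanic analysis forgoes.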

  12. A theoretical formulation of wave-vortex interactions

    NASA Technical Reports Server (NTRS)

    Wu, J. Z.; Wu, J. M.

    1989-01-01

    A unified theoretical formulation for wave-vortex interaction, designated the '(omega, Pi) framework,' is presented. Based on the orthogonal decomposition of fluid dynamic interactions, the formulation can be used to study a variety of problems, including the interaction of a longitudinal (acoustic) wave and/or transverse (vortical) wave with a main vortex flow. Moreover, the formulation permits a unified treatment of wave-vortex interaction at various levels of approximation, where the normal 'piston' process and tangential 'rubbing' process can be approximated differently.

  13. Preliminary Development of a Unified Viscoplastic Constitutive Model for Alloy 617 with Special Reference to Long Term Creep Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sham, Sam; Walker, Kevin P.

    The expected service life of the Next Generation Nuclear Plant is 60 years. Structural analyses of the Intermediate Heat Exchanger (IHX) will require the development of unified viscoplastic constitutive models that address the material behavior of Alloy 617, a construction material of choice, over a wide range of strain rates. Many unified constitutive models employ a yield stress state variable which is used to account for cyclic hardening and softening of the material. For low stress values below the yield stress state variable, these constitutive models predict that no inelastic deformation takes place, which is contrary to experimental results. The ability to model creep deformation at low stresses for the IHX application is very important, as the IHX operational stresses are restricted to very small values due to the low creep strengths at elevated temperatures and the long design lifetime. This paper presents some preliminary work in modeling the unified viscoplastic constitutive behavior of Alloy 617 which accounts for the long term, low stress, creep behavior and the hysteretic behavior of the material at elevated temperatures. The preliminary model is presented in one-dimensional form for ease of understanding, but the intent of the present work is to produce a three-dimensional model suitable for inclusion in the user subroutines UMAT and USERPL of the ABAQUS and ANSYS nonlinear finite element codes. Further experiments and constitutive modeling efforts are planned to model the material behavior of Alloy 617 in more detail.
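
The low-stress point can be sketched in one dimension with an illustrative power-law flow rule that has no yield threshold (this is not the paper's Alloy 617 model, and the parameter values are not fitted): inelastic strain then accumulates at any nonzero stress, whereas a model with a yield-stress state variable would predict none below that threshold.

```python
import numpy as np

def creep_strain(sigma, A=1e-16, n=5.0, dt=1.0, t_end=1000.0):
    """Integrate a minimal 1-D viscoplastic flow rule with no yield
    threshold: deps_in/dt = A * |sigma|**n * sign(sigma).

    Some inelastic (creep) strain accumulates at any nonzero stress.
    Units and parameter values are illustrative only.
    """
    eps_in, t = 0.0, 0.0
    while t < t_end:
        eps_in += A * abs(sigma) ** n * np.sign(sigma) * dt
        t += dt
    return eps_in

low = creep_strain(50.0)    # a low stress still creeps...
high = creep_strain(100.0)  # ...and a doubled stress creeps 2**n = 32x faster
```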

  14. Contactless laser viscometer for flowing liquid films

    NASA Astrophysics Data System (ADS)

    Michels, Alexandre F.; Menegotto, Thiago; Grieneisen, Hans-Peter; Horowitz, Flavio

    2005-12-01

    This work briefly reviews recent progress in interferometric monitoring of spin and of dip coating, from a unified point of view, and its application for contactless viscometry of liquid films. Considering the associated models and measurement uncertainties, the method was validated for both coating processes with oil standards of known viscosities and constant refractive indices. Limitations and perspectives for application of the laser viscometer to liquid films with a varying refractive index are also discussed.

  15. Addressing the Big-Earth-Data Variety Challenge with the Hierarchical Triangular Mesh

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Kuo, Kwo-Sen; Clune, Thomas; Oloso, Amidu; Brown, Paul G.; Yu, Honfeng

    2016-01-01

    We have implemented an updated Hierarchical Triangular Mesh (HTM) as the basis for a unified data model and an indexing scheme for geoscience data to address the variety challenge of Big Earth Data. We observe that, in the absence of variety, the volume challenge of Big Data is relatively easily addressed with parallel processing. The more important challenge in achieving optimal value with a Big Data solution for Earth Science (ES) data analysis, however, is being able to achieve good scalability with variety. With HTM unifying at least the three popular data models, i.e., Grid, Swath, and Point, used by current ES data products, data preparation time for integrative analysis of diverse datasets can be drastically reduced and better variety scaling can be achieved. In addition, since HTM is also an indexing scheme, when it is used to index all ES datasets, data placement alignment (or co-location) on the shared-nothing architecture on which most Big Data systems are based is guaranteed, and better performance is ensured. Moreover, our updated HTM encoding turns most geospatial set operations into integer interval operations, gaining further performance advantages.
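
The classic HTM indexing idea can be sketched compactly (our own illustration, not the authors' updated encoding): two bits per subdivision level are appended to an octahedral root id, so a point's index at a coarse depth is a bit-prefix of its index at a finer depth, and region queries reduce to integer-interval tests.

```python
import numpy as np

def _mid(a, b):
    m = a + b
    return m / np.linalg.norm(m)

def _inside(v, a, b, c):
    # v lies in the spherical triangle (a, b, c), listed counterclockwise
    return (np.dot(np.cross(a, b), v) >= -1e-12 and
            np.dot(np.cross(b, c), v) >= -1e-12 and
            np.dot(np.cross(c, a), v) >= -1e-12)

def htm_id(v, depth):
    """Hierarchical Triangular Mesh index of a direction vector.

    The sphere is seeded with the 8 faces of an octahedron (ids 8-15,
    as in the classic HTM scheme); each level splits a triangle into 4
    children through its edge midpoints and appends the child number
    (2 bits) to the index.
    """
    v = np.asarray(v, float)
    v = v / np.linalg.norm(v)
    x, y, z = np.eye(3)
    faces = [(z, x, y), (z, y, -x), (z, -x, -y), (z, -y, x),
             (-z, y, x), (-z, x, -y), (-z, -y, -x), (-z, -x, y)]
    for i, (a, b, c) in enumerate(faces):
        if _inside(v, a, b, c):
            hid = 8 + i
            break
    for _ in range(depth):
        w0, w1, w2 = _mid(b, c), _mid(c, a), _mid(a, b)
        for child, tri in enumerate([(a, w2, w1), (b, w0, w2),
                                     (c, w1, w0), (w0, w1, w2)]):
            if _inside(v, *tri):
                a, b, c = tri
                hid = (hid << 2) | child
                break
    return hid

root = htm_id([1.0, 1.0, 1.0], depth=0)                         # root face id
deep = htm_id([0.3, 0.4, 0.866], 7)
shallow = htm_id([0.3, 0.4, 0.866], 5)                          # a bit-prefix of deep
```

Because deeper ids only append bits, all points inside one triangle occupy one contiguous integer interval, which is what turns spatial joins into interval operations.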

  16. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  17. A unified model explains commonness and rarity on coral reefs.

    PubMed

    Connolly, Sean R; Hughes, Terry P; Bellwood, David R

    2017-04-01

    Abundance patterns in ecological communities have important implications for biodiversity maintenance and ecosystem functioning. However, ecological theory has been largely unsuccessful at capturing multiple macroecological abundance patterns simultaneously. Here, we propose a parsimonious model that unifies widespread ecological relationships involving local aggregation, species-abundance distributions, and species associations, and we test this model against the metacommunity structure of reef-building corals and coral reef fishes across the western and central Pacific. For both corals and fishes, the unified model simultaneously captures, extremely well, local species-abundance distributions, interspecific variation in the strength of spatial aggregation, patterns of community similarity, species accumulation, and regional species richness, performing far better than alternative models also examined here and in previous work on coral reefs. Our approach contributes to the development of synthetic theory for large-scale patterns of community structure in nature, and to addressing ongoing challenges in biodiversity conservation at macroecological scales. © 2017 The Authors. Ecology Letters published by CNRS and John Wiley & Sons Ltd.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnowitt, R.; Nath, P.

    A survey is given of supersymmetry and supergravity and their phenomenology. Some of the topics discussed are the basic ideas of global supersymmetry, the minimal supersymmetric Standard Model (MSSM) and its phenomenology, the basic ideas of local supersymmetry (supergravity), grand unification, supersymmetry breaking in supergravity grand unified models, radiative breaking of SU(2) {times} U(1), proton decay, cosmological constraints, and predictions of supergravity grand unified models. While the number of detailed derivations is necessarily limited, a sufficient number of results are given so that a reader can get a working knowledge of this field.

  19. Finite element implementation of Robinson's unified viscoplastic model and its application to some uniaxial and multiaxial problems

    NASA Technical Reports Server (NTRS)

    Arya, V. K.; Kaufman, A.

    1989-01-01

    A description of the finite element implementation of Robinson's unified viscoplastic model into the General Purpose Finite Element Program (MARC) is presented. To demonstrate its application, the implementation is applied to some uniaxial and multiaxial problems. A comparison of the results for the multiaxial problem of a thick internally pressurized cylinder, obtained using the finite element implementation and an analytical solution, is also presented. The excellent agreement obtained confirms the correct finite element implementation of Robinson's model.

  20. Finite element implementation of Robinson's unified viscoplastic model and its application to some uniaxial and multiaxial problems

    NASA Technical Reports Server (NTRS)

    Arya, V. K.; Kaufman, A.

    1987-01-01

    A description of the finite element implementation of Robinson's unified viscoplastic model into the General Purpose Finite Element Program (MARC) is presented. To demonstrate its application, the implementation is applied to some uniaxial and multiaxial problems. A comparison of the results for the multiaxial problem of a thick internally pressurized cylinder, obtained using the finite element implementation and an analytical solution, is also presented. The excellent agreement obtained confirms the correct finite element implementation of Robinson's model.

  1. On unified modeling, theory, and method for solving multi-scale global optimization problems

    NASA Astrophysics Data System (ADS)

    Gao, David Yang

    2016-10-01

    A unified model is proposed for general optimization problems in multi-scale complex systems. Based on this model and necessary assumptions in physics, the canonical duality theory is presented in a precise way to include traditional duality theories and popular methods as special applications. Two conjectures on NP-hardness are proposed, which should play important roles for correctly understanding and efficiently solving challenging real-world problems. Applications are illustrated for both nonconvex continuous optimization and mixed integer nonlinear programming.

  2. Toward a Unified Theory of Visual Area V4

    PubMed Central

    Roe, Anna W.; Chelazzi, Leonardo; Connor, Charles E.; Conway, Bevil R.; Fujita, Ichiro; Gallant, Jack L.; Lu, Haidong; Vanduffel, Wim

    2016-01-01

    Visual area V4 is a midtier cortical area in the ventral visual pathway. It is crucial for visual object recognition and has been a focus of many studies on visual attention. However, there is no unifying view of V4’s role in visual processing. Neither is there an understanding of how its role in feature processing interfaces with its role in visual attention. This review captures our current knowledge of V4, largely derived from electrophysiological and imaging studies in the macaque monkey. Based on recent discovery of functionally specific domains in V4, we propose that the unifying function of V4 circuitry is to enable selective extraction of specific functional domain-based networks, whether it be by bottom-up specification of object features or by top-down attentionally driven selection. PMID:22500626

  3. Spray Combustion Modeling with VOF and Finite-Rate Chemistry

    NASA Technical Reports Server (NTRS)

    Chen, Yen-Sen; Shang, Huan-Min; Liaw, Paul; Wang, Ten-See

    1996-01-01

    A spray atomization and combustion model is developed based on the volume-of-fluid (VOF) transport equation with a finite-rate chemistry model. The gas-liquid interface mass, momentum and energy conservation laws are modeled by continuum surface force mechanisms. A new solution method is developed such that the present VOF model can be applied to flows in all speed ranges. The objectives of the present study are: (1) to develop and verify the fractional volume-of-fluid (VOF) cell partitioning approach within a predictor-corrector algorithm to deal with multiphase (gas-liquid) free surface flow problems; (2) to implement the developed unified algorithm in a general purpose computational fluid dynamics (CFD) code, Finite Difference Navier-Stokes (FDNS), with droplet dynamics and finite-rate chemistry models; and (3) to demonstrate the effectiveness of the present approach by simulating benchmark problems of jet breakup/spray atomization and combustion. Modeling multiphase fluid flows poses a significant challenge because a required boundary condition must be applied to a transient, irregular surface that is discontinuous, and the flow regimes considered can range from incompressible to high-speed compressible flows. The flow-process modeling is further complicated by surface tension, interfacial heat and mass transfer, spray formation and turbulence, and their interactions. The major contribution of the present method is to combine the novel features of the volume-of-fluid (VOF) method and the Eulerian/Lagrangian method into a unified algorithm for efficient, noniterative, time-accurate calculations of multiphase free surface flows valid at all speeds. The proposed method reformulates the VOF equation to strongly couple the two distinct phases (liquid and gas) and tracks droplets in a Lagrangian frame when a spray model is required, using a unified predictor-corrector technique to account for the nonlinear linkages through the convective contributions of VOF. 
The discontinuities within the sharp interface are modeled as a volume force to avoid stiffness. Formation of droplets, tracking of droplet dynamics, and modeling of droplet breakup/evaporation are handled through the same unified predictor-corrector procedure. Thus the new algorithm is noniterative and is flexible for general geometries with arbitrarily complex free-surface topology. The FDNS finite-difference Navier-Stokes code is employed as the baseline of the current development. Benchmark cases of a shear-coaxial LOX/H2 liquid jet with atomization/combustion, together with impinging-jet cases, are investigated in the present work. Preliminary data comparisons show good qualitative agreement between the data and the present analysis. These results indicate that the present method has great potential to become a general engineering design-analysis and diagnostics tool for problems involving spray combustion.
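
As a minimal illustration of VOF transport only (not the paper's predictor-corrector algorithm), a first-order upwind update advects a liquid-fraction field while conserving total liquid volume and keeping the fraction bounded in [0, 1]:

```python
import numpy as np

def advect_vof(f, u, dx, dt, steps):
    """Upwind advection of a 1-D volume-of-fluid fraction field.

    f[i] is the liquid fraction of cell i (0 = gas, 1 = liquid). The
    first-order upwind flux is a convex combination for CFL = u*dt/dx
    in (0, 1], so f stays bounded in [0, 1] and total liquid volume
    is conserved away from the boundaries.
    """
    f = f.copy()
    c = u * dt / dx
    assert 0.0 < c <= 1.0
    for _ in range(steps):
        f[1:] = f[1:] - c * (f[1:] - f[:-1])
    return f

f0 = np.zeros(100)
f0[10:30] = 1.0                                # a liquid slug
f1 = advect_vof(f0, u=1.0, dx=1.0, dt=0.5, steps=40)
```

The slug's center of mass is advected at exactly the prescribed speed (the scheme's numerical diffusion smears the interface but does not bias it), which is the kind of property a VOF transport step must preserve before atomization and chemistry models are layered on top.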

  4. Towards robust quantification and reduction of uncertainty in hydrologic predictions: Integration of particle Markov chain Monte Carlo and factorial polynomial chaos expansion

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.

    2017-05-01

    Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
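
The polynomial-chaos ingredient can be sketched in one dimension: fit probabilists'-Hermite coefficients to model evaluations at Gaussian samples, then read the output mean and variance off the coefficients via orthogonality. This is a generic PCE sketch under assumed Gaussian parameters, not the authors' coupled PMCMC-FPCE framework; the function name is ours.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

def pce_moments(g, mu, sigma, order=4, n_train=200, seed=0):
    """Propagate theta ~ N(mu, sigma^2) through a model g via a
    least-squares polynomial chaos expansion.

    Orthogonality of probabilists' Hermite polynomials under the
    standard normal weight, E[He_j He_k] = k! * delta_jk, turns the
    fitted coefficients directly into the output mean and variance.
    """
    rng = np.random.default_rng(seed)
    xi = rng.standard_normal(n_train)
    y = g(mu + sigma * xi)
    V = hermevander(xi, order)                   # columns He_0 .. He_order
    c, *_ = np.linalg.lstsq(V, y, rcond=None)
    mean = c[0]
    var = sum(c[k] ** 2 * factorial(k) for k in range(1, order + 1))
    return mean, var

# Known case: g(theta) = theta**2 with theta ~ N(0, 1) has mean 1, variance 2.
m, v = pce_moments(lambda t: t ** 2, mu=0.0, sigma=1.0)
```

Because the test function is a polynomial of degree below the expansion order, the regression recovers the exact coefficients, and the surrogate's moments match the analytical ones to machine precision.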

  5. Effects of Positive Unified Behavior Support on Instruction

    ERIC Educational Resources Information Center

    Scott, John S.; White, Richard; Algozzine, Bob; Algozzine, Kate

    2009-01-01

    "Positive Unified Behavior Support" (PUBS) is a school-wide intervention designed to establish uniform attitudes, expectations, correction procedures, and roles among faculty, staff, and administration. PUBS is grounded in the general principles of positive behavior support and represents a straightforward, practical implementation model. When…

  6. BRDF profile of Tyvek and its implementation in the Geant4 simulation toolkit.

    PubMed

    Nozka, Libor; Pech, Miroslav; Hiklova, Helena; Mandat, Dusan; Hrabovsky, Miroslav; Schovanek, Petr; Palatka, Miroslav

    2011-02-28

    Diffuse and specular characteristics of the Tyvek 1025-BL material are reported with respect to their implementation in the Geant4 Monte Carlo simulation toolkit. This toolkit incorporates the UNIFIED model. Coefficients defined by the UNIFIED model were calculated from the bidirectional reflectance distribution function (BRDF) profiles measured with a scatterometer for several angles of incidence. Results were complemented with profile measurements made by a profilometer.
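
The UNIFIED model's four reflection constants (specular spike, specular lobe, backscatter, diffuse lobe) act as probabilities of the four reflection types, with a facet-angle spread broadening the specular lobe. A hedged sketch of such a sampler follows; the Gaussian jitter of the facet normal is our simplification of Geant4's facet-angle distribution, and the function name is illustrative.

```python
import numpy as np

def sample_reflection(incident, normal, c_sl, c_ss, c_bs, c_dl,
                      sigma_alpha, rng):
    """Sample a reflected direction in the spirit of the UNIFIED model.

    The four constants must sum to 1 and select the reflection type;
    sigma_alpha (radians) sets the micro-facet normal spread that
    broadens the specular lobe. Simplified illustration only.
    """
    assert abs(c_sl + c_ss + c_bs + c_dl - 1.0) < 1e-9
    u = rng.random()
    if u < c_ss:                       # specular spike: mirror off the mean surface
        return incident - 2.0 * np.dot(incident, normal) * normal
    if u < c_ss + c_bs:                # backscatter: reverse the photon
        return -incident
    if u < c_ss + c_bs + c_sl:         # specular lobe: mirror off a jittered facet
        facet = normal + sigma_alpha * rng.standard_normal(3)
        facet /= np.linalg.norm(facet)
        return incident - 2.0 * np.dot(incident, facet) * facet
    while True:                        # diffuse lobe: cosine-weighted about normal
        d = rng.standard_normal(3)
        d /= np.linalg.norm(d)         # uniform point on the unit sphere...
        d += normal                    # ...offset by the normal gives cosine weighting
        n = np.linalg.norm(d)
        if n > 1e-9:
            return d / n

rng = np.random.default_rng(0)
inc = np.array([0.0, 0.0, -1.0])
nrm = np.array([0.0, 0.0, 1.0])
spike = sample_reflection(inc, nrm, 0.0, 1.0, 0.0, 0.0, 0.1, rng)    # pure spike
diffuse = sample_reflection(inc, nrm, 0.0, 0.0, 0.0, 1.0, 0.1, rng)  # pure Lambertian
```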

  7. A Unified Air-Sea Visualization System: Survey on Gridding Structures

    NASA Technical Reports Server (NTRS)

    Anand, Harsh; Moorhead, Robert

    1995-01-01

    The goal is to develop a Unified Air-Sea Visualization System (UASVS) to enable the rapid fusion of observational, archival, and model data for verification and analysis. To design and develop UASVS, modelers were polled to determine the gridding structures and visualization systems used, and their needs with respect to visual analysis. A basic UASVS requirement is to allow a modeler to explore multiple data sets within a single environment, or to interpolate multiple datasets onto one unified grid. From this survey, the UASVS should be able to: visualize 3D scalar/vector fields; render isosurfaces; visualize arbitrary slices of the 3D data; visualize data defined on spectral element grids with the minimum number of interpolation stages; render contours; produce 3D vector plots and streamlines; provide unified visualization of satellite images, observations and model output overlays; display the visualization on a projection of the user's choice; implement functions so the user can derive diagnostic values; animate the data to see the time-evolution; animate ocean and atmosphere at different rates; store the record of cursor movement, smooth the path, and animate a window around the moving path; repeatedly start and stop the visual time-stepping; generate VHS tape animations; work on a variety of workstations; and allow visualization across clusters of workstations and scalable high-performance computer systems.

  8. Unified Mie and fractal scattering by cells and experimental study on application in optical characterization of cellular and subcellular structures.

    PubMed

    Xu, Min; Wu, Tao T; Qu, Jianan Y

    2008-01-01

    A unified Mie and fractal model for light scattering by biological cells is presented. This model is shown to provide an excellent global agreement with the angular dependent elastic light scattering spectroscopy of cells over the whole visible range (400 to 700 nm) and at all scattering angles (1.1 to 165 deg) investigated. Mie scattering from the bare cell and the nucleus is found to dominate light scattering in the forward directions, whereas the random fluctuation of the background refractive index within the cell, behaving as a fractal random continuous medium, is found to dominate light scattering at other angles. Angularly dependent elastic light scattering spectroscopy aided by the unified Mie and fractal model is demonstrated to be an effective noninvasive approach to characterize biological cells and their internal structures. The acetowhitening effect induced by applying acetic acid on epithelial cells is investigated as an example. The changes in morphology and refractive index of epithelial cells, nuclei, and subcellular structures after the application of acetic acid are successfully probed and quantified using the proposed approach. The unified Mie and fractal model may serve as the foundation for optical detection of precancerous and cancerous changes in biological cells and tissues based on light scattering techniques.
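
The division of labor the abstract describes (particle-like scattering dominating near-forward angles, index-fluctuation scattering dominating elsewhere) can be mimicked with a two-term phase function. In this sketch a Henyey-Greenstein lobe stands in for full Mie theory, the fractal continuum is an illustrative power law in momentum transfer with an outer-scale cutoff, and all parameter values are placeholders rather than the paper's fitted ones.

```python
import numpy as np

def mie_like(theta, g=0.95):
    # Henyey-Greenstein forward lobe: a stand-in for full Mie theory
    return (1.0 - g ** 2) / (1.0 + g ** 2 - 2.0 * g * np.cos(theta)) ** 1.5

def fractal_term(theta, alpha=1.2, q0=0.5):
    # power-law continuum in momentum transfer q = 2 sin(theta/2),
    # regularized by an outer-scale cutoff q0 (illustrative values)
    q2 = (2.0 * np.sin(theta / 2.0)) ** 2 + q0 ** 2
    return q2 ** (-(alpha + 2.0) / 2.0)

def phase_function(theta, frac_mie=0.7):
    """Composite (unnormalized) phase function: weighted sum of a sharp
    forward lobe and a fractal continuum."""
    return frac_mie * mie_like(theta) + (1.0 - frac_mie) * fractal_term(theta)

forward = phase_function(0.05)   # near-forward: the Mie-like lobe dominates
backward = phase_function(2.5)   # large angle: the fractal continuum dominates
```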

  9. POM.gpu-v1.0: a GPU-based Princeton Ocean Model

    NASA Astrophysics Data System (ADS)

    Xu, S.; Huang, X.; Oey, L.-Y.; Xu, F.; Fu, H.; Zhang, Y.; Yang, G.

    2015-09-01

    Graphics processing units (GPUs) are an attractive solution in many scientific applications due to their high performance. However, most existing GPU conversions of climate models use GPUs for only a few computationally intensive regions. In the present study, we redesign the mpiPOM (a parallel version of the Princeton Ocean Model) for GPUs. Specifically, we first convert the model from its original Fortran form to new Compute Unified Device Architecture C (CUDA-C) code, then we optimize the code on each of the GPUs, the communications between the GPUs, and the I/O between the GPUs and the central processing units (CPUs). We show that the performance of the new model on a workstation containing four GPUs is comparable to that on a powerful cluster with 408 standard CPU cores, and that it reduces energy consumption by a factor of 6.8.

  10. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    NASA Astrophysics Data System (ADS)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one 'artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application, but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex, and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model, since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process, and we show using both real and simulated spectra that the unified automated fitting procedure outperforms a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the z_abs = 1.8389 absorber towards the z_em = 2.145 quasar J110325-264515. The derived constraint of Δα/α = (3.3 ± 2.9) × 10^-6 is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.
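
The model-averaging step can be sketched with information-criterion weights, a common surrogate for model posterior probabilities (the numbers below are hypothetical, not from the paper): the averaged variance retains the between-model spread, which is exactly the model-choice uncertainty that committing to a single preferred model would hide.

```python
import numpy as np

def bma_estimate(estimates, variances, aic):
    """Combine per-model parameter estimates by model averaging with
    Akaike weights, w_k proportional to exp(-(AIC_k - AIC_min)/2).

    The averaged variance includes both the within-model variance and
    the between-model spread of the estimates.
    """
    aic = np.asarray(aic, float)
    w = np.exp(-(aic - aic.min()) / 2.0)
    w /= w.sum()                                  # normalized model weights
    est = np.asarray(estimates, float)
    var = np.asarray(variances, float)
    mean = np.sum(w * est)
    total_var = np.sum(w * (var + (est - mean) ** 2))
    return mean, total_var

# three hypothetical velocity-structure models fitted to one absorber
mean, var = bma_estimate(estimates=[3.1, 3.5, 2.8],
                         variances=[0.8, 0.9, 1.2],
                         aic=[100.0, 101.5, 104.0])
```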

  11. An Overview Of Wideband Signal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Speiser, Jeffrey M.; Whitehouse, Harper J.

    1989-11-01

    This paper provides a unifying perspective for several narrowband and wideband signal processing techniques. It considers narrowband ambiguity functions and Wigner-Ville distributions, together with the wideband ambiguity function and several proposed approaches to a wideband version of the Wigner-Ville distribution (WVD). A unifying perspective is provided by the methodology of unitary representations and ray representations of transformation groups.
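
A discrete narrowband ambiguity function, one of the surveyed tools, can be computed directly as a lag-indexed DFT. A short sketch (our illustration, not the paper's formulation):

```python
import numpy as np

def ambiguity(s):
    """Magnitude of the discrete narrowband ambiguity function of s.

    For each delay tau, correlate s with its delayed conjugate and
    take a DFT over time, so row tau, column k holds
    |sum_t s[t] * conj(s[t - tau]) * exp(-2j*pi*k*t/n)|.
    """
    n = len(s)
    A = np.zeros((n, n), complex)
    for tau in range(n):
        prod = np.zeros(n, complex)
        prod[tau:] = s[tau:] * np.conj(s[:n - tau])
        A[tau] = np.fft.fft(prod)
    return np.abs(A)

# a rectangular pulse peaks at zero delay and zero Doppler,
# with peak value equal to the signal energy sum(|s|^2)
s = np.ones(32)
A = ambiguity(s)
```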

  12. New Dots Downunder: The Implementation of Unified English Braille (UEB) in Australian Schools

    ERIC Educational Resources Information Center

    Gentle, Frances; Steer, Michael; Howse, Josie

    2012-01-01

    In this article the authors will outline and describe the recent implementation of Unified English Braille (UEB) in Australia's complex school systems. The New South Wales Department of Education and Communities (NSW/DEC) played a leading role in the process. The education sector at all levels in Australia appears to have embraced the introduction…

  13. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D)-funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models together using commercial off-the-shelf software, ModelCenter, for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.

  14. Exploring Environmental Factors in Nursing Workplaces That Promote Psychological Resilience: Constructing a Unified Theoretical Model

    PubMed Central

    Cusack, Lynette; Smith, Morgan; Hegney, Desley; Rees, Clare S.; Breen, Lauren J.; Witt, Regina R.; Rogers, Cath; Williams, Allison; Cross, Wendy; Cheung, Kin

    2016-01-01

    Building nurses' resilience to complex and stressful practice environments is necessary to keep skilled nurses in the workplace and to ensure safe patient care. A unified theoretical framework titled the Health Services Workplace Environmental Resilience Model (HSWERM) is presented to explain the environmental factors in the workplace that promote nurses' resilience. The framework builds on a previously published theoretical model of individual resilience, which identified the key constructs of psychological resilience as self-efficacy, coping and mindfulness, but did not examine environmental factors in the workplace that promote nurses' resilience. This unified theoretical framework was developed using a literary synthesis drawing on data from international studies and literature reviews on the nursing workforce in hospitals. The most frequent workplace environmental factors were identified, extracted and clustered in alignment with key constructs for psychological resilience. Six major organizational concepts emerged that related to a positive resilience-building workplace and formed the foundation of the theoretical model. Three concepts related to nursing staff support (professional, practice, personal) and three related to nursing staff development (professional, practice, personal) within the workplace environment. The unified theoretical model incorporates these concepts within the workplace context, linking them to the nurse and their impact on personal resilience and workplace outcomes; its use has the potential to increase staff retention and quality of patient care. PMID:27242567

  15. Interactive stereo electron microscopy enhanced with virtual reality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E.Wes; Bastacky, S.Jacob; Schwartz, Kenneth S.

    2001-12-17

    An analytical system is presented that is used to take measurements of objects perceived in stereo image pairs obtained from a scanning electron microscope (SEM). Our system operates by presenting a single stereo view that contains stereo image data obtained from the SEM, along with geometric representations of two types of virtual measurement instruments, a "protractor" and a "caliper". The measurements obtained from this system are an integral part of a medical study evaluating surfactant, a liquid coating the inner surface of the lung which makes possible the process of breathing. Measurements of the curvature and contact angle of submicron-diameter droplets of a fluorocarbon deposited on the surface of airways are performed in order to determine surface tension of the air/liquid interface. This approach has been extended to a microscopic level from the techniques of traditional surface science by measuring submicrometer rather than millimeter diameter droplets, as well as the lengths and curvature of cilia responsible for movement of the surfactant, the airway's protective liquid blanket. An earlier implementation of this approach for taking angle measurements from objects perceived in stereo image pairs using a virtual protractor is extended in this paper to include distance measurements and to use a unified view model. The system is built around a unified view model that is derived from microscope-specific parameters, such as focal length, visible area and magnification. The unified view model ensures that the underlying view models and resultant binocular parallax cues are consistent between synthetic and acquired imagery. When the view models are consistent, it is possible to take measurements of features that are not constrained to lie within the projection plane. The system is first calibrated using non-clinical data of known size and resolution. 
Using the SEM, stereo image pairs of grids and spheres of known resolution are created to calibrate the measurement system. After calibration, the system is used to take distance and angle measurements of clinical specimens.
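
The kind of out-of-plane measurement the unified view model enables can be illustrated with the standard SEM stereo-photogrammetry relation, which recovers feature height from the parallax between a tilt pair. This is a textbook formula under a small-tilt assumption, not code from the authors' system, and the function name is ours.

```python
import math

def height_from_parallax(parallax, tilt_deg, magnification):
    """Feature height from the parallax p measured between two SEM
    images taken with a stage tilt theta at magnification M:
    h = p / (2 * M * sin(theta / 2)).
    """
    theta = math.radians(tilt_deg)
    return parallax / (2.0 * magnification * math.sin(theta / 2.0))

# 1 mm of on-image parallax at 1000x with a 10-degree tilt
h = height_from_parallax(1.0, 10.0, 1000.0)   # height in mm on the specimen
```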

  16. Kinetic and spectral descriptions of autoionization phenomena associated with atomic processes in plasmas

    NASA Astrophysics Data System (ADS)

    Jacobs, Verne L.

    2017-06-01

    This investigation has been devoted to the theoretical description and computer modeling of atomic processes giving rise to radiative emission in energetic electron and ion beam interactions and in laboratory plasmas. We are also interested in the effects of directed electron and ion collisions and of anisotropic electric and magnetic fields. In the kinetic-theory description, we treat excitation, de-excitation, ionization, and recombination in electron and ion encounters with partially ionized atomic systems, including the indirect contributions from processes involving autoionizing resonances. These fundamental collisional and electromagnetic interactions also provide particle and photon transport mechanisms. From the spectral perspective, the analysis of atomic radiative emission can reveal detailed information on the physical properties in the plasma environment, such as non-equilibrium electron and charge-state distributions as well as electric and magnetic field distributions. In this investigation, a reduced-density-matrix formulation is developed for the microscopic description of atomic electromagnetic interactions in the presence of environmental (collisional and radiative) relaxation and decoherence processes. Our central objective is a fundamental microscopic description of atomic electromagnetic processes, in which both bound-state and autoionization-resonance phenomena can be treated in a unified and self-consistent manner. The time-domain (equation-of-motion) and frequency-domain (resolvent-operator) formulations of the reduced-density-matrix approach are developed in a unified and self-consistent manner. This is necessary for our ultimate goal of a systematic and self-consistent treatment of non-equilibrium (possibly coherent) atomic-state kinetics and high-resolution (possibly overlapping) spectral-line shapes. 
We thereby propose the introduction of a generalized collisional-radiative atomic-state kinetics model based on a reduced-density-matrix formulation. It will become apparent that the full atomic data needs for the precise modeling of extreme non-equilibrium plasma environments extend beyond the conventional radiative-transition-probability and collisional-cross-section data sets.

  17. The role of the parahippocampal cortex in cognition

    PubMed Central

    Aminoff, Elissa M.; Kveraga, Kestutis; Bar, Moshe

    2013-01-01

    The parahippocampal cortex (PHC) has been associated with many cognitive processes, including visuospatial processing and episodic memory. To characterize the role of the PHC in cognition, a framework is required that unifies these disparate processes. An overarching account was proposed, whereby the PHC is part of a network of brain regions that processes contextual associations. Contextual associations are the principal element underlying many higher-level cognitive processes, and thus are suitable for unifying the PHC literature. Recent findings are reviewed that provide support for the contextual associations account of PHC function. In addition to reconciling a vast breadth of literature, the synthesis presented expands the implications of the proposed account and gives rise to new and general questions about context and cognition. PMID:23850264

  18. High order ADER schemes for a unified first order hyperbolic formulation of Newtonian continuum mechanics coupled with electro-dynamics

    NASA Astrophysics Data System (ADS)

    Dumbser, Michael; Peshkov, Ilya; Romenski, Evgeniy; Zanotti, Olindo

    2017-11-01

    In this paper, we propose a new unified first order hyperbolic model of Newtonian continuum mechanics coupled with electro-dynamics. The model is able to describe the behavior of moving elasto-plastic dielectric solids as well as viscous and inviscid fluids in the presence of electro-magnetic fields. It is actually a very peculiar feature of the proposed PDE system that viscous fluids are treated just as a special case of elasto-plastic solids. This is achieved by introducing a strain relaxation mechanism in the evolution equations of the distortion matrix A, which in the case of purely elastic solids maps the current configuration to the reference configuration. The model also contains a hyperbolic formulation of heat conduction as well as a dissipative source term in the evolution equations for the electric field given by Ohm's law. Via formal asymptotic analysis we show that in the stiff limit, the governing first order hyperbolic PDE system with relaxation source terms tends asymptotically to the well-known viscous and resistive magnetohydrodynamics (MHD) equations. Furthermore, a rigorous derivation of the model from variational principles is presented, together with the transformation of the Euler-Lagrange differential equations associated with the underlying variational problem from Lagrangian coordinates to Eulerian coordinates in a fixed laboratory frame. The present paper hence extends the unified first order hyperbolic model of Newtonian continuum mechanics recently proposed in [110,42] to the more general case where the continuum is coupled with electro-magnetic fields. The governing PDE system is symmetric hyperbolic and satisfies the first and second principle of thermodynamics, hence it belongs to the so-called class of symmetric hyperbolic thermodynamically compatible systems (SHTC), which have been studied for the first time by Godunov in 1961 [61] and later in a series of papers by Godunov and Romenski [67,69,119]. 
An important feature of the proposed model is that the propagation speeds of all physical processes, including dissipative processes, are finite. The model is discretized using high order accurate ADER discontinuous Galerkin (DG) finite element schemes with a posteriori subcell finite volume limiter and using high order ADER-WENO finite volume schemes. We show numerical test problems that explore a rather large parameter space of the model ranging from ideal MHD, viscous and resistive MHD over pure electro-dynamics to moving dielectric elastic solids in a magnetic field.

  19. SO(10) × S4 grand unified theory of flavour and leptogenesis

    NASA Astrophysics Data System (ADS)

    de Anda, Francisco J.; King, Stephen F.; Perdomo, Elena

    2017-12-01

    We propose a Grand Unified Theory of Flavour, based on SO(10) together with a non-Abelian discrete group S4, under which the three quark and lepton 16-plets are unified into a single S4 triplet 3'. The model involves a further discrete group ℤ4^R × ℤ4^3 which controls the Higgs and flavon symmetry breaking sectors. The CSD2 flavon vacuum alignment is discussed, along with the GUT breaking potential and the doublet-triplet splitting, and proton decay is shown to be under control. The Yukawa matrices are derived in detail from renormalisable diagrams, and neutrino masses emerge from the type I seesaw mechanism. A full numerical fit is performed with 15 input parameters generating 19 presently constrained observables, taking into account supersymmetry threshold corrections. The model predicts a normal neutrino mass ordering with a CP oscillation phase of 260°, an atmospheric angle in the first octant and neutrinoless double beta decay with m_ββ = 11 meV. We discuss N2 leptogenesis, which fixes the second right-handed neutrino mass to be M2 ≃ 2 × 10^11 GeV, in the natural range predicted by the model.

  20. The unified model of vegetarian identity: A conceptual framework for understanding plant-based food choices.

    PubMed

    Rosenfeld, Daniel L; Burrow, Anthony L

    2017-05-01

    By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Grand unified brane world scenario

    NASA Astrophysics Data System (ADS)

    Arai, Masato; Blaschke, Filip; Eto, Minoru; Sakai, Norisuke

    2017-12-01

    We present a field theoretical model unifying grand unified theory (GUT) and brane world scenario. As a concrete example, we consider SU(5) GUT in 4+1 dimensions where our 3+1 dimensional spacetime spontaneously arises on five domain walls. A field-dependent gauge kinetic term is used to localize massless non-Abelian gauge fields on the domain walls and to assure the charge universality of matter fields. We find the domain walls with the symmetry breaking SU(5) → SU(3) × SU(2) × U(1) as a global minimum and all the undesirable moduli are stabilized with the mass scale of M_GUT. Profiles of massless standard model particles are determined as a consequence of wall dynamics. The proton decay can be exponentially suppressed.

  2. Image understanding systems based on the unifying representation of perceptual and conceptual information and the solution of mid-level and high-level vision problems

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2001-10-01

    Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, that is, an interpretation of visual information in terms of such knowledge models. A computer vision system based on these principles requires a unifying representation of perceptual and conceptual information. Computer simulation models are built on the basis of graphs/networks, and the human brain appears able to emulate similar graph/network models. This implies an important paradigm shift in our knowledge about the brain, from neural networks to cortical software. Starting from the primary visual areas, the brain analyzes an image as a graph-type spatial structure. Primary areas provide active fusion of image features on a spatial grid-like structure whose nodes are cortical columns. The spatial combination of different neighboring features cannot be described as a statistical/integral characteristic of the analyzed region, but uniquely characterizes the region itself. Spatial logic and topology are naturally present in such structures. Mid-level vision processes such as clustering, perceptual grouping, multilevel hierarchical compression, and separation of figure from ground are special kinds of graph/network transformations. They convert the low-level image structure into a set of more abstract structures that represent objects and the visual scene, making them amenable to analysis by higher-level knowledge structures. Higher-level vision phenomena such as shape from shading and occlusion are results of such analysis. This approach offers the opportunity not only to explain otherwise unexplained results from cognitive science, but also to create intelligent computer vision systems that simulate perceptual processes in both the ''what'' and ''where'' visual pathways. Such systems can open new horizons for the robotics and computer vision industries.

  3. Process simulations for manufacturing of thick composites

    NASA Astrophysics Data System (ADS)

    Kempner, Evan A.

    The availability of manufacturing simulations for composites can significantly reduce the costs associated with process development. Simulations provide a tool for evaluating the effect of processing conditions on the quality of parts produced without requiring numerous experiments. This is especially significant in parts that have troublesome features such as large thickness. The development of simulations for thick walled composites has been approached by examining the mechanics of resin flow and fiber deformation during processing, applying these evaluations to develop simulations, and evaluating the simulation with experimental results. A unified analysis is developed to describe the three-dimensional resin flow and fiber preform deformation during processing regardless of the manufacturing process used. It is shown how the generic governing evaluations in the unified analysis can be applied to autoclave molding, compression molding, pultrusion, filament winding, and resin transfer molding. A comparison is provided with earlier models derived individually for these processes. The evaluations described for autoclave curing were used to produce a one-dimensional cure simulation for autoclave curing of thick composites. The simulation consists of an analysis for heat transfer and resin flow in the composite as well as bleeder plies used to absorb resin removed from the part. Experiments were performed in a hot press to approximate curing in an autoclave. Graphite/epoxy laminates of 3 cm and 5 cm thickness were cured while monitoring temperatures at several points inside the laminate and thickness. The simulation predicted temperatures fairly closely, but difficulties were encountered in correlation of thickness results. This simulation was also used to study the effects of prepreg aging on processing of thick composites. An investigation was also performed on filament winding with prepreg tow. 
Cylinders of approximately 12 mm wall thickness were hoop wound, with pressure gages at the mandrel-composite interface and winding tensions ranging from 13 to 34 N. An analytical model was developed to calculate the change in stress due to relaxation during winding. Although compressive circumferential stresses occurred throughout each of the cylinders, their magnitude was fairly low.

  4. A LAI inversion algorithm based on the unified model of canopy bidirectional reflectance distribution function for the Heihe River Basin

    NASA Astrophysics Data System (ADS)

    Ma, B.; Li, J.; Fan, W.; Ren, H.; Xu, X.

    2017-12-01

    Leaf area index (LAI) is one of the most important parameters of vegetation canopy structure and effectively represents the growth condition of vegetation. Obtaining LAI by remote sensing can greatly improve the accuracy, availability and timeliness of LAI data, which is of great importance to vegetation-related research such as the study of atmospheric, land surface and hydrological processes. The Heihe River Basin is an inland river basin in northwest China. The basin contains various vegetation types and all kinds of terrain conditions, so studying LAI in this area is helpful for testing the accuracy of the model over complex surfaces and for evaluating its correctness. Moreover, located in the arid west of China, the Heihe Basin has a fragile ecological environment; LAI is an important parameter for representing vegetation growth conditions and can help us understand the status of vegetation in the basin. Unlike previous LAI inversion models, the BRDF (bidirectional reflectance distribution function) unified model can be applied to both continuous and discrete vegetation, so it is appropriate for complex vegetation distributions. LAI is the key input parameter of the model. We establish an inversion algorithm that retrieves LAI from remote sensing imagery based on the unified model. First, we determine the vegetation type from a vegetation classification map to obtain the corresponding G function and the leaf and surface reflectivities. Next, we set the ranges and step sizes of the leaf area index (LAI), the aggregation index (ζ) and the sky scattered-light ratio (β), enter all the parameters into the model to calculate the corresponding reflectivity ρ, and build a lookup table for each vegetation type. Finally, we invert LAI on the basis of the established lookup table, using the least-squares method. We have produced 1 km LAI products from 2000 to 2014, at 8-day intervals. The results show that the algorithm is stable and can effectively invert LAI in areas with very complex vegetation and terrain conditions.
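    The lookup-table inversion described in this record can be sketched in a few lines of Python. The forward model below is a toy stand-in (not the actual unified BRDF model), and the parameter grids, coefficients and function names are assumptions purely for illustration:

```python
import numpy as np

# Toy forward model standing in for the unified BRDF model: canopy
# reflectance as a function of LAI, aggregation index (zeta) and sky
# scattered-light ratio (beta). The functional form is purely illustrative.
def forward_reflectance(lai, zeta, beta):
    return 0.05 + 0.4 * (1.0 - np.exp(-0.5 * zeta * lai)) * (1.0 - 0.3 * beta)

# Build a lookup table over an assumed parameter grid (ranges and step
# sizes are illustrative, not those used in the study).
lai_grid = np.arange(0.5, 8.0, 0.5)
zeta_grid = np.arange(0.6, 1.01, 0.1)
beta_grid = np.arange(0.1, 0.51, 0.1)

table = np.array([
    (lai, zeta, beta, forward_reflectance(lai, zeta, beta))
    for lai in lai_grid
    for zeta in zeta_grid
    for beta in beta_grid
])

def invert_lai(observed_reflectance):
    """Pick the LAI of the table entry that minimizes the squared
    residual to the observed reflectance (least-squares inversion)."""
    cost = (table[:, 3] - observed_reflectance) ** 2
    return table[np.argmin(cost), 0]
```

    In practice the table would be built per vegetation class (with the class-specific G function and leaf/surface reflectivities) and the residual would be summed over several viewing and illumination geometries rather than a single observation.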

  5. Understanding Pitch Perception as a Hierarchical Process with Top-Down Modulation

    PubMed Central

    Balaguer-Ballester, Emili; Clark, Nicholas R.; Coath, Martin; Krumbholz, Katrin; Denham, Susan L.

    2009-01-01

    Pitch is one of the most important features of natural sounds, underlying the perception of melody in music and prosody in speech. However, the temporal dynamics of pitch processing are still poorly understood. Previous studies suggest that the auditory system uses a wide range of time scales to integrate pitch-related information and that the effective integration time is both task- and stimulus-dependent. None of the existing models of pitch processing can account for such task- and stimulus-dependent variations in processing time scales. This study presents an idealized neurocomputational model, which provides a unified account of the multiple time scales observed in pitch perception. The model is evaluated using a range of perceptual studies, which have not previously been accounted for by a single model, and new results from a neurophysiological experiment. In contrast to other approaches, the current model contains a hierarchy of integration stages and uses feedback to adapt the effective time scales of processing at each stage in response to changes in the input stimulus. The model has features in common with a hierarchical generative process and suggests a key role for efferent connections from central to sub-cortical areas in controlling the temporal dynamics of pitch processing. PMID:19266015

  6. A convergent model for distributed processing of Big Sensor Data in urban engineering networks

    NASA Astrophysics Data System (ADS)

    Parygin, D. S.; Finogeev, A. G.; Kamaev, V. A.; Finogeev, A. A.; Gnedkova, E. P.; Tyukov, A. P.

    2017-01-01

    The development and study of a convergent model of grid, cloud, fog and mobile computing for analytical Big Sensor Data processing are reviewed. The model is intended for monitoring systems of spatially distributed objects and processes in urban engineering networks. The proposed approach is a convergence model for organizing distributed data processing. The fog computing model is used for the processing and aggregation of sensor data at network nodes and/or industrial controllers; program agents are loaded to perform the primary processing and data aggregation tasks. The grid and cloud computing models are used for mining and accumulating integral indicators. The computing cluster has a three-tier architecture: the main server at the first level, a cluster of SCADA system servers at the second level, and a set of GPUs supporting the Compute Unified Device Architecture at the third level. The mobile computing model is applied to visualize the results of the analysis with elements of augmented reality and geo-information technologies. The integrated indicators are transferred to the data center and accumulated in a multidimensional storage for the purpose of data mining and knowledge discovery.

  7. Beyond naïve cue combination: salience and social cues in early word learning.

    PubMed

    Yurovsky, Daniel; Frank, Michael C

    2017-03-01

    Children learn their earliest words through social interaction, but it is unknown how much they rely on social information. Some theories argue that word learning is fundamentally social from its outset, with even the youngest infants understanding intentions and using them to infer a social partner's target of reference. In contrast, other theories argue that early word learning is largely a perceptual process in which young children map words onto salient objects. One way of unifying these accounts is to model word learning as weighted cue combination, in which children attend to many potential cues to reference, but only gradually learn the correct weight to assign each cue. We tested four predictions of this kind of naïve cue combination account, using an eye-tracking paradigm that combines social word teaching and two-alternative forced-choice testing. None of the predictions were supported. We thus propose an alternative unifying account: children are sensitive to social information early, but their ability to gather and deploy this information is constrained by domain-general cognitive processes. Developmental changes in children's use of social cues emerge not from learning the predictive power of social cues, but from the gradual development of attention, memory, and speed of information processing. © 2015 John Wiley & Sons Ltd.
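    The weighted cue-combination account that the study tests can be written as a normalized weighted sum over candidate referents. The cue values and weights below are invented for illustration and do not come from the experiment:

```python
def combine_cues(candidates, weights):
    """Weighted cue combination: each candidate referent is scored by a
    weighted sum of its cue values, then scores are normalized so they
    form a probability distribution over referents."""
    scores = {ref: sum(weights[cue] * value for cue, value in cues.items())
              for ref, cues in candidates.items()}
    total = sum(scores.values())
    return {ref: s / total for ref, s in scores.items()}

# A toy trial: a perceptually salient distractor competes with the
# object the speaker is gazing at. All values are hypothetical.
candidates = {
    "target":     {"salience": 0.3, "gaze": 0.9},
    "distractor": {"salience": 0.8, "gaze": 0.1},
}

# A learner who weights salience heavily favors the distractor ...
young = combine_cues(candidates, {"salience": 1.0, "gaze": 0.2})
# ... while a learner who weights the social cue favors the target.
older = combine_cues(candidates, {"salience": 0.2, "gaze": 1.0})
```

    On the naïve account, development consists of gradually re-weighting the cues; the authors' alternative keeps the weights sensitive to social information early and instead locates the developmental change in attention, memory, and processing speed.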

  8. Beyond Naïve Cue Combination: Salience and Social Cues in Early Word Learning

    PubMed Central

    Yurovsky, Daniel

    2015-01-01

    Children learn their earliest words through social interaction, but it is unknown how much they rely on social information. Some theories argue that word learning is fundamentally social from its outset, with even the youngest infants understanding intentions and using them to infer a social partner’s target of reference. In contrast, other theories argue that early word learning is largely a perceptual process in which young children map words onto salient objects. One way of unifying these accounts is to model word learning as weighted cue-combination, in which children attend to many potential cues to reference, but only gradually learn the correct weight to assign each cue. We tested four predictions of this kind of naïve cue-combination account, using an eye-tracking paradigm that combines social word-teaching and two-alternative forced-choice testing. None of the predictions were supported. We thus propose an alternative unifying account: children are sensitive to social information early, but their ability to gather and deploy this information is constrained by domain-general cognitive processes. Developmental changes in children’s use of social cues emerge not from learning the predictive power of social cues, but from the gradual development of attention, memory, and speed of information processing. PMID:26575408

  9. Parameterisation of Orographic Cloud Dynamics in a GCM

    DTIC Science & Technology

    2007-01-01

    makes use of both satellite observations of a case study, and a simulation in which the Unified Model is nudged towards ERA-40 assimilated winds ... et al. (1999) predicted the temperature perturbations in the lower stratosphere which can influence polar stratospheric clouds

  10. Unified concept of effective one component plasma for hot dense plasmas

    DOE PAGES

    Clerouin, Jean; Arnault, Philippe; Ticknor, Christopher; ...

    2016-03-17

    Orbital-free molecular dynamics simulations are used to benchmark two popular models for hot dense plasmas: the one component plasma (OCP) and the Yukawa model. A unified concept emerges where an effective OCP (EOCP) is constructed from the short-range structure of the plasma. An unambiguous ionization and the screening length can be defined and used for a Yukawa system, which reproduces the long-range structure with finite compressibility. Similarly, the dispersion relation of longitudinal waves is consistent with the screened model at vanishing wave number but merges with the OCP at high wave number. Additionally, the EOCP reproduces the overall relaxation time scales of the correlation functions associated with ionic motion. Lastly, in the hot dense regime, this unified concept of EOCP can be fruitfully applied to deduce properties such as the equation of state, ionic transport coefficients, and the ion feature in x-ray Thomson scattering experiments.
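    The short-range agreement and long-range screening contrasted in this record can be illustrated with the bare Coulomb (OCP-like) and Yukawa pair interactions in reduced units. This is a generic textbook comparison, not the EOCP construction itself:

```python
import math

def coulomb(r, q=1.0):
    """Bare one-component-plasma pair interaction, q^2 / r (reduced units)."""
    return q * q / r

def yukawa(r, q=1.0, screening_length=1.0):
    """Screened (Yukawa) pair interaction, q^2 * exp(-r / lambda) / r."""
    return q * q * math.exp(-r / screening_length) / r

# At short range (r << lambda) screening is negligible and the two
# interactions nearly coincide; at long range (r >> lambda) the Yukawa
# interaction is exponentially suppressed relative to the bare Coulomb.
short = (coulomb(0.01), yukawa(0.01))
far = (coulomb(10.0), yukawa(10.0))
```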

  11. A Unified Model for Predicting the Open Hole Tensile and Compressive Strengths of Composite Laminates for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Davidson, Paul; Pineda, Evan J.; Heinrich, Christian; Waas, Anthony M.

    2013-01-01

    The open hole tensile and compressive strengths are important design parameters in qualifying fiber reinforced laminates for a wide variety of structural applications in the aerospace industry. In this paper, we present a unified model that can be used for predicting both these strengths (tensile and compressive) using the same set of coupon level, material property data. As a prelude to the unified computational model that follows, simplified approaches, referred to as "zeroth order", "first order", etc. with increasing levels of fidelity are first presented. The results and methods presented are practical and validated against experimental data. They serve as an introductory step in establishing a virtual building block, bottom-up approach to designing future airframe structures with composite materials. The results are useful for aerospace design engineers, particularly those that deal with airframe design.

  12. Generating Models of Surgical Procedures using UMLS Concepts and Multiple Sequence Alignment

    PubMed Central

    Meng, Frank; D’Avolio, Leonard W.; Chen, Andrew A.; Taira, Ricky K.; Kangarloo, Hooshang

    2005-01-01

    Surgical procedures can be viewed as a process composed of a sequence of steps performed on, by, or with the patient’s anatomy. This sequence is typically the pattern followed by surgeons when generating surgical report narratives for documenting surgical procedures. This paper describes a methodology for semi-automatically deriving a model of conducted surgeries, utilizing a sequence of derived Unified Medical Language System (UMLS) concepts for representing surgical procedures. A multiple sequence alignment was computed from a collection of such sequences and was used for generating the model. These models have the potential of being useful in a variety of informatics applications such as information retrieval and automatic document generation. PMID:16779094
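    The pairwise building block of such a multiple sequence alignment can be sketched with the standard Needleman-Wunsch dynamic program applied to sequences of concept identifiers. The concept IDs below are hypothetical placeholders, not real UMLS codes:

```python
def align(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment of two concept sequences;
    returns the optimal alignment score via dynamic programming."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = dp[i - 1][0] + gap
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            dp[i][j] = max(dp[i - 1][j - 1] + sub,   # substitute / match
                           dp[i - 1][j] + gap,       # gap in b
                           dp[i][j - 1] + gap)       # gap in a
    return dp[n][m]

# Two hypothetical surgical-step sequences as placeholder concept IDs.
proc_a = ["C_incision", "C_dissection", "C_resection", "C_closure"]
proc_b = ["C_incision", "C_resection", "C_closure"]
score = align(proc_a, proc_b)
```

    A multiple alignment generalizes this pairwise score to a whole collection of reports (e.g. via progressive alignment), and the aligned columns then yield the consensus model of the procedure.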

  13. A Unified Fault-Tolerance Protocol

    NASA Technical Reports Server (NTRS)

    Miner, Paul; Geser, Alfons; Pike, Lee; Maddalon, Jeffrey

    2004-01-01

    Davies and Wakerly show that Byzantine fault tolerance can be achieved by a cascade of broadcasts and middle value select functions. We present an extension of the Davies and Wakerly protocol, the unified protocol, and its proof of correctness. We prove that it satisfies validity and agreement properties for communication of exact values. We then introduce bounded communication error into the model. Inexact communication is inherent for clock synchronization protocols. We prove that validity and agreement properties hold for inexact communication, and that exact communication is a special case. As a running example, we illustrate the unified protocol using the SPIDER family of fault-tolerant architectures. In particular we demonstrate that the SPIDER interactive consistency, distributed diagnosis, and clock synchronization protocols are instances of the unified protocol.
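    The middle value select function at the heart of the Davies and Wakerly construction can be sketched as follows. This is a generic illustration of the fault-masking idea, not the verified SPIDER implementation:

```python
def middle_value_select(values):
    """Fault-masking middle value select: sort the received values and
    take the middle element. With at most f faulty sources among 2f+1,
    the middle value is bracketed by values from good sources."""
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

# One broadcast round: five sources send a value; up to two may be
# Byzantine (arbitrary values). The selected value stays within the
# range spanned by the good sources.
received = [10, 10, 11, 999, -500]   # two faulty values
selected = middle_value_select(received)
```

    The unified protocol cascades broadcasts with this selection; for inexact communication (as in clock synchronization) the same selection bounds the error rather than returning an exact agreed value.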

  14. XFEM modeling of hydraulic fracture in porous rocks with natural fractures

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Liu, ZhanLi; Zeng, QingLei; Gao, Yue; Zhuang, Zhuo

    2017-08-01

    Hydraulic fracture (HF) in porous rocks is a complex multi-physics coupling process which involves fluid flow, diffusion and solid deformation. In this paper, the extended finite element method (XFEM) coupled with Biot theory is developed to study HF in permeable rocks with natural fractures (NFs). In recent XFEM-based computational HF models, the fluid flow in the fractures and in the interstitials of the porous media is mostly solved separately, which brings difficulties in dealing with complex fracture morphology. In our new model the fluid flow is solved in a unified framework by considering the fractures as a kind of special porous medium and introducing Poiseuille-type flow inside them instead of Darcy-type flow. The main advantage is that it is very convenient to deal with fluid flow inside a complex fracture network, which is important in shale gas extraction. The weak formulation for the new coupled model is derived based on the virtual work principle; it includes the XFEM formulation for multiple fractures and fracture intersections in porous media and the finite element formulation for the unified fluid flow. Then the plane strain Kristianovic-Geertsma-de Klerk (KGD) model and the fluid flow inside a fracture network are simulated to validate the accuracy and applicability of the method. The numerical results show that a large injection rate, low rock permeability and isotropic in-situ stresses tend to lead to a more uniform and productive fracture network.
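    The contrast between Darcy-type flow in the porous matrix and Poiseuille-type flow along a fracture of aperture w can be written in their standard textbook forms (generic notation, which may differ from the paper's):

```latex
% Darcy-type seepage in the porous matrix (k: permeability, mu: viscosity):
\mathbf{q}_{\mathrm{matrix}} = -\frac{k}{\mu}\,\nabla p
% Poiseuille-type (cubic-law) flux per unit width along a fracture:
q_{\mathrm{fracture}} = -\frac{w^{3}}{12\,\mu}\,\frac{\partial p}{\partial s}
```

    Treating the fracture as a special porous medium then amounts to assigning it the equivalent permeability k_f = w²/12, which is what allows both flow regimes to be solved in one unified framework.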

  15. Toward a Unified Science Curriculum.

    ERIC Educational Resources Information Center

    Showalter, Victor M.

    The two major models of science curriculum change, textbook revision and national curriculum projects, are derived from, and reinforce, the present curriculum structure. This is undesirable in a time of increasing fluidity and change, because adaptation to new situations is difficult. Unified science, based on the premise that science is a unity,…

  16. Definition and Proposed Realization of the International Height Reference System (IHRS)

    NASA Astrophysics Data System (ADS)

    Ihde, Johannes; Sánchez, Laura; Barzaghi, Riccardo; Drewes, Hermann; Foerste, Christoph; Gruber, Thomas; Liebsch, Gunter; Marti, Urs; Pail, Roland; Sideris, Michael

    2017-05-01

    Studying, understanding and modelling global change require geodetic reference frames with an order of accuracy higher than the magnitude of the effects to be actually studied and with high consistency and reliability worldwide. The International Association of Geodesy, which provides a precise geodetic infrastructure for monitoring the Earth system, promotes the implementation of an integrated global geodetic reference frame that provides a reliable frame for consistent analysis and modelling of global phenomena and processes affecting the Earth's gravity field, the Earth's surface geometry and the Earth's rotation. The definition, realization, maintenance and wide utilization of the International Terrestrial Reference System guarantee a globally unified geometric reference frame with an accuracy at the millimetre level. An equivalent high-precision global physical reference frame that supports the reliable description of changes in the Earth's gravity field (such as sea level variations, mass displacements, and processes associated with geophysical fluids) is missing. This paper addresses the theoretical foundations supporting the implementation of such a physical reference surface in terms of an International Height Reference System and provides guidance for the coming activities required for the practical and sustainable realization of this system. Based on the conceptual approaches of physical geodesy, the requirements for a unified global height reference system are derived. In accordance with current practice, its realization as the International Height Reference Frame is designed. Further steps for the implementation are also proposed.

  17. Personality and self-regulation: trait and information-processing perspectives.

    PubMed

    Hoyle, Rick H

    2006-12-01

    This article introduces the special issue of Journal of Personality on personality and self-regulation. The goal of the issue is to illustrate and inspire research that integrates personality and process-oriented accounts of self-regulation. The article begins by discussing the trait perspective on self-regulation--distinguishing between temperament and personality accounts--and the information-processing perspective. Three approaches to integrating these perspectives are then presented. These range from methodological approaches, in which constructs representing the two perspectives are examined in integrated statistical models, to conceptual approaches, in which the two perspectives are unified in a holistic theoretical model of self-regulation. The article concludes with an overview of the special issue contributions, which are organized in four sections: broad, integrative models of personality and self-regulation; models that examine the developmental origins of self-regulation and self-regulatory styles; focused programs of research that concern specific aspects or applications of self-regulation; and strategies for increasing the efficiency and effectiveness of self-regulation.

  18. The Unified Database for BM@N experiment data handling

    NASA Astrophysics Data System (ADS)

    Gertsenberger, Konstantin; Rogachevsky, Oleg

    2018-04-01

    The article describes the developed Unified Database, designed as a comprehensive relational data storage for the BM@N experiment at the Joint Institute for Nuclear Research in Dubna. The BM@N experiment, one of the main elements of the first stage of the NICA project, is a fixed-target experiment at extracted Nuclotron beams of the Laboratory of High Energy Physics (LHEP JINR). The structure and purposes of the BM@N setup are briefly presented. The article considers the scheme of the Unified Database, its attributes, and implemented features in detail. The developed BM@N database provides consistent multi-user access to current experiment information for data processing. It stores information on the experiment runs, detectors and their geometries, and the different configuration, calibration, and algorithm parameters used in offline data processing. User interfaces, an important part of any database, are also presented.

  19. Unified tensor model for space-frequency spreading-multiplexing (SFSM) MIMO communication systems

    NASA Astrophysics Data System (ADS)

    de Almeida, André LF; Favier, Gérard

    2013-12-01

    This paper presents a unified tensor model for space-frequency spreading-multiplexing (SFSM) multiple-input multiple-output (MIMO) wireless communication systems that combine space- and frequency-domain spreadings, followed by a space-frequency multiplexing. Spreading across space (transmit antennas) and frequency (subcarriers) adds resilience against deep channel fades and provides space and frequency diversities, while orthogonal space-frequency multiplexing enables multi-stream transmission. We adopt a tensor-based formulation for the proposed SFSM MIMO system that incorporates space, frequency, time, and code dimensions by means of the parallel factor model. The developed SFSM tensor model unifies the tensorial formulation of some existing multiple-access/multicarrier MIMO signaling schemes as special cases, while revealing interesting tradeoffs due to combined space, frequency, and time diversities which are of practical relevance for joint symbol-channel-code estimation. The performance of the proposed SFSM MIMO system using either a zero forcing receiver or a semi-blind tensor-based receiver is illustrated by means of computer simulation results under realistic channel and system parameters.
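    The parallel factor (PARAFAC) structure at the core of such tensor models can be sketched with a toy example. All dimensions and factor matrices below are invented for illustration, not the paper's system parameters; the unfolding identity shown is the standard one exploited by PARAFAC-based receivers.

```python
import numpy as np

# Toy PARAFAC tensor: X[m, f, t] = sum_r A[m, r] * B[f, r] * C[t, r],
# with modes standing in for antennas, subcarriers, and symbol periods.
rng = np.random.default_rng(5)
M, F, T, R = 4, 6, 8, 3                       # illustrative sizes, R streams
A = rng.normal(size=(M, R))                   # space factors
B = rng.normal(size=(F, R))                   # frequency factors
C = rng.normal(size=(T, R))                   # time factors
X = np.einsum('mr,fr,tr->mft', A, B, C)

def khatri_rao(P, Q):
    """Column-wise Kronecker product, the workhorse of PARAFAC unfoldings."""
    return np.concatenate(
        [np.kron(P[:, r:r + 1], Q[:, r:r + 1]) for r in range(P.shape[1])],
        axis=1)

X1 = X.reshape(M, F * T)    # mode-1 unfolding, t varying fastest
# PARAFAC identity: X1 == A @ khatri_rao(B, C).T
```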

  20. Measuring and predicting sooting tendencies of oxygenates, alkanes, alkenes, cycloalkanes, and aromatics on a unified scale

    DOE PAGES

    Das, Dhrubajyoti D.; St. John, Peter C.; McEnally, Charles S.; ...

    2017-12-27

    Databases of sooting indices, based on measuring some aspect of sooting behavior in a standardized combustion environment, are useful in providing information on the comparative sooting tendencies of different fuels or pure compounds. However, newer biofuels have varied chemical structures including both aromatic and oxygenated functional groups, which expands the chemical space of relevant compounds. In this work, we propose a unified sooting tendency database for pure compounds, including both regular and oxygenated hydrocarbons, which is based on combining two disparate databases of yield-based sooting tendency measurements in the literature. Unification of the different databases was made possible by leveraging the greater dynamic range of the color ratio pyrometry soot diagnostic. This unified database contains a substantial number of pure compounds (≥ 400 total) from multiple categories of hydrocarbons important in modern fuels and establishes the sooting tendencies of aromatic and oxygenated hydrocarbons on the same numeric scale for the first time. Then, using this unified sooting tendency database, we have developed a predictive model for sooting behavior applicable to a broad range of hydrocarbons and oxygenated hydrocarbons. The model decomposes each compound into single-carbon fragments and assigns a sooting tendency contribution to each fragment based on regression against the unified database. The model’s predictive accuracy (as demonstrated by leave-one-out cross-validation) is comparable to a previously developed, more detailed predictive model. The fitted model provides insight into the effects of chemical structure on soot formation, and cases where its predictions fail reveal the presence of more complicated kinetic sooting mechanisms. Our work will therefore enable the rational design of low-sooting fuel blends from a wide range of feedstocks and chemical functionalities.
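    The fragment-decomposition regression described above can be sketched in a few lines. The fragment types, counts, and index values below are invented for illustration and are not taken from the actual database; the point is only that a group-contribution model reduces to a linear least-squares fit.

```python
import numpy as np

# Each compound is a vector of single-carbon fragment counts; the sooting
# index is modeled as a sum of fitted per-fragment contributions.
# All rows and values here are made up for illustration.
fragment_counts = np.array([
    [6, 0, 0],    # purely aromatic carbons
    [4, 2, 0],    # mixed aromatic/aliphatic
    [0, 6, 0],    # aliphatic chain
    [0, 4, 2],    # aliphatic with oxygenated carbons
    [2, 2, 2],
], dtype=float)
measured_index = np.array([90.0, 64.0, 12.0, 2.0, 28.0])

# Least-squares fit of the per-fragment contributions.
beta, *_ = np.linalg.lstsq(fragment_counts, measured_index, rcond=None)

def predict(counts):
    """Predicted sooting index = sum of fragment contributions."""
    return float(np.asarray(counts, dtype=float) @ beta)
```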

  2. The Sustainability Cycle and Loop: models for a more unified understanding of sustainability.

    PubMed

    Hay, Laura; Duffy, Alex; Whitfield, R I

    2014-01-15

    In spite of the considerable research on sustainability, reports suggest that we are barely any closer to a more sustainable society. As such, there is an urgent need to improve the effectiveness of human efforts towards sustainability. A clearer and more unified understanding of sustainability among different people and sectors could help to facilitate this. This paper presents the results of an inductive literature investigation, aiming to develop models to explain the nature of sustainability in the Earth system, and how humans can effectively strive for it. The major contributions are two general and complementary models, that may be applied in any context to provide a common basis for understanding sustainability: the Sustainability Cycle (S-Cycle), and the Sustainability Loop (S-Loop). Literature spanning multiple sectors is examined from the perspective of three concepts, emerging as significant in relation to our aim. Systems are shown to provide the context for human action towards sustainability, and the nature of the Earth system and its sub-systems is explored. Activities are outlined as a fundamental target that humans need to sustain, since they produce the entities both needed and desired by society. The basic behaviour of activities operating in the Earth system is outlined. Finally, knowledge is positioned as the driver of human action towards sustainability, and the key components of knowledge involved are examined. The S-Cycle and S-Loop models are developed via a process of induction from the reviewed literature. The S-Cycle describes the operation of activities in a system from the perspective of sustainability. The sustainability of activities in a system depends upon the availability of resources, and the availability of resources depends upon the rate that activities consume and produce them. Humans may intervene in these dynamics via an iterative process of interpretation and action, described in the S-Loop model. 
The models are briefly applied to a system described in the literature. It is shown that the S-Loop may be used to guide efforts towards sustainability in a particular system of interest, by prescribing the basic activities involved. The S-Cycle may be applied in complement to the S-Loop, to support the interpretation of the activity behaviour described in the latter. Given their general nature, the models provide the basis for a more unified understanding of sustainability. It is hoped that their use may go some way towards improving the effectiveness of human action towards sustainability. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Critical length scale controls adhesive wear mechanisms

    PubMed Central

    Aghababaei, Ramin; Warner, Derek H.; Molinari, Jean-Francois

    2016-01-01

    The adhesive wear process remains one of the least understood areas of mechanics. While it has long been established that adhesive wear is a direct result of contacting surface asperities, an agreed-upon understanding of how contacting asperities lead to wear debris particles has remained elusive. This has restricted adhesive wear prediction to empirical models with limited transferability. Here we show that discrepant observations and predictions of two distinct adhesive wear mechanisms can be reconciled into a unified framework. Using atomistic simulations with model interatomic potentials, we reveal a transition in the asperity wear mechanism when contact junctions fall below a critical length scale. A simple analytic model is formulated to predict the transition in both the simulation results and experiments. This new understanding may help expand the use of computer modelling to explore adhesive wear processes and to advance physics-based wear laws without empirical coefficients. PMID:27264270

  4. Blind Source Separation for Unimodal and Multimodal Brain Networks: A Unifying Framework for Subspace Modeling

    PubMed Central

    Silva, Rogers F.; Plis, Sergey M.; Sui, Jing; Pattichis, Marios S.; Adalı, Tülay; Calhoun, Vince D.

    2016-01-01

    In the past decade, numerous advances in the study of the human brain were fostered by successful applications of blind source separation (BSS) methods to a wide range of imaging modalities. The main focus has been on extracting “networks” represented as the underlying latent sources. While the broad success in learning latent representations from multiple datasets has promoted the wide presence of BSS in modern neuroscience, it also introduced a wide variety of objective functions, underlying graphical structures, and parameter constraints for each method. Such diversity, combined with a host of datatype-specific know-how, can cause a sense of disorder and confusion, hampering a practitioner’s judgment and impeding further development. We organize the diverse landscape of BSS models by exposing its key features and combining them to establish a novel unifying view of the area. In the process, we unveil important connections among models according to their properties and subspace structures. Consequently, a high-level descriptive structure is exposed, ultimately helping practitioners select the right model for their applications. Equipped with that knowledge, we review the current state of BSS applications to neuroimaging. The gained insight into model connections elicits a broader sense of generalization, highlighting several directions for model development. In light of that, we discuss emerging multi-dataset multidimensional (MDM) models and summarize their benefits for the study of the healthy brain and disease-related changes. PMID:28461840

  5. Improving the accuracy in detection of clustered microcalcifications with a context-sensitive classification model.

    PubMed

    Wang, Juan; Nishikawa, Robert M; Yang, Yongyi

    2016-01-01

    In computer-aided detection of microcalcifications (MCs), the detection accuracy is often compromised by frequent occurrence of false positives (FPs), which can be attributed to a number of factors, including imaging noise, inhomogeneity in tissue background, linear structures, and artifacts in mammograms. In this study, the authors investigated a unified classification approach for combating the adverse effects of these heterogeneous factors for accurate MC detection. To accommodate FPs caused by different factors in a mammogram image, the authors developed a classification model to which the input features were adapted according to the image context at a detection location. For this purpose, the input features were defined in two groups, of which one group was derived from the image intensity pattern in a local neighborhood of a detection location, and the other group was used to characterize how a MC is different from its structural background. Owing to the distinctive effect of linear structures in the detector response, the authors introduced a dummy variable into the unified classifier model, which allowed the input features to be adapted according to the image context at a detection location (i.e., presence or absence of linear structures). To suppress the effect of inhomogeneity in tissue background, the input features were extracted from different domains aimed for enhancing MCs in a mammogram image. To demonstrate the flexibility of the proposed approach, the authors implemented the unified classifier model by two widely used machine learning algorithms, namely, a support vector machine (SVM) classifier and an Adaboost classifier. In the experiment, the proposed approach was tested for two representative MC detectors in the literature [difference-of-Gaussians (DoG) detector and SVM detector]. 
The detection performance was assessed using free-response receiver operating characteristic (FROC) analysis on a set of 141 screen-film mammogram (SFM) images (66 cases) and a set of 188 full-field digital mammogram (FFDM) images (95 cases). The FROC analysis results show that the proposed unified classification approach can significantly improve the detection accuracy of two MC detectors on both SFM and FFDM images. Despite the difference in performance between the two detectors, the unified classifiers can reduce their FP rate to a similar level in the output of the two detectors. In particular, with true-positive rate at 85%, the FP rate on SFM images for the DoG detector was reduced from 1.16 to 0.33 clusters/image (unified SVM) and 0.36 clusters/image (unified Adaboost), respectively; similarly, for the SVM detector, the FP rate was reduced from 0.45 clusters/image to 0.30 clusters/image (unified SVM) and 0.25 clusters/image (unified Adaboost), respectively. Similar FP reduction results were also achieved on FFDM images for the two MC detectors. The proposed unified classification approach can be effective for discriminating MCs from FPs caused by different factors (such as MC-like noise patterns and linear structures) in MC detection. The framework is general and can be applicable for further improving the detection accuracy of existing MC detectors.
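    The dummy-variable idea, a single classifier whose input features switch with the image context, can be illustrated with a small synthetic sketch. The features, sizes, and the plain logistic-regression learner below are all invented for illustration; the study itself used SVM and Adaboost classifiers on mammographic features.

```python
import numpy as np

# A dummy variable d (1 when a linear structure is present at the detection
# site) gates an extra set of interaction features, so one model adapts its
# effective weights to the image context.
rng = np.random.default_rng(3)
n = 400
x = rng.normal(size=(n, 2))              # stand-ins for local image features
d = rng.integers(0, 2, size=n)           # context dummy: linear structure?
logits_true = 1.5 * x[:, 0] + 2.0 * d * x[:, 1]   # feature 2 matters only if d == 1
y = (logits_true + 0.3 * rng.normal(size=n) > 0).astype(float)

X = np.column_stack([x, d[:, None] * x])  # base features + context-gated copies
w = np.zeros(X.shape[1])
for _ in range(2000):                     # batch gradient descent
    prob = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (prob - y) / n

accuracy = float(np.mean((X @ w > 0) == (y == 1)))
```

Because the gated columns are zero whenever d = 0, the fitted weights on them act only in the "linear structure present" context, which is the essence of the context-sensitive model.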

  6. Food-web based unified model of macro- and microevolution.

    PubMed

    Chowdhury, Debashish; Stauffer, Dietrich

    2003-10-01

    We incorporate the generic hierarchical architecture of food webs into a "unified" model that describes both micro- and macroevolution within a single theoretical framework. This model describes microevolution in detail by accounting for the birth, ageing, and natural death of individual organisms as well as prey-predator interactions on a hierarchical dynamic food web. It also provides a natural description of random mutations and speciation (origination) of species as well as their extinctions. The distribution of lifetimes of species follows an approximate power law only over a limited regime.

  7. Vector Autoregression, Structural Equation Modeling, and Their Synthesis in Neuroimaging Data Analysis

    PubMed Central

    Chen, Gang; Glen, Daniel R.; Saad, Ziad S.; Hamilton, J. Paul; Thomason, Moriah E.; Gotlib, Ian H.; Cox, Robert W.

    2011-01-01

    Vector autoregression (VAR) and structural equation modeling (SEM) are two popular brain-network modeling tools. VAR, which is a data-driven approach, assumes that connected regions exert time-lagged influences on one another. In contrast, the hypothesis-driven SEM is used to validate an existing connectivity model where connected regions have contemporaneous interactions among them. We present the two models in detail and discuss their applicability to FMRI data as well as their interpretational limits. We also propose a unified approach that models both lagged and contemporaneous effects. The unifying model, structural vector autoregression (SVAR), may improve statistical and explanatory power, and avoids some prevalent pitfalls that can occur when VAR and SEM are utilized separately. PMID:21975109
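    The data-driven half of this synthesis, estimating lagged influences, reduces to ordinary least squares. A minimal sketch on synthetic two-region data follows; the coefficient matrix is invented for illustration and is not from any real FMRI dataset.

```python
import numpy as np

# Fit a first-order VAR, x_t = A x_{t-1} + noise, by least squares.
# In this toy ground truth, region 2 exerts a lagged influence on
# region 1, but not vice versa.
rng = np.random.default_rng(1)
A_true = np.array([[0.6, 0.2],
                   [0.0, 0.5]])
T = 2000
x = np.zeros((T, 2))
for t in range(1, T):
    x[t] = A_true @ x[t - 1] + 0.1 * rng.normal(size=2)

# Ordinary least squares: solve x[1:] ≈ x[:-1] @ A.T for A.
A_hat = np.linalg.lstsq(x[:-1], x[1:], rcond=None)[0].T
```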

  8. Real-time individualization of the unified model of performance.

    PubMed

    Liu, Jianbo; Ramakrishnan, Sridhar; Laxminarayan, Srinivas; Balkin, Thomas J; Reifman, Jaques

    2017-12-01

    Existing mathematical models for predicting neurobehavioural performance are not suited for mobile computing platforms because they cannot adapt model parameters automatically in real time to reflect individual differences in the effects of sleep loss. We used an extended Kalman filter to develop a computationally efficient algorithm that continually adapts the parameters of the recently developed Unified Model of Performance (UMP) to an individual. The algorithm accomplishes this in real time as new performance data for the individual become available. We assessed the algorithm's performance by simulating real-time model individualization for 18 subjects subjected to 64 h of total sleep deprivation (TSD) and 7 days of chronic sleep restriction (CSR) with 3 h of time in bed per night, using psychomotor vigilance task (PVT) data collected every 2 h during wakefulness. This UMP individualization process produced parameter estimates that progressively approached the solution produced by a post-hoc fitting of model parameters using all data. The minimum number of PVT measurements needed to individualize the model parameters depended upon the type of sleep-loss challenge, with ~30 required for TSD and ~70 for CSR. However, model individualization depended upon the overall duration of data collection, yielding increasingly accurate model parameters with greater number of days. Interestingly, reducing the PVT sampling frequency by a factor of two did not notably hamper model individualization. The proposed algorithm facilitates real-time learning of an individual's trait-like responses to sleep loss and enables the development of individualized performance prediction models for use in a mobile computing platform. © 2017 European Sleep Research Society.
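    The filtering idea, treating model parameters as slowly varying states updated from each new performance measurement, can be sketched with a scalar extended Kalman filter. The measurement model, noise levels, and data below are synthetic stand-ins, not the published UMP equations; with the linear measurement used here the EKF reduces to an ordinary Kalman filter.

```python
import numpy as np

def ekf_step(p, P, z, h, h_prime, Q=1e-4, R=0.25):
    """One scalar EKF update of parameter estimate p with variance P."""
    P = P + Q                      # predict: parameter as a slow random walk
    H = h_prime(p)                 # linearize the measurement model
    K = P * H / (H * P * H + R)    # Kalman gain
    p = p + K * (z - h(p))         # correct with the new measurement z
    P = (1.0 - K * H) * P
    return p, P

rng = np.random.default_rng(0)
true_p = 2.0                       # the individual's "trait" parameter
p, P = 0.5, 1.0                    # rough population prior
for t in np.linspace(0.1, 6.0, 60):
    z = true_p * np.sqrt(t) + 0.1 * rng.normal()   # synthetic measurement
    p, P = ekf_step(p, P, z,
                    h=lambda q, t=t: q * np.sqrt(t),
                    h_prime=lambda q, t=t: np.sqrt(t))
```

The estimate tightens as measurements accumulate, mirroring the paper's observation that individualized parameters progressively approach the post-hoc fit.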

  9. A layered abduction model of perception: Integrating bottom-up and top-down processing in a multi-sense agent

    NASA Technical Reports Server (NTRS)

    Josephson, John R.

    1989-01-01

    A layered-abduction model of perception is presented which unifies bottom-up and top-down processing in a single logical and information-processing framework. The process of interpreting the input from each sense is broken down into discrete layers of interpretation, where at each layer a best explanation hypothesis is formed of the data presented by the layer or layers below, with the help of information available laterally and from above. The formation of this hypothesis is treated as a problem of abductive inference, similar to diagnosis and theory formation. Thus this model brings a knowledge-based problem-solving approach to the analysis of perception, treating perception as a kind of compiled cognition. The bottom-up passing of information from layer to layer defines channels of information flow, which separate and converge in a specific way for any specific sense modality. Multi-modal perception occurs where channels converge from more than one sense. This model has not yet been implemented, though it is based on systems which have been successful in medical and mechanical diagnosis and medical test interpretation.

  10. Toward a computational theory for motion understanding: The expert animators model

    NASA Technical Reports Server (NTRS)

    Mohamed, Ahmed S.; Armstrong, William W.

    1988-01-01

    Artificial intelligence researchers claim to understand some aspect of human intelligence when their model is able to emulate it. In the context of computer graphics, the ability to go from motion representation to convincing animation should accordingly be treated not simply as a trick for computer graphics programmers but as an important epistemological and methodological goal. In this paper we investigate a unifying model for animating a group of articulated bodies, such as humans and robots, in a three-dimensional environment. The proposed model is considered in the framework of knowledge representation and processing, with special reference to motion knowledge. The model is meant to help set the basis for a computational theory for motion understanding applied to articulated bodies.

  11. Dynamical crossover in a stochastic model of cell fate decision

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hiroki; Kawaguchi, Kyogo; Sagawa, Takahiro

    2017-07-01

    We study the asymptotic behaviors of stochastic cell fate decision between proliferation and differentiation. We propose a model of a self-replicating Langevin system, where cells choose their fate (i.e., proliferation or differentiation) depending on the local cell density. Based on this model, we propose a scenario for multicellular organisms to maintain the density of cells (i.e., homeostasis) through finite-ranged cell-cell interactions. Furthermore, we numerically show that the distribution of the number of descendant cells changes over time, thus unifying two previously proposed models of homeostasis: the critical birth-death process and the voter model. Our results provide a general platform for the study of stochastic cell fate decision in terms of nonequilibrium statistical mechanics.
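    A crude discrete caricature of density-dependent fate choice (not the paper's Langevin dynamics, and with invented parameters) already produces homeostasis: cells proliferate below a target density and differentiate out of the pool above it.

```python
import random

# Below a target density each cell may divide; above it, each cell may
# differentiate and leave the proliferating pool.
def step(n, rng, target=100, rate=0.2):
    births = sum(1 for _ in range(n) if n < target and rng.random() < rate)
    exits = sum(1 for _ in range(n) if n > target and rng.random() < rate)
    return n + births - exits

rng = random.Random(42)
n = 20                      # start far below the homeostatic density
history = []
for _ in range(200):
    n = step(n, rng)
    history.append(n)
```

The population climbs to the target and then fluctuates around it, a toy analogue of the density-maintaining scenario described above.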

  12. Have We Achieved a Unified Model of Photoreceptor Cell Fate Specification in Vertebrates?

    PubMed Central

    Raymond, Pamela A.

    2008-01-01

    How does a retinal progenitor choose to differentiate as a rod or a cone and, if it becomes a cone, which one of their different subtypes? The mechanisms of photoreceptor cell fate specification and differentiation have been extensively investigated in a variety of animal model systems, including human and non-human primates, rodents (mice and rats), chickens, frogs (Xenopus) and fish. It appears timely to discuss whether it is possible to synthesize the resulting information into a unified model applicable to all vertebrates. In this review we focus on several widely used experimental animal model systems to highlight differences in photoreceptor properties among species, the diversity of developmental strategies and solutions that vertebrates use to create retinas with photoreceptors that are adapted to the visual needs of their species, and the limitations of the methods currently available for the investigation of photoreceptor cell fate specification. Based on these considerations, we conclude that we are not yet ready to construct a unified model of photoreceptor cell fate specification in the developing vertebrate retina. PMID:17466954

  13. Unification of the general non-linear sigma model and the Virasoro master equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boer, J. de; Halpern, M.B.

    1997-06-01

    The Virasoro master equation describes a large set of conformal field theories known as the affine-Virasoro constructions, in the operator algebra (affine Lie algebra) of the WZW model, while the Einstein equations of the general non-linear sigma model describe another large set of conformal field theories. This talk summarizes recent work which unifies these two sets of conformal field theories, together with a presumably large class of new conformal field theories. The basic idea is to consider spin-two operators of the form L_{ij} ∂x^i ∂x^j in the background of a general sigma model. The requirement that these operators satisfy the Virasoro algebra leads to a set of equations called the unified Einstein-Virasoro master equation, in which the spin-two spacetime field L_{ij} couples to the usual spacetime fields of the sigma model. The one-loop form of this unified system is presented, and some of its algebraic and geometric properties are discussed.
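    In standard notation, the construction considers the spin-two operator built from the sigma-model coordinates and demands that the modes L_m of the corresponding stress tensor close the Virasoro algebra with central charge c:

```latex
T(z) = L_{ij}\,\partial x^{i}\,\partial x^{j},
\qquad
[L_m, L_n] = (m - n)\,L_{m+n} + \frac{c}{12}\,m\,(m^{2} - 1)\,\delta_{m+n,0}.
```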

  14. An LMI approach to design H(infinity) controllers for discrete-time nonlinear systems based on unified models.

    PubMed

    Liu, Meiqin; Zhang, Senlin

    2008-10-01

    A unified neural network model termed the standard neural network model (SNNM) is advanced. Based on the robust L(2) gain (i.e., robust H(infinity) performance) analysis of the SNNM with external disturbances, a state-feedback control law is designed for the SNNM to stabilize the closed-loop system and eliminate the effect of external disturbances. The control design constraints are shown to be a set of linear matrix inequalities (LMIs), which can be easily solved by various convex optimization algorithms (e.g., interior-point algorithms) to determine the control law. Most discrete-time recurrent neural networks (RNNs) and discrete-time nonlinear systems modelled by neural networks or Takagi-Sugeno (T-S) fuzzy models can be transformed into SNNMs, so that robust H(infinity) performance analysis or robust H(infinity) controller synthesis can be carried out in a unified SNNM framework. Finally, some examples are presented to illustrate the wide applicability of SNNMs to nonlinear systems, and the proposed approach is compared with related methods reported in the literature.
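    On the analysis side, the LMI-type stability certificate reduces, in the linear disturbance-free case, to a discrete-time Lyapunov condition that can be checked numerically. The system matrix below is an arbitrary stable example, and this sketch uses SciPy's Lyapunov solver rather than an LMI solver.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Arbitrary Schur-stable closed-loop matrix (spectral radius < 1).
A = np.array([[0.5, 0.2],
              [0.0, 0.3]])

# Solve A P A^T - P + Q = 0 with Q = I; a positive definite P certifies
# stability, mirroring feasibility of the corresponding Lyapunov LMI.
Q = np.eye(2)
P = solve_discrete_lyapunov(A, Q)

residual = A @ P @ A.T - P + Q
positive_definite = bool(np.all(np.linalg.eigvalsh(P) > 0))
```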

  15. A unified tensor level set for image segmentation.

    PubMed

    Wang, Bin; Gao, Xinbo; Tao, Dacheng; Li, Xuelong

    2010-06-01

    This paper presents a new region-based unified tensor level set model for image segmentation. The model introduces a three-order tensor to comprehensively depict features of pixels, e.g., gray value and local geometrical features such as orientation and gradient; then, by defining a weighted distance, we generalize the representative region-based level set method from scalar to tensor. The proposed model has four main advantages compared with the traditional representative method. First, involving the Gaussian filter bank, the model is robust against noise, particularly salt-and-pepper noise. Second, considering the local geometrical features, e.g., orientation and gradient, the model pays more attention to boundaries and makes the evolving curve stop more easily at the boundary location. Third, owing to the unified tensor representation of pixels, the model segments images more accurately and naturally. Fourth, based on a weighted distance definition, the model possesses the capacity to cope with data varying from scalar to vector, and then to high-order tensor. We apply the proposed method to synthetic, medical, and natural images, and the results suggest that the proposed method is superior to the available representative region-based level set method.

  16. An Eddy-Diffusivity Mass-flux (EDMF) closure for the unified representation of cloud and convective processes

    NASA Astrophysics Data System (ADS)

    Tan, Z.; Schneider, T.; Teixeira, J.; Lam, R.; Pressel, K. G.

    2014-12-01

    Sub-grid scale (SGS) closures in current climate models are usually decomposed into several largely independent parameterization schemes for different cloud and convective processes, such as boundary layer turbulence, shallow convection, and deep convection. These separate parameterizations usually do not converge as the resolution is increased or as physical limits are taken. This makes it difficult to represent the interactions and smooth transitions among different cloud and convective regimes. Here we present an eddy-diffusivity mass-flux (EDMF) closure that represents all sub-grid scale turbulent, convective, and cloud processes in a unified parameterization scheme. The buoyant updrafts and precipitative downdrafts are parameterized with a prognostic multiple-plume mass-flux (MF) scheme. The prognostic term for the mass flux is kept so that the life cycles of convective plumes are better represented. The interaction between updrafts and downdrafts is parameterized with a buoyancy-sorting model. The turbulent mixing outside plumes is represented by eddy diffusion, in which the eddy diffusivity (ED) is determined from a turbulent kinetic energy (TKE) balance that couples the environment with updrafts and downdrafts. Similarly, tracer variances are decomposed consistently between updrafts, downdrafts, and the environment. The closure is internally coupled with a probabilistic cloud scheme and a simple precipitation scheme. We have also developed a relatively simple two-stream radiative scheme that includes the longwave (LW) and shortwave (SW) effects of clouds, and the LW effect of water vapor. We have tested this closure in a single-column model for various regimes spanning stratocumulus, shallow cumulus, and deep convection. The model is also run towards statistical equilibrium with climatologically relevant large-scale forcings. These model tests are validated against large-eddy simulation (LES) with the same forcings. 
The comparison of results verifies the capacity of this closure to realistically represent different cloud and convective processes. Implementation of the closure in an idealized GCM allows us to study cloud feedbacks to climate change and to study the interactions between clouds, convections, and the large-scale circulation.
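    The core of an EDMF closure is the decomposition of the sub-grid vertical flux into a down-gradient eddy-diffusivity term for the environment plus mass-flux terms for the plumes. A minimal sketch of that decomposition (all profiles and parameter values below are hypothetical, not the authors' scheme):

    ```python
    import numpy as np

    # EDMF decomposition of a sub-grid vertical scalar flux:
    #   <w'phi'> = -K dphi/dz + sum_i a_i * w_i * (phi_i - phi_mean)
    def edmf_flux(z, phi_mean, K, a, w_plume, phi_plume):
        """Total SGS flux profile from ED and MF contributions.

        z          : heights (m), shape (n,)
        phi_mean   : grid-mean scalar profile, shape (n,)
        K          : eddy diffusivity profile (m^2/s), shape (n,)
        a, w_plume, phi_plume : plume area fractions, vertical velocities,
                                and scalar values, shape (n_plumes, n)
        """
        ed = -K * np.gradient(phi_mean, z)                      # down-gradient mixing
        mf = np.sum(a * w_plume * (phi_plume - phi_mean), axis=0)
        return ed + mf

    # Toy column: scalar decreasing with height, one slightly warmer updraft
    z = np.linspace(0.0, 1000.0, 51)
    phi = 300.0 - 0.005 * z            # a potential-temperature-like profile
    K = np.full_like(z, 10.0)
    a = np.array([np.full_like(z, 0.1)])
    w_up = np.array([np.full_like(z, 1.0)])
    phi_up = np.array([phi + 0.5])     # updraft excess over the grid mean

    flux = edmf_flux(z, phi, K, a, w_up, phi_up)
    # Both the ED and MF terms transport the scalar upward in this toy setup.
    ```

    In a full scheme, K would come from the prognostic TKE balance and the plume properties from the multiple-plume MF equations; here they are prescribed only to show how the two contributions combine.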

  17. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics

    NASA Astrophysics Data System (ADS)

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC code was validated by comparing simulated diffuse reflectance and fluence rate distributions for a semi-infinite scattering medium with known analytical results, with results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing; double-precision floating-point arithmetic provides higher accuracy.
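    The validation case mentioned here, diffuse reflectance from a semi-infinite scattering medium, can be illustrated with a minimal single-axis Monte Carlo random walk (a deliberately simplified sketch, not the published P2P/GPU code; isotropic scattering, analog absorption, hypothetical optical coefficients):

    ```python
    import math
    import random

    # Photons are launched into a semi-infinite medium (z > 0) with scattering
    # coefficient mus and absorption coefficient mua (1/cm); a photon crossing
    # back through z < 0 is scored as diffusely reflected.
    def diffuse_reflectance(mua, mus, n_photons=20000, seed=1):
        rng = random.Random(seed)
        mut = mua + mus
        albedo = mus / mut
        reflected = 0
        for _ in range(n_photons):
            z, uz = 0.0, 1.0                 # launched straight down
            while True:
                z += uz * (-math.log(1.0 - rng.random()) / mut)  # free path
                if z < 0.0:                  # escaped through the surface
                    reflected += 1
                    break
                if rng.random() > albedo:    # absorbed at this interaction
                    break
                uz = 2.0 * rng.random() - 1.0   # isotropic scatter (cos theta)
        return reflected / n_photons

    # Reflectance grows with the single-scattering albedo, as expected.
    r_low = diffuse_reflectance(mua=1.0, mus=1.0)
    r_high = diffuse_reflectance(mua=0.1, mus=10.0)
    ```

    Production codes track full 3D directions, photon weights, and anisotropic phase functions; the sketch keeps only the z-coordinate, which is sufficient for the reflectance of a semi-infinite slab.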

  18. Peer-to-peer Monte Carlo simulation of photon migration in topical applications of biomedical optics.

    PubMed

    Doronin, Alexander; Meglinski, Igor

    2012-09-01

    In the framework of further development of the unified approach to photon migration in complex turbid media, such as biological tissues, we present a peer-to-peer (P2P) Monte Carlo (MC) code. Object-oriented programming is used to generalize the MC model for multipurpose use in various applications of biomedical optics. The online user interface providing multiuser access is developed using modern web technologies such as Microsoft Silverlight and ASP.NET. The emerging P2P network, utilizing computers with different types of compute unified device architecture (CUDA)-capable graphics processing units (GPUs), is applied for acceleration and to overcome the limitations imposed by multiuser access in the online MC computational tool. The developed P2P MC code was validated by comparing simulated diffuse reflectance and fluence rate distributions for a semi-infinite scattering medium with known analytical results, with results of the adding-doubling method, and with other GPU-based MC techniques developed in the past. The best speedup in processing multiuser requests, in a range of 4 to 35 s, was achieved using single-precision computing; double-precision floating-point arithmetic provides higher accuracy.

  19. The coherence problem with the Unified Neutral Theory of Biodiversity

    Treesearch

    James S. Clark

    2012-01-01

    The Unified Neutral Theory of Biodiversity (UNTB), proposed as an alternative to niche theory, has been viewed as a theory that species coexist without niche differences, without fitness differences, or with equal probability of success. Support is claimed when models lacking species differences predict highly aggregated metrics, such as species abundance distributions...

  20. The Unified Core: A "Major" Learning Community Model in Action

    ERIC Educational Resources Information Center

    Powell, Gwynn M.; Johnson, Corey W.; James, J. Joy; Dunlap, Rudy

    2011-01-01

    The Unified Core is an innovative approach to higher education that blends content through linked courses within a major to create a community of learners. This article offers the theoretical background for the approach, describes the implementation, and offers suggestions to educators who would like to design their own version of this innovative…

  1. The Impact of Investments in Additional Preparation on Unified State Exam Results

    ERIC Educational Resources Information Center

    Prakhov, Ilya Arkadyevich

    2015-01-01

    The paper proposes a model of educational strategies for college entrants that makes it possible to assess the investment efficiency in additional preparation as evidenced by the Unified State Exam [USE] scores. It was found that college entrants still use traditional forms of preparation despite the new institutional admission conditions at…

  2. Construction of Critically Transformative Education in the Tucson Unified School District

    ERIC Educational Resources Information Center

    Romero, Augustine F.; Sánchez, H. T.

    2014-01-01

    A critically transformative education continues to be at the center of Tucson Unified School District's (TUSD) equity and academic excellence mission. Through the use of the Social Transformation paradigm and the lesson learned from the implementation of the Critically Compassionate Intellectualism Model, TUSD once again created a cutting edge…

  3. In Search of the Unifying Principles of Psychotherapy: Conceptual, Empirical, and Clinical Convergence

    ERIC Educational Resources Information Center

    Magnavita, Jeffrey J.

    2006-01-01

    The search for the principles of unified psychotherapy is an important stage in the advancement of the field. Converging evidence from various streams of clinical science allows the identification of some of the major domains of human functioning, adaptation, and dysfunction. These principles, supported by animal modeling, neuroscience, and…

  4. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    DTIC Science & Technology

    2014-07-01

    For the Gram-positive organism, B. atrophaeus var. globigii (Unified Culture Collection [UCC] designation: BACI051) was selected as a surrogate for... the well-known biothreat agent Bacillus anthracis. For the Gram-negative organism, Y. pestis CO92 (pgm–) (UCC designation: YERS059) was selected... Diagnostics device); TAMRA, tetramethylrhodamine; TE buffer, tris-ethylenediaminetetraacetic acid buffer; UCC, Unified Culture Collection; USG, U.S. Government

  5. Groundwater modelling in decision support: reflections on a unified conceptual framework

    NASA Astrophysics Data System (ADS)

    Doherty, John; Simmons, Craig T.

    2013-11-01

    Groundwater models are commonly used as a basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity. This paper contributes to this ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support with a direct focus on matters regarding model simplicity and complexity.

  6. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  7. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  8. Small perturbations in a finger-tapping task reveal inherent nonlinearities of the underlying error correction mechanism.

    PubMed

    Bavassi, M Luz; Tagliazucchi, Enzo; Laje, Rodrigo

    2013-02-01

    Time processing in the few hundred milliseconds range is involved in the human skill of sensorimotor synchronization, like playing music in an ensemble or finger tapping to an external beat. In finger tapping, a mechanistic explanation in biologically plausible terms of how the brain achieves synchronization is still missing despite considerable research. In this work we show that nonlinear effects are important for the recovery of synchronization following a perturbation (a step change in stimulus period), even for perturbation magnitudes smaller than 10% of the period, which is well below the amount of perturbation needed to evoke other nonlinear effects like saturation. We build a nonlinear mathematical model for the error correction mechanism and test its predictions, and further propose a framework that allows us to unify the description of the three common types of perturbations. While previous authors have used two different model mechanisms for fitting different perturbation types, or have fitted different parameter value sets for different perturbation magnitudes, we propose the first unified description of the behavior following all perturbation types and magnitudes as the dynamical response of a compound model with fixed terms and a single set of parameter values. Copyright © 2012 Elsevier B.V. All rights reserved.
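    The error-correction mechanism described above can be sketched with a generic nonlinear map for the tap-stimulus asynchrony (a hedged illustration with a linear-plus-cubic correction term and hypothetical parameter values, not the authors' fitted model):

    ```python
    import numpy as np

    # Asynchrony dynamics e_{n+1} = e_n - f(e_n), where a step change in the
    # stimulus period injects an initial error of size delta (ms).
    def simulate_asynchrony(alpha, beta, delta, n_steps=30):
        """alpha: linear correction gain; beta: cubic (nonlinear) gain."""
        e = np.zeros(n_steps)
        e[0] = delta
        for n in range(n_steps - 1):
            f = alpha * e[n] + beta * e[n] ** 3
            e[n + 1] = e[n] - f
        return e

    # With beta != 0 the relative recovery is faster for the larger step,
    # a signature of nonlinearity even for modest perturbations.
    e_small = simulate_asynchrony(alpha=0.3, beta=1e-4, delta=10.0)
    e_large = simulate_asynchrony(alpha=0.3, beta=1e-4, delta=40.0)
    ```

    A purely linear mechanism (beta = 0) would predict perturbation-size-independent relative recovery curves; comparing normalized trajectories across step sizes is one way such nonlinearities are exposed experimentally.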

  9. Unified Microscopic-Macroscopic Monte Carlo Simulations of Complex Organic Molecule Chemistry in Cold Cores

    NASA Astrophysics Data System (ADS)

    Chang, Qiang; Herbst, Eric

    2016-03-01

    The recent discovery of methyl formate and dimethyl ether in the gas phase of cold cores with temperatures as cold as 10 K challenges our previous astrochemical models concerning the formation of complex organic molecules (COMs). The strong correlation between the abundances and distributions of methyl formate and dimethyl ether further shows that current astrochemical models may be missing important chemical processes in cold astronomical sources. We investigate a scenario in which COMs and the methoxy radical can be formed on dust grains via a so-called chain reaction mechanism, in a similar manner to CO2. A unified gas-grain microscopic-macroscopic Monte Carlo approach with both normal and interstitial sites for icy grain mantles is used to perform the chemical simulations. Reactive desorption with varying degrees of efficiency is included to enhance the nonthermal desorption of species formed on cold dust grains. In addition, varying degrees of efficiency for the surface formation of methoxy are also included. The observed abundances of a variety of organic molecules in cold cores can be reproduced in our models. The strong correlation between the abundances of methyl formate and dimethyl ether in cold cores can also be explained. Nondiffusive chemical reactions on dust grain surfaces may play a key role in the formation of some COMs.
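    The stochastic treatment of reaction kinetics that microscopic Monte Carlo models rely on can be illustrated with a Gillespie-style simulation of a toy two-reaction system (a generic sketch with hypothetical rates, not the gas-grain network of the paper):

    ```python
    import math
    import random

    # Reactions: A -> B (propensity k1*n_A) and A + A -> C (propensity
    # k2*n_A*(n_A-1)). Waiting times are exponential in the total propensity.
    def gillespie(n_A, k1, k2, t_end, seed=7):
        rng = random.Random(seed)
        t, n_B, n_C = 0.0, 0, 0
        while t < t_end and n_A > 0:
            a1 = k1 * n_A
            a2 = k2 * n_A * (n_A - 1)
            a_tot = a1 + a2
            t += -math.log(1.0 - rng.random()) / a_tot   # time to next event
            if rng.random() * a_tot < a1:
                n_A -= 1; n_B += 1                       # A -> B fired
            else:
                n_A -= 2; n_C += 1                       # A + A -> C fired
        return n_A, n_B, n_C

    n_A, n_B, n_C = gillespie(n_A=1000, k1=1.0, k2=0.001, t_end=100.0)
    # Mass balance: every initial A survives, became a B, or paired into a C.
    ```

    Microscopic-macroscopic models extend this idea to thousands of reactions and track where each species sits on the grain (surface vs. interstitial sites), but the event-by-event stochastic update is the same.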

  10. User's manual for UCAP: Unified Counter-Rotation Aero-Acoustics Program

    NASA Technical Reports Server (NTRS)

    Culver, E. M.; Mccolgan, C. J.

    1993-01-01

    This is the user's manual for the Unified Counter-rotation Aeroacoustics Program (UCAP), the counter-rotation derivative of the UAAP (Unified Aero-Acoustic Program). The purpose of this program is to predict steady and unsteady air loading on the blades and the noise produced by a counter-rotation Prop-Fan. The aerodynamic method is based on linear potential theory with corrections for nonlinearity associated with axial flux induction, vortex lift on the blades, and rotor-to-rotor interference. The theory for acoustics and the theory for individual blade loading and wakes are derived in Unified Aeroacoustics Analysis for High Speed Turboprop Aerodynamics and Noise, Volume 1 (NASA CR-4329). This user's manual also includes a brief explanation of the theory used for the modelling of counter-rotation.

  11. User's manual for UCAP: Unified Counter-Rotation Aero-Acoustics Program

    NASA Astrophysics Data System (ADS)

    Culver, E. M.; McColgan, C. J.

    1993-04-01

    This is the user's manual for the Unified Counter-rotation Aeroacoustics Program (UCAP), the counter-rotation derivative of the UAAP (Unified Aero-Acoustic Program). The purpose of this program is to predict steady and unsteady air loading on the blades and the noise produced by a counter-rotation Prop-Fan. The aerodynamic method is based on linear potential theory with corrections for nonlinearity associated with axial flux induction, vortex lift on the blades, and rotor-to-rotor interference. The theory for acoustics and the theory for individual blade loading and wakes are derived in Unified Aeroacoustics Analysis for High Speed Turboprop Aerodynamics and Noise, Volume 1 (NASA CR-4329). This user's manual also includes a brief explanation of the theory used for the modelling of counter-rotation.

  12. New Constraints on the Unified Model of Seyfert Galaxies

    NASA Astrophysics Data System (ADS)

    Maiolino, R.; Ruiz, M.; Rieke, G. H.; Keller, L. D.

    1995-06-01

    We present new 10 microns (N-band) photometry for 70 Seyfert galaxies, 43 of them previously unobserved. These observations, together with those collected from the literature, complete the 10 microns photometry for the CfA Sy galaxies and cover 80% of the Sy found in the RSA and 70% of the Sy in the IRAS 12 microns sample. From this data set, we find that Sy not showing any evidence for broad lines are systematically weaker in 10 microns nuclear emission than Sy nuclei having broad lines. This result may indicate the existence of a group of very low-luminosity Sy2 galaxies that do not have Sy1 counterparts in equal numbers, contrary to the strict unified theory. Alternately, the result can be reconciled with unified theories if a specific type of geometry is assumed for the circumnuclear obscuring material. By comparing the 10 microns ground-based observations with the IRAS 12 microns fluxes, we also study the properties of the extended mid-IR emission, i.e., the star forming activity of the host galaxy of the Sy nucleus. We find Sy2 to lie preferentially in galaxies experiencing enhanced star-forming activity, while Sy1 lie in normal or quiescent galaxies. This result appears to be inconsistent with the strict unified model, since the host galaxy properties should be independent of the orientation of a circumnuclear torus and therefore should be independent of nuclear type. Our finding could be explained by adding to the unified model a link between star-forming activity and the amount of obscuring material collected in the circumnuclear region.

  13. On Using SysML, DoDAF 2.0 and UPDM to Model the Architecture for the NOAA's Joint Polar Satellite System (JPSS) Ground System (GS)

    NASA Technical Reports Server (NTRS)

    Hayden, Jeffrey L.; Jeffries, Alan

    2012-01-01

    The JPSS Ground System is a flexible system of systems responsible for telemetry, tracking & command (TT&C), data acquisition, routing, and data processing services for a varied fleet of satellites to support weather prediction, modeling, and climate modeling. To assist in this engineering effort, architecture modeling tools are being employed to translate the former NPOESS baseline to the new JPSS baseline. The paper will focus on the methodology for the system engineering process and the use of these architecture modeling tools within that process. The Department of Defense Architecture Framework version 2.0 (DoDAF 2.0) viewpoints and views that are being used to describe the JPSS GS architecture are discussed. The Unified Profile for DoDAF and MODAF (UPDM) and the Systems Modeling Language (SysML), as provided by extensions to the MagicDraw UML modeling tool, are used to develop the diagrams and tables that make up the architecture model. The model development process and structure are discussed, examples are shown, and details of handling the complexities of a large System of Systems (SoS), such as the JPSS GS, with an equally complex modeling tool, are described.

  14. Mi-STAR Unit Challenges serve as a model for integrating earth science and systems thinking in a Next Generation Science Standards (NGSS) aligned curriculum.

    NASA Astrophysics Data System (ADS)

    Gochis, E. E.; Tubman, S.; Matthys, T.; Bluth, G.; Oppliger, D.; Danhoff, B.; Huntoon, J. E.

    2017-12-01

    Michigan Science Teaching and Assessment Reform (Mi-STAR) is developing an NGSS-aligned middle school curriculum and associated teacher professional learning program in which science is taught and learned as an integrated body of knowledge that can be applied to address societal issues. With the generous support of the Herbert H. and Grace A. Dow Foundation, Mi-STAR has released several pilot-tested units through the Mi-STAR curriculum portal at mi-star.mtu.edu. Each of these units focuses on an ongoing "Unit Challenge" investigation that integrates STEM content across disciplinary boundaries, stimulates interest, and engages students in using scientific practices to address 21st century challenges. Each Mi-STAR unit is connected to a Unifying NGSS Crosscutting Concept (CCC) that allows students to recognize the concepts that are related to the phenomena or problems under investigation. In the 6th grade, students begin with an exploration of the CCC Systems and System Models. Through repeated applications across units, students refine their understanding of what a system is and how to model a complex Earth system. An example 6th grade unit entitled "Water on the Move: The Water Cycle" provides an example of how Mi-STAR approaches the use of Unifying CCCs and Unit Challenges to enhance middle school students' understanding of the interconnections of Earth system processes and human activities. Throughout the unit, students use a series of hands-on explorations and simulations to explore the hydrologic cycle and how human activity can alter Earth systems. Students develop new knowledge through repeated interactions with the Unit Challenge, which requires development of system models and construction of evidence-based arguments related to flooding problems in a local community. Students have the opportunity to make predictions about how proposed land-use management practices (e.g. development of a skate-park, rain garden, soccer field, etc.) 
can alter Earth system processes. Students present their findings and recommendations in a public forum format. Student-learning outcomes are measured using a combination of formative and summative assessments that address students' proficiency with science and engineering content and practices in conjunction with the unit's Unifying CCC.

  15. Phase ordering in disordered and inhomogeneous systems

    NASA Astrophysics Data System (ADS)

    Corberi, Federico; Zannetti, Marco; Lippiello, Eugenio; Burioni, Raffaella; Vezzani, Alessandro

    2015-06-01

    We study numerically the coarsening dynamics of the Ising model on a regular lattice with random bonds and on deterministic fractal substrates. We propose a unifying interpretation of the phase-ordering processes based on two classes of dynamical behaviors characterized by different growth laws of the ordered domain size, namely logarithmic or power law, respectively. It is conjectured that the interplay between these dynamical classes is regulated by the same topological feature that governs the presence or the absence of a finite-temperature phase transition.
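    Coarsening of the kind studied here can be demonstrated on a clean lattice (an illustrative sketch; the paper's random-bond and fractal substrates are not reproduced): quench a 2D Ising model to low temperature with Metropolis dynamics and watch a characteristic domain size, estimated from the interface density, grow.

    ```python
    import numpy as np

    # Fraction of broken nearest-neighbor bonds per site; the characteristic
    # domain size scales roughly as 1/interface density.
    def interface_density(s):
        return 0.5 * (np.mean(s != np.roll(s, 1, 0)) + np.mean(s != np.roll(s, 1, 1)))

    def metropolis_sweeps(s, beta, sweeps, rng):
        L = s.shape[0]
        for _ in range(sweeps):
            for _ in range(L * L):
                i, j = rng.integers(L), rng.integers(L)
                nb = s[(i+1) % L, j] + s[(i-1) % L, j] + s[i, (j+1) % L] + s[i, (j-1) % L]
                dE = 2 * s[i, j] * nb              # energy cost of flipping (J = 1)
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    s[i, j] = -s[i, j]
        return s

    rng = np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(32, 32))   # disordered initial state (quench)
    rho0 = interface_density(spins)
    metropolis_sweeps(spins, beta=2.0, sweeps=20, rng=rng)
    rho1 = interface_density(spins)
    # rho drops as domains coarsen; fitting rho(t) vs. t distinguishes the
    # power-law growth of the clean case from logarithmic growth on disorder.
    ```

    On random-bond lattices or fractal substrates the same measurement, repeated over time, is what separates the two dynamical classes discussed in the record.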

  16. Pricing foreign equity option with stochastic volatility

    NASA Astrophysics Data System (ADS)

    Sun, Qi; Xu, Weidong

    2015-11-01

    In this paper we propose a general foreign equity option pricing framework that unifies the vast foreign equity option pricing literature and incorporates stochastic volatility into foreign equity option pricing. Under our framework, time-changed Lévy processes are used to model the underlying asset price of the foreign equity option, and a closed-form pricing formula is obtained through the use of characteristic function methodology. Numerical tests indicate that stochastic volatility has a dramatic effect on foreign equity option prices.
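    The paper derives closed forms via characteristic functions of time-changed Lévy processes; as a simpler illustration of a foreign equity option under stochastic volatility, a Monte Carlo sketch with Heston-type variance and all parameter values hypothetical:

    ```python
    import numpy as np

    # Foreign equity call struck in domestic currency: payoff max(S_T*X_T - K, 0),
    # where S is the foreign asset (with stochastic variance v) and X the FX rate.
    # Euler scheme with full truncation of the variance; FX shocks independent.
    def price_foreign_equity_call(S0, X0, K, r_d, T, kappa, theta, xi, v0,
                                  sigma_x, rho, n_paths=20000, n_steps=50, seed=3):
        rng = np.random.default_rng(seed)
        dt = T / n_steps
        S = np.full(n_paths, S0)
        X = np.full(n_paths, X0)
        v = np.full(n_paths, v0)
        for _ in range(n_steps):
            z1 = rng.standard_normal(n_paths)
            z2 = rng.standard_normal(n_paths)
            zs = z1                                        # asset shock
            zv = rho * z1 + np.sqrt(1 - rho**2) * z2       # correlated vol shock
            zx = rng.standard_normal(n_paths)              # FX shock
            vp = np.maximum(v, 0.0)                        # full truncation
            S *= np.exp((r_d - 0.5 * vp) * dt + np.sqrt(vp * dt) * zs)
            X *= np.exp(-0.5 * sigma_x**2 * dt + sigma_x * np.sqrt(dt) * zx)
            v += kappa * (theta - vp) * dt + xi * np.sqrt(vp * dt) * zv
        payoff = np.maximum(S * X - K, 0.0)
        return np.exp(-r_d * T) * payoff.mean()

    price = price_foreign_equity_call(S0=100.0, X0=1.0, K=100.0, r_d=0.02, T=1.0,
                                      kappa=2.0, theta=0.04, xi=0.3, v0=0.04,
                                      sigma_x=0.1, rho=-0.5)
    ```

    The characteristic-function route replaces this simulation with a one-dimensional Fourier integral, which is why it yields closed-form (and much faster) prices.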

  17. Unified Static and Dynamic Recrystallization Model for the Minerals of Earth's Mantle Using Internal State Variable Model

    NASA Astrophysics Data System (ADS)

    Cho, H. E.; Horstemeyer, M. F.; Baumgardner, J. R.

    2017-12-01

    In this study, we present an internal state variable (ISV) constitutive model developed to describe static and dynamic recrystallization and grain size progression in a unified manner. This method accurately captures temperature, pressure, and strain rate effects on recrystallization and grain size. Because this ISV approach treats dislocation density, the recrystallized volume fraction, and grain size as internal variables, the model can simultaneously track their histories during deformation. Based on this deformation history, the method can capture realistic mechanical properties, such as stress-strain behavior, within the microstructure-property relationship. Both the transient grain size during deformation and the steady-state grain size of dynamic recrystallization can be predicted from the history variable of recrystallized volume fraction. Furthermore, because this model can simultaneously handle plasticity and creep behaviors (unified creep-plasticity), the mechanisms related to dislocation dynamics (static recovery or diffusion creep, dynamic recovery or dislocation creep, and hardening) can also be captured. To model these comprehensive mechanical behaviors, the mathematical formulation includes elasticity to evaluate yield stress, work hardening to treat plasticity, creep, and the unified recrystallization and grain size progression. Because pressure sensitivity is especially important for mantle minerals, we developed a yield function combining Drucker-Prager shear failure and von Mises yield surfaces to model the pressure-dependent yield stress, while using pressure-dependent work hardening and creep terms. Using these formulations, we calibrated the model against experimental data for the minerals acquired from the literature. We also calibrated against experimental data for metals to show the general applicability of our model. 
Understanding of realistic mantle dynamics can only be acquired once the various deformation regimes and mechanisms are comprehensively modeled. The results of this study demonstrate that this ISV model is a good candidate to help reveal the realistic dynamics of the Earth's mantle.
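    A pressure-dependent yield surface of the kind mentioned above can be sketched with a Drucker-Prager branch capped by a von Mises limit (an illustrative generic form with hypothetical parameter values; the paper's calibrated function is not reproduced):

    ```python
    import numpy as np

    # Yielding occurs when the equivalent shear stress sqrt(J2) reaches the
    # pressure-dependent strength min(c + mu*p, tau_vm): frictional (Drucker-
    # Prager) at low pressure, pressure-insensitive (von Mises) at high pressure.
    def yield_function(sqrt_J2, p, c=50e6, mu=0.6, tau_vm=300e6):
        """Returns f <= 0 inside the elastic domain, f > 0 at/after yield.

        sqrt_J2 : square root of the second deviatoric stress invariant (Pa)
        p       : pressure (Pa), compression positive
        """
        strength = np.minimum(c + mu * p, tau_vm)   # DP branch with vM cap
        return sqrt_J2 - strength

    # Low pressure: strength 80 MPa, so 100 MPa shear stress yields;
    # high pressure: strength capped at 300 MPa, so the same stress is elastic.
    f_lo = yield_function(100e6, p=50e6)
    f_hi = yield_function(100e6, p=1e9)
    ```

    The cap is what keeps the frictional strengthening from growing without bound at mantle pressures, which is the physical motivation for combining the two surfaces.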

  18. Building a Unified Computational Model for the Resonant X-Ray Scattering of Strongly Correlated Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bansil, Arun

    2016-12-01

    Basic-Energy Sciences of the Department of Energy (BES/DOE) has made large investments in x-ray sources in the U.S. (NSLS-II, LCLS, NGLS, ALS, APS) as powerful enabling tools for opening up unprecedented new opportunities for exploring properties of matter at various length and time scales. The coming online of the pulsed photon source literally allows us to see and follow the dynamics of processes in materials at their natural timescales. There is an urgent need therefore to develop theoretical methodologies and computational models for understanding how x-rays interact with matter and the related spectroscopies of materials. The present project addressed aspects of this grand challenge of X-ray science. In particular, our Collaborative Research Team (CRT) focused on understanding and modeling of elastic and inelastic resonant X-ray scattering processes. We worked to unify the three different computational approaches currently used for modeling X-ray scattering—density functional theory, dynamical mean-field theory, and small-cluster exact diagonalization—to achieve a more realistic material-specific picture of the interaction between X-rays and complex matter. To achieve a convergence in the interpretation and to maximize complementary aspects of different theoretical methods, we concentrated on the cuprates, where most experiments have been performed. Our team included both US and international researchers, and it fostered new collaborations between researchers currently working with different approaches. In addition, we developed close relationships with experimental groups working in the area at various synchrotron facilities in the US. Our CRT thus helped toward enabling the US to assume a leadership role in the theoretical development of the field, and to create a global network and community of scholars dedicated to X-ray scattering research.

  19. A Unified Estimation Framework for State-Related Changes in Effective Brain Connectivity.

    PubMed

    Samdin, S Balqis; Ting, Chee-Ming; Ombao, Hernando; Salleh, Sh-Hussain

    2017-04-01

    This paper addresses the critical problem of estimating time-evolving effective brain connectivity. Current approaches based on sliding window analysis or time-varying coefficient models do not simultaneously capture both slow and abrupt changes in causal interactions between different brain regions. To overcome these limitations, we develop a unified framework based on a switching vector autoregressive (SVAR) model. Here, the dynamic connectivity regimes are uniquely characterized by distinct vector autoregressive (VAR) processes and allowed to switch between quasi-stationary brain states. The state evolution and the associated directed dependencies are defined by a Markov process and the SVAR parameters. We develop a three-stage estimation algorithm for the SVAR model: 1) feature extraction using time-varying VAR (TV-VAR) coefficients, 2) preliminary regime identification via clustering of the TV-VAR coefficients, 3) refined regime segmentation by Kalman smoothing and parameter estimation via expectation-maximization algorithm under a state-space formulation, using initial estimates from the previous two stages. The proposed framework is adaptive to state-related changes and gives reliable estimates of effective connectivity. Simulation results show that our method provides accurate regime change-point detection and connectivity estimates. In real applications to brain signals, the approach was able to capture directed connectivity state changes in functional magnetic resonance imaging data linked with changes in stimulus conditions, and in epileptic electroencephalograms, differentiating ictal from nonictal periods. The proposed framework accurately identifies state-dependent changes in brain network and provides estimates of connectivity strength and directionality. The proposed approach is useful in neuroscience studies that investigate the dynamics of underlying brain states.
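    The first two stages of the pipeline described above can be sketched with numpy alone (a simplified illustration with synthetic data: sliding-window least-squares VAR(1) coefficients, then a two-means clustering for the preliminary regime labels; the stage-3 Kalman/EM refinement is omitted):

    ```python
    import numpy as np

    def sliding_var1_coeffs(x, win):
        """x: (T, d) series -> (n_windows, d*d) vectors of VAR(1) coefficients."""
        T, d = x.shape
        coeffs = []
        for t0 in range(0, T - win, win // 2):        # half-overlapping windows
            seg = x[t0:t0 + win]
            X, Y = seg[:-1], seg[1:]
            A, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least squares: Y ~ X @ A
            coeffs.append(A.T.ravel())
        return np.array(coeffs)

    def two_means(F, iters=20):
        c = np.array([F[0], F[-1]])      # init centers from first/last window
        for _ in range(iters):
            lab = np.argmin(((F[:, None] - c[None]) ** 2).sum(-1), axis=1)
            c = np.array([F[lab == k].mean(0) for k in (0, 1)])
        return lab

    # Toy data: two regimes with opposite-sign VAR(1) dynamics, switch at t=200.
    rng = np.random.default_rng(42)
    A1, A2 = 0.9 * np.eye(2), -0.9 * np.eye(2)
    x = np.zeros((400, 2))
    for t in range(1, 400):
        A = A1 if t < 200 else A2
        x[t] = x[t - 1] @ A.T + rng.standard_normal(2)

    F = sliding_var1_coeffs(x, win=40)
    labels = two_means(F)
    # Windows before and after the switch fall into different clusters.
    ```

    The clustering gives only a coarse change-point estimate (windows straddling the switch are ambiguous); that is exactly what the state-space EM stage refines.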

  20. Theory, modeling, and simulation of structural and functional materials: Micromechanics, microstructures, and properties

    NASA Astrophysics Data System (ADS)

    Jin, Yongmei

    In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging of multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-spaces, and finite bodies with arbitrarily shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single-crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocations and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed, and comparisons with analytical predictions and experimental observations are presented. Agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon. 
The model does not impose a priori constraints on possible microstructure evolution paths. This gives the model predictive power: the material system itself "chooses" the optimal path for multiple processes. The advances made in this Thesis represent a significant step toward overcoming the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, the way to tackle the second challenge, bridging multiple length and time scales in materials modeling and simulation, is discussed based on connections between mesoscale Phase Field Microelasticity modeling and microscopic atomistic calculation as well as macroscopic continuum theory.

  1. A unified analytical drain current model for Double-Gate Junctionless Field-Effect Transistors including short channel effects

    NASA Astrophysics Data System (ADS)

    Raksharam; Dutta, Aloke K.

    2017-04-01

    In this paper, a unified analytical model for the drain current of a symmetric Double-Gate Junctionless Field-Effect Transistor (DG-JLFET) is presented. The operation of the device has been classified into four modes: subthreshold, semi-depleted, accumulation, and hybrid; with the main focus of this work being on the accumulation mode, which has not been dealt with in detail so far in the literature. A physics-based model, using a simplified one-dimensional approach, has been developed for this mode, and it has been successfully integrated with the model for the hybrid mode. It also includes the effect of carrier mobility degradation due to the transverse electric field, which was hitherto missing in the earlier models reported in the literature. The piece-wise models have been unified using suitable interpolation functions. In addition, the model includes two most important short-channel effects pertaining to DG-JLFETs, namely the Drain Induced Barrier Lowering (DIBL) and the Subthreshold Swing (SS) degradation. The model is completely analytical, and is thus computationally highly efficient. The results of our model have shown an excellent match with those obtained from TCAD simulations for both long- and short-channel devices, as well as with the experimental data reported in the literature.
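    The unification of piece-wise regional models with interpolation functions, mentioned in this record, can be illustrated generically (a hedged sketch with made-up branch expressions and parameter values; the paper's regional DG-JLFET models are not reproduced): blend a subthreshold exponential and an above-threshold square-law current with a logistic weight centered at the threshold voltage.

    ```python
    import numpy as np

    def unified_current(vg, vth=0.4, i0=1e-12, n=1.2, vt=0.0259, k=1e-4, s=0.02):
        """Smoothly blended drain current vs. gate voltage (all values illustrative).

        vth : threshold voltage (V); i0 : off-current scale (A)
        n, vt : subthreshold ideality factor and thermal voltage (V)
        k : square-law transconductance factor; s : transition smoothness (V)
        """
        i_sub = i0 * np.exp((vg - vth) / (n * vt))        # subthreshold branch
        i_on = k * np.maximum(vg - vth, 0.0) ** 2 + i0    # strong-conduction branch
        g = 1.0 / (1.0 + np.exp(-(vg - vth) / s))         # interpolation function
        return (1.0 - g) * i_sub + g * i_on

    vg = np.linspace(0.0, 1.0, 201)
    i_d = unified_current(vg)
    # The blended characteristic is smooth and monotonic through the transition,
    # which is the point of using an interpolation function instead of a
    # hard switch between regional models.
    ```

    The logistic weight guarantees continuity of the current and its derivatives at the regional boundary, which matters for convergence when such compact models are used inside circuit simulators.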

  2. An Empirical Analysis of Citizens' Acceptance Decisions of Electronic-Government Services: A Modification of the Unified Theory of Acceptance and Use of Technology (UTAUT) Model to Include Trust as a Basis for Investigation

    ERIC Educational Resources Information Center

    Awuah, Lawrence J.

    2012-01-01

    Understanding citizens' adoption of electronic-government (e-government) is an important topic, as the use of e-government has become an integral part of governance. Success of such initiatives depends largely on the efficient use of e-government services. The unified theory of acceptance and use of technology (UTAUT) model has provided a…

  3. Unified formalism for higher order non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-03-01

    This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.

  4. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work to develop a unified control/structures modeling and design capability for large space structures are presented. Recent analytical results demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. Parallel research by other investigators is reviewed. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization.

  5. SSBRP Communication & Data System Development using the Unified Modeling Language (UML)

    NASA Technical Reports Server (NTRS)

    Windrem, May; Picinich, Lou; Givens, John J. (Technical Monitor)

    1998-01-01

    The Unified Modeling Language (UML) is the standard method for specifying, visualizing, and documenting the artifacts of an object-oriented system under development. UML is the unification of the object-oriented methods developed by Grady Booch and James Rumbaugh, and of the Use Case Model developed by Ivar Jacobson. This paper discusses the application of UML by the Communications and Data Systems (CDS) team to model the ground control and command of the Space Station Biological Research Project (SSBRP) User Operations Facility (UOF). UML is used to define the context of the system, the logical static structure, the life history of objects, and the interactions among objects.

  6. Micromechanical modeling of damage growth in titanium based metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Sherwood, James A.; Quimby, Howard M.

    1994-01-01

    The thermomechanical behavior of continuous-fiber reinforced titanium based metal-matrix composites (MMC) is studied using the finite element method. A thermoviscoplastic unified state variable constitutive theory is employed to capture inelastic and strain-rate sensitive behavior in the Timetal-21s matrix. The SCS-6 fibers are modeled as thermoelastic. The effects of residual stresses generated during the consolidation process on the tensile response of the composites are investigated. Unidirectional and cross-ply geometries are considered. Differences between the tensile responses of composites with perfectly bonded and completely debonded fiber/matrix interfaces are discussed. Model simulations for the completely debonded-interface condition are shown to correlate well with experimental results.

  7. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both performance and energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvements in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme-scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details for both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  8. The multisensory basis of the self: From body to identity to others

    PubMed Central

    Tsakiris, Manos

    2017-01-01

    By grounding the self in the body, experimental psychology has taken the body as the starting point for a science of the self. One fundamental dimension of the bodily self is the sense of body ownership that refers to the special perceptual status of one’s own body, the feeling that “my body” belongs to me. The primary aim of this review article is to highlight recent advances in the study of body ownership and our understanding of the underlying neurocognitive processes in three ways. I first consider how the sense of body ownership has been investigated and elucidated in the context of multisensory integration. Beyond exteroception, recent studies have considered how this exteroceptively driven sense of body ownership can be linked to the other side of embodiment, that of the unobservable, yet felt, interoceptive body, suggesting that these two sides of embodiment interact to provide a unifying bodily self. Lastly, the multisensorial understanding of the self has been shown to have implications for our understanding of social relationships, especially in the context of self–other boundaries. Taken together, these three research strands motivate a unified model of the self inspired by current predictive coding models. PMID:27100132

  9. The multisensory basis of the self: From body to identity to others.

    PubMed

    Tsakiris, Manos

    2017-04-01

    By grounding the self in the body, experimental psychology has taken the body as the starting point for a science of the self. One fundamental dimension of the bodily self is the sense of body ownership that refers to the special perceptual status of one's own body, the feeling that "my body" belongs to me. The primary aim of this review article is to highlight recent advances in the study of body ownership and our understanding of the underlying neurocognitive processes in three ways. I first consider how the sense of body ownership has been investigated and elucidated in the context of multisensory integration. Beyond exteroception, recent studies have considered how this exteroceptively driven sense of body ownership can be linked to the other side of embodiment, that of the unobservable, yet felt, interoceptive body, suggesting that these two sides of embodiment interact to provide a unifying bodily self. Lastly, the multisensorial understanding of the self has been shown to have implications for our understanding of social relationships, especially in the context of self-other boundaries. Taken together, these three research strands motivate a unified model of the self inspired by current predictive coding models.

  10. Verifying Stability of Dynamic Soft-Computing Systems

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Napolitano, Marcello; Callahan, John

    1997-01-01

    Soft computing is a general term for algorithms that learn from human knowledge and mimic human skills. Examples of such algorithms are fuzzy inference systems and neural networks. Many applications, especially in control engineering, have demonstrated their appropriateness in building intelligent systems that are flexible and robust. Although recent research has shown that a certain class of neuro-fuzzy controllers can be proven bounded and stable, these proofs are implementation dependent and difficult to apply to the design and validation process. Many practitioners adopt a trial-and-error approach to system validation or resort to exhaustive testing using prototypes. In this paper, we describe our ongoing research toward establishing the necessary theoretical foundations, as well as building practical tools, for the verification and validation of soft-computing systems. A unified model for general neuro-fuzzy systems is adopted. Classic nonlinear control theory and recent results on its application to neuro-fuzzy systems are incorporated and applied to the unified model. It is hoped that general tools can be developed to help the designer visualize and manipulate the regions of stability and boundedness, much as Bode plots and root-locus plots have helped conventional control design and validation.

  11. Integrated catchment modelling within a strategic planning and decision making process: Werra case study

    NASA Astrophysics Data System (ADS)

    Dietrich, Jörg; Funke, Markus

    Integrated water resources management (IWRM) redefines conventional water management approaches through a closer cross-linkage between environment and society. The role of public participation and socio-economic considerations becomes more important within the planning and decision making process. In this paper we address aspects of the integration of catchment models into such a process taking the implementation of the European Water Framework Directive (WFD) as an example. Within a case study situated in the Werra river basin (Central Germany), a systems analytic decision process model was developed. This model uses the semantics of the Unified Modeling Language (UML) activity model. As an example application, the catchment model SWAT and the water quality model RWQM1 were applied to simulate the effect of phosphorus emissions from non-point and point sources on water quality. The decision process model was able to guide the participants of the case study through the interdisciplinary planning and negotiation of actions. Further improvements of the integration framework include tools for quantitative uncertainty analyses, which are crucial for real life application of models within an IWRM decision making toolbox. For the case study, the multi-criteria assessment of actions indicates that the polluter pays principle can be met at larger scales (sub-catchment or river basin) without significantly compromising cost efficiency for the local situation.

  12. A unified physical model of Seebeck coefficient in amorphous oxide semiconductor thin-film transistors

    NASA Astrophysics Data System (ADS)

    Lu, Nianduan; Li, Ling; Sun, Pengxiao; Banerjee, Writam; Liu, Ming

    2014-09-01

    A unified physical model of the Seebeck coefficient in amorphous oxide semiconductor thin-film transistors is presented, based on multiple-trapping-and-release theory. According to the proposed model, the Seebeck coefficient follows from Fermi-Dirac statistics combined with the energy-dependent trap density of states and the gate-voltage dependence of the quasi-Fermi level. The simulation results show that the gate-voltage, energy-disorder, and temperature dependence of the Seebeck coefficient can be well described, and the calculations show good agreement with experimental data on an amorphous In-Ga-Zn-O thin-film transistor.

  13. Unified Viscoplastic Behavior of Metal Matrix Composites

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Robinson, D. N.; Bartolotta, P. A.

    1992-01-01

    The need for unified constitutive models was recognized more than a decade ago in the results of phenomenological tests on monolithic metals that exhibited strong creep-plasticity interaction. Recently, metallic alloys have been combined to form high-temperature ductile/ductile composite materials, raising the natural question of whether these metallic composites exhibit the same phenomenological features as their monolithic constituents. This question is addressed in the context of a limited, yet definite (to illustrate creep/plasticity interaction) set of experimental data on the model metal matrix composite (MMC) system W/Kanthal. Furthermore, it is demonstrated that a unified viscoplastic representation, extended for unidirectional composites and correlated to W/Kanthal, can accurately predict the observed longitudinal composite creep/plasticity interaction response and strain rate dependency. Finally, the predicted influence of fiber orientation on the creep response of W/Kanthal is illustrated.

  14. Four Courses within a Discipline: UGA Unified Core

    ERIC Educational Resources Information Center

    Powell, Gwynn M.; Johnson, Corey W.; James, Joy; Dunlap, Rudy

    2013-01-01

    This article introduces the reader to the Unified Core Curriculum model developed and implemented at the University of Georgia (UGA). Four courses are taught as one course to the juniors coming into the Recreation and Leisure Studies major. An overview of the blended course and sample assignments are provided, as well as a discussion of challenges…

  15. Explanatory pluralism: An unrewarding prediction error for free energy theorists.

    PubMed

    Colombo, Matteo; Wright, Cory

    2017-03-01

    Courtesy of its free energy formulation, the hierarchical predictive processing theory of the brain (PTB) is often claimed to be a grand unifying theory. To test this claim, we examine a central case: activity of mesocorticolimbic dopaminergic (DA) systems. After reviewing the three most prominent hypotheses of DA activity-the anhedonia, incentive salience, and reward prediction error hypotheses-we conclude that the evidence currently vindicates explanatory pluralism. This vindication implies that the grand unifying claims of advocates of PTB are unwarranted. More generally, we suggest that the form of scientific progress in the cognitive sciences is unlikely to be a single overarching grand unifying theory. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Command and Control: Toward Arctic Unity of Command and Unity of Effort

    DTIC Science & Technology

    2011-05-19

    Russia, Norway, and Denmark) are in the process of preparing or have submitted territorial claims in the Arctic by way of this convention.58... longitude. The Unified Command Plan divides the Arctic region geographically among three GCCs. U.S. Northern Command (USNORTHCOM), U.S. European...2008, http://www.defense.gov/specials/unifiedcommand/images/unified-command_world-map.jpg (accessed November 22, 2010). While the Department of

  17. Unified Engineering Software System

    NASA Technical Reports Server (NTRS)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  18. Statistical Bayesian method for reliability evaluation based on ADT data

    NASA Astrophysics Data System (ADS)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are used to analyze degradation data, the latter being the more popular. However, limitations remain, such as an imprecise solution process and imprecise estimates of the degradation rate, which can affect the accuracy of the acceleration model and of the extrapolated values. Moreover, the usual Bayesian treatment of this problem loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian approach is proposed to handle degradation data and resolve these problems. First, a Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions at each stress level are calculated by iteratively updating the estimates; third, lifetime and reliability are estimated from the resulting parameters; finally, a case study demonstrates the validity of the proposed method. The results illustrate that the proposed method is effective and accurate in estimating the lifetime and reliability of a product.
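    As a minimal sketch of the Wiener-process setting used in such ADT work (not the paper's Bayesian updating scheme), the code below simulates a linear-drift degradation path X(t) = μt + σB(t), recovers (μ, σ) from the Gaussian increments, and estimates mean lifetime as the mean first-passage time to a failure threshold; all numbers are illustrative:

```python
import math
import random
import statistics

def simulate_degradation(mu, sigma, dt, n_steps, seed=0):
    """One Wiener degradation path X(t) = mu*t + sigma*B(t), sampled every dt."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def estimate_parameters(path, dt):
    """MLE of (mu, sigma) from the i.i.d. Gaussian increments of the path."""
    inc = [b - a for a, b in zip(path, path[1:])]
    mu_hat = statistics.mean(inc) / dt
    sigma_hat = statistics.pstdev(inc) / math.sqrt(dt)
    return mu_hat, sigma_hat

def mean_lifetime(mu_hat, threshold):
    """Mean first-passage time to `threshold` for a positive-drift Wiener process."""
    return threshold / mu_hat
```

    In a full ADT analysis the drift μ would additionally depend on the stress level through an acceleration model; this sketch only covers the single-level estimation step.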

  19. Psychometric evaluation of a unified Portuguese-language version of the Body Shape Questionnaire in female university students.

    PubMed

    Silva, Wanderson Roberto; Costa, David; Pimenta, Filipa; Maroco, João; Campos, Juliana Alvares Duarte Bonini

    2016-07-21

    The objectives of this study were to develop a unified Portuguese-language version of the Body Shape Questionnaire (BSQ) for use in Brazil and Portugal, and to estimate its validity, reliability, and internal consistency in Brazilian and Portuguese female university students. Confirmatory factor analysis was performed using both the original (34-item) and shortened (8-item) versions. Model fit was assessed with χ²/df, CFI, NFI, and RMSEA. Concurrent and convergent validity were assessed. Reliability was estimated through internal consistency and composite reliability (α). Transnational invariance of the BSQ was tested using multi-group analysis. The original 34-item model was refined to yield a better fit and adequate validity and reliability. The shortened model was stable in both independent samples and in the transnational samples (Brazil and Portugal). The use of this unified version is recommended for the assessment of body shape concerns in both Brazilian and Portuguese college students.
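    The fit statistics named in this abstract (χ²/df, CFI, RMSEA) can be computed directly from the model and null-model chi-square values. This sketch uses the standard textbook formulas with hypothetical inputs, not the study's results:

```python
import math

def fit_indices(chi2_m, df_m, chi2_0, df_0, n):
    """Common SEM fit indices from the fitted model's chi-square (chi2_m, df_m),
    the null/baseline model's chi-square (chi2_0, df_0), and sample size n."""
    d_m = max(chi2_m - df_m, 0.0)   # model noncentrality
    d_0 = max(chi2_0 - df_0, 0.0)   # null-model noncentrality
    return {
        "chi2/df": chi2_m / df_m,
        "CFI": 1.0 - d_m / max(d_0, d_m, 1e-12),
        "RMSEA": math.sqrt(d_m / (df_m * (n - 1))),
    }
```

    With these hypothetical values, χ²/df = 2.5, CFI ≈ 0.92, and RMSEA ≈ 0.055, which would sit near conventional "acceptable fit" cutoffs (CFI ≥ 0.90, RMSEA ≤ 0.06-0.08).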

  20. Object-oriented software design in semiautomatic building extraction

    NASA Astrophysics Data System (ADS)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent sequences of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.

  1. A Neurobehavioral Model of Flexible Spatial Language Behaviors

    PubMed Central

    Lipinski, John; Schneegans, Sebastian; Sandamirskaya, Yulia; Spencer, John P.; Schöner, Gregor

    2012-01-01

    We propose a neural dynamic model that specifies how low-level visual processes can be integrated with higher level cognition to achieve flexible spatial language behaviors. This model uses real-world visual input that is linked to relational spatial descriptions through a neural mechanism for reference frame transformations. We demonstrate that the system can extract spatial relations from visual scenes, select items based on relational spatial descriptions, and perform reference object selection in a single unified architecture. We further show that the performance of the system is consistent with behavioral data in humans by simulating results from 2 independent empirical studies, 1 spatial term rating task and 1 study of reference object selection behavior. The architecture we present thereby achieves a high degree of task flexibility under realistic stimulus conditions. At the same time, it also provides a detailed neural grounding for complex behavioral and cognitive processes. PMID:21517224

  2. A unified algorithm for predicting partition coefficients for PBPK modeling of drugs and environmental chemicals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peyret, Thomas; Poulin, Patrick; Krishnan, Kannan, E-mail: kannan.krishnan@umontreal.ca

    The algorithms in the literature for predicting the tissue:blood partition coefficient (P_tb) of environmental chemicals and the tissue:plasma partition coefficient of drugs based on total (K_p) or unbound (K_pu) concentration differ in their consideration of binding to hemoglobin, plasma proteins, and charged phospholipids. The objective of the present study was to develop a unified algorithm such that P_tb, K_p, and K_pu could be predicted for both drugs and environmental chemicals. The unified algorithm was developed by integrating all previously published mechanistic algorithms for computing these PCs. Furthermore, the algorithm was structured in such a way as to facilitate predictions of the distribution of organic compounds at the macro (i.e., whole tissue) and micro (i.e., cells and fluids) levels. The resulting unified algorithm was applied to compute the rat P_tb, K_p, or K_pu of muscle (n = 174), liver (n = 139), and adipose tissue (n = 141) for acidic, neutral, zwitterionic, and basic drugs, as well as ketones, acetate esters, alcohols, aliphatic hydrocarbons, aromatic hydrocarbons, and ethers. The unified algorithm adequately reproduced the values predicted by the previously published algorithms for a total of 142 drugs and chemicals. A sensitivity analysis demonstrated the relative importance of the various compound properties reflecting the mechanistic determinants relevant to predicting PC values of drugs and environmental chemicals. Overall, the present unified algorithm uniquely facilitates the computation of macro- and micro-level PCs for developing organ- and cellular-level PBPK models for both chemicals and drugs.
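    Mechanistic PC algorithms of the kind unified here typically balance a compound's solubility in tissue lipid and water against its solubility in plasma lipid and water. The sketch below is a heavily simplified illustration of that structure (no protein binding or ionization, invented volume fractions), not the paper's unified algorithm:

```python
def kp_tissue_plasma(log_p, f_lipid_t, f_water_t, f_lipid_p=0.007, f_water_p=0.93):
    """Toy tissue:plasma partition coefficient from lipid/water solubility only
    (no protein binding, no ionisation); fractions are invented illustrations."""
    p = 10.0 ** log_p   # octanol:water partition coefficient of the compound
    return (p * f_lipid_t + f_water_t) / (p * f_lipid_p + f_water_p)

# A lipophilic compound (log P = 3) partitions far more into adipose than muscle:
kp_adipose = kp_tissue_plasma(3.0, f_lipid_t=0.85, f_water_t=0.15)
kp_muscle = kp_tissue_plasma(3.0, f_lipid_t=0.022, f_water_t=0.76)
```

    The unified algorithm in the abstract adds the terms this sketch omits (hemoglobin, plasma protein, and charged-phospholipid binding), which is what lets one framework cover drugs and environmental chemicals alike.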

  3. Vector Observation-Aided/Attitude-Rate Estimation Using Global Positioning System Signals

    NASA Technical Reports Server (NTRS)

    Oshman, Yaakov; Markley, F. Landis

    1997-01-01

    A sequential filtering algorithm is presented for attitude and attitude-rate estimation from Global Positioning System (GPS) differential carrier phase measurements. A third-order, minimal-parameter method for solving the attitude matrix kinematic equation is used to parameterize the filter's state, which renders the resulting estimator computationally efficient. Borrowing from tracking theory concepts, the angular acceleration is modeled as an exponentially autocorrelated stochastic process, thus avoiding the use of the uncertain spacecraft dynamic model. The new formulation facilitates the use of aiding vector observations in a unified filtering algorithm, which can enhance the method's robustness and accuracy. Numerical examples are used to demonstrate the performance of the method.
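    The exponentially autocorrelated acceleration model borrowed from tracking theory is a first-order Gauss-Markov process, which in discrete time propagates as a[k+1] = φ·a[k] + w[k] with φ = exp(-Δt/τ). The sketch below simulates such a sequence and checks its lag-1 autocorrelation; the parameter values are illustrative, not the paper's:

```python
import math
import random

def gauss_markov(tau, sigma, dt, n, seed=7):
    """First-order Gauss-Markov (exponentially autocorrelated) sequence:
    a[k+1] = phi*a[k] + w[k], phi = exp(-dt/tau), stationary std sigma."""
    phi = math.exp(-dt / tau)
    q = sigma * math.sqrt(1.0 - phi * phi)   # driving noise keeps Var = sigma^2
    rng = random.Random(seed)
    a = [rng.gauss(0.0, sigma)]
    for _ in range(n - 1):
        a.append(phi * a[-1] + rng.gauss(0.0, q))
    return a

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; approaches phi for long sequences."""
    m = sum(x) / len(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    den = sum((a - m) ** 2 for a in x)
    return num / den
```

    In a filter, this propagation rule replaces an explicit spacecraft dynamics model: the state transition uses φ and the process-noise covariance uses q², so no torque model is required.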

  4. Heterogeneous continuous-time random walks

    NASA Astrophysics Data System (ADS)

    Grebenkov, Denis S.; Tupikina, Liubov

    2018-01-01

    We introduce a heterogeneous continuous-time random walk (HCTRW) model as a versatile analytical formalism for studying and modeling diffusion processes in heterogeneous structures, such as porous or disordered media, multiscale or crowded environments, weighted graphs or networks. We derive the exact form of the propagator and investigate the effects of spatiotemporal heterogeneities onto the diffusive dynamics via the spectral properties of the generalized transition matrix. In particular, we show how the distribution of first-passage times changes due to local and global heterogeneities of the medium. The HCTRW formalism offers a unified mathematical language to address various diffusion-reaction problems, with numerous applications in material sciences, physics, chemistry, biology, and social sciences.
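    As a toy illustration of the HCTRW idea (site-dependent waiting times on a graph), the sketch below estimates mean first-passage times on a 4-node ring by Monte Carlo; making one site's sojourn time longer changes the first-passage time, the kind of local-heterogeneity effect the abstract describes. The graph and waiting scales are invented for illustration:

```python
import random

def mean_fpt(adj, wait, start, target, n_trials=2000, seed=2):
    """Monte Carlo mean first-passage time of a continuous-time random walk
    where each node has its own exponential sojourn-time scale (heterogeneity)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        node, t = start, 0.0
        while node != target:
            t += rng.expovariate(1.0 / wait[node])   # site-dependent waiting time
            node = rng.choice(adj[node])             # unbiased jump to a neighbour
        total += t
    return total / n_trials

# 4-node ring; walk from node 0 to the opposite node 2.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
homog = mean_fpt(ring, {0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0}, start=0, target=2)
slow1 = mean_fpt(ring, {0: 1.0, 1: 10.0, 2: 1.0, 3: 1.0}, start=0, target=2)
```

    For the homogeneous ring the exact mean is 4 time units; slowing the single site 1 raises it substantially, showing how a local heterogeneity reshapes first-passage statistics.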

  5. Exploring the complementarity of THz pulse imaging and DCE-MRIs: Toward a unified multi-channel classification and a deep learning framework.

    PubMed

    Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S

    2016-12-01

    We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly taking into consideration advances in multi-resolution analysis and model based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Non-extensivity vs. informative moments for financial models - A unifying framework and empirical results

    NASA Astrophysics Data System (ADS)

    Herrmann, K.

    2009-11-01

    Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, there have been two very similar approaches evolving during the last years, one in the so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework. The latter allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models - with similar flexibility in terms of skewness and kurtosis - lead to very similar results.
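    Both strands discussed here generalize GARCH processes to capture kurtosis. As a baseline for that discussion, the sketch below simulates a plain GARCH(1,1) with Gaussian innovations, whose returns already show excess kurtosis relative to an i.i.d. Gaussian sample. Parameter values are illustrative:

```python
import math
import random

def simulate_garch(omega, alpha, beta, n, seed=3):
    """Returns r_t = sigma_t*z_t with GARCH(1,1) conditional variance
    sigma_t^2 = omega + alpha*r_{t-1}^2 + beta*sigma_{t-1}^2."""
    rng = random.Random(seed)
    var = omega / (1.0 - alpha - beta)   # start at the stationary variance
    r = []
    for _ in range(n):
        ret = math.sqrt(var) * rng.gauss(0.0, 1.0)
        r.append(ret)
        var = omega + alpha * ret * ret + beta * var
    return r

def excess_kurtosis(x):
    """Sample excess kurtosis: 0 for a Gaussian, positive for fat tails."""
    m = sum(x) / len(x)
    m2 = sum((v - m) ** 2 for v in x) / len(x)
    m4 = sum((v - m) ** 4 for v in x) / len(x)
    return m4 / (m2 * m2) - 3.0
```

    Even with Gaussian innovations, volatility clustering makes the unconditional return distribution fat-tailed; the entropy-based generalizations in the abstract aim to model that kurtosis more flexibly still.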

  7. OmniPHR: A distributed architecture model to integrate personal health records.

    PubMed

    Roehrs, Alex; da Costa, Cristiano André; da Rosa Righi, Rodrigo

    2017-07-01

    Advances in Information and Communications Technology (ICT) have brought many benefits to the healthcare area, especially the digital storage of patients' health records. However, it is still a challenge to obtain a unified view of a patient's health history, because health data are typically scattered among different health organizations. Furthermore, there are several standards for these records, some open and others proprietary. Usually health records are stored in databases within health organizations and rarely have external access. This situation applies mainly to cases where patients' data are maintained by healthcare providers, known as EHRs (Electronic Health Records). In the case of PHRs (Personal Health Records), in which patients by definition can manage their health records, they usually have no control over their data stored in healthcare providers' databases. We therefore envision two main challenges in the PHR context: first, how patients can obtain a unified view of their scattered health records, and second, how healthcare providers can access up-to-date data about their patients, even though changes occurred elsewhere. To address these issues, this work proposes OmniPHR, a distributed model to integrate PHRs for use by patients and healthcare providers. The scientific contribution is an architecture model to support a distributed PHR, in which patients can maintain their health history in a unified viewpoint, from any device, anywhere, and healthcare providers can have their patients' data interconnected among health organizations. The evaluation demonstrates the feasibility of the model in maintaining distributed health records in an architecture that provides a unified view of the PHR with elasticity and scalability. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. A Unified Model of Performance: Validation of its Predictions across Different Sleep/Wake Schedules

    PubMed Central

    Ramakrishnan, Sridhar; Wesensten, Nancy J.; Balkin, Thomas J.; Reifman, Jaques

    2016-01-01

    Study Objectives: Historically, mathematical models of human neurobehavioral performance developed on data from one sleep study were limited to predicting performance in similar studies, restricting their practical utility. We recently developed a unified model of performance (UMP) to predict the effects of the continuum of sleep loss—from chronic sleep restriction (CSR) to total sleep deprivation (TSD) challenges—and validated it using data from two studies of one laboratory. Here, we significantly extended this effort by validating the UMP predictions across a wide range of sleep/wake schedules from different studies and laboratories. Methods: We developed the UMP on psychomotor vigilance task (PVT) lapse data from one study encompassing four different CSR conditions (7 d of 3, 5, 7, and 9 h of sleep/night), and predicted performance in five other studies (from four laboratories), including different combinations of TSD (40 to 88 h), CSR (2 to 6 h of sleep/night), control (8 to 10 h of sleep/night), and nap (nocturnal and diurnal) schedules. Results: The UMP accurately predicted PVT performance trends across 14 different sleep/wake conditions, yielding average prediction errors between 7% and 36%, with the predictions lying within 2 standard errors of the measured data 87% of the time. In addition, the UMP accurately predicted performance impairment (average error of 15%) for schedules (TSD and naps) not used in model development. Conclusions: The unified model of performance can be used as a tool to help design sleep/wake schedules to optimize the extent and duration of neurobehavioral performance and to accelerate recovery after sleep loss. Citation: Ramakrishnan S, Wesensten NJ, Balkin TJ, Reifman J. A unified model of performance: validation of its predictions across different sleep/wake schedules. SLEEP 2016;39(1):249–262. PMID:26518594

  9. A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine

    PubMed Central

    Ramakrishnan, Sridhar; Wesensten, Nancy J.; Kamimori, Gary H.; Moon, James E.; Balkin, Thomas J.; Reifman, Jaques

    2016-01-01

Study Objectives: Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. Methods: We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied the performance estimated in the absence of caffeine by a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine). We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). Results: The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (6% < error < 27%), yielding greater accuracy for mild and moderate sleep loss conditions than for more severe cases. Overall, accounting for the effects of caffeine resulted in improved predictions (after caffeine consumption) by up to 70%. Conclusions: The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. Citation: Ramakrishnan S, Wesensten NJ, Kamimori GH, Moon JE, Balkin TJ, Reifman J. A unified model of performance for predicting the effects of sleep and caffeine. SLEEP 2016;39(10):1827–1841. PMID:27397562
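
The multiplicative structure described in this record can be sketched in a few lines. Everything below is an illustrative assumption made for exposition, not the published UMP parameterization: the one-compartment absorption/elimination profile, the saturation at 200 mg, and the amplitude `M0` are all hypothetical.

```python
import math

def caffeine_factor(dose_mg, hours_since_dose, M0=1.1, k_a=1.0, k_e=0.14):
    """Dose-dependent multiplicative factor g <= 1 applied to the predicted
    caffeine-free PVT lapses. All parameters are illustrative assumptions:
    M0 sets the effect size, k_a and k_e (1/h) are the absorption and
    elimination rates of a one-compartment profile, and the dose response
    saturates at 200 mg."""
    if dose_mg <= 0 or hours_since_dose <= 0:
        return 1.0
    # Bateman-type concentration curve, normalized so its peak equals 1.
    conc = math.exp(-k_e * hours_since_dose) - math.exp(-k_a * hours_since_dose)
    t_peak = math.log(k_a / k_e) / (k_a - k_e)
    conc /= math.exp(-k_e * t_peak) - math.exp(-k_a * t_peak)
    effect = min(M0 * dose_mg / 200.0, M0) * conc
    return 1.0 / (1.0 + effect)

def predicted_lapses(baseline_lapses, dose_mg, hours_since_dose):
    """UMP-style combination: caffeine multiplies the caffeine-free estimate."""
    return baseline_lapses * caffeine_factor(dose_mg, hours_since_dose)
```

With `dose_mg=0` the factor is exactly 1, so a placebo condition reproduces the caffeine-free prediction, which is the key property of the multiplicative formulation.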

  10. Generalized Information Theory Meets Human Cognition: Introducing a Unified Framework to Model Uncertainty and Information Search.

    PubMed

    Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya

    2018-06-17

    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
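
The Sharma-Mittal family referenced above can be made concrete with a short sketch using the standard two-parameter formula; the special cases noted in the comments (Shannon, Rényi, Tsallis) follow as limits of the general expression.

```python
import math

def sharma_mittal(p, r, t, eps=1e-9):
    """Sharma-Mittal entropy of a discrete distribution p, with order r and
    degree t. Shannon (r, t -> 1), Renyi (t -> 1), and Tsallis (t = r)
    entropies all arise as limits or special cases of this family."""
    s = sum(pi ** r for pi in p if pi > 0)
    shannon = -sum(pi * math.log(pi) for pi in p if pi > 0)
    if abs(r - 1) < eps and abs(t - 1) < eps:
        return shannon                                      # Shannon limit
    if abs(t - 1) < eps:
        return math.log(s) / (1 - r)                        # Renyi entropy of order r
    if abs(r - 1) < eps:
        return (math.exp((1 - t) * shannon) - 1) / (1 - t)  # r -> 1 limit
    return (s ** ((1 - t) / (1 - r)) - 1) / (1 - t)         # general Sharma-Mittal
```

For a uniform distribution over four outcomes, the Shannon (r = t = 1) and Rényi (r = 2, t = 1) values are both ln 4 ≈ 1.386, while the Tsallis entropy of degree 2 (r = t = 2) equals 0.75, illustrating how one formalism spans the different uncertainty measures discussed in the article.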

  11. The three constituencies of the state: why the state has lost unifying energy.

    PubMed

    King, Desmond; Le Galès, Patrick

    2017-11-01

We address resurgent populism by examining structural processes of state transformation in the UK, the US and France. Scholars stress the 'unifying energy of the state', a set of institutions and policies capable of limiting inequalities and defending legal regimes. One characteristic of modern Western statehood was the package of policies designed to integrate social groups and territories, in part by ensuring common standards of provision and social citizenship across the nation state. This echoes James Scott's critical analysis of the modernist project of the state (1998). This 'unifying energy' had different origins, including nationalist movements, combatting external influence or powers, war, and preparing citizens for the rigours of industrialization. Overcoming class differences and territorial differences (including cultural, social and economic differences) was a major source of mobilization to feed this 'unifying energy of the state' in France, Italy or Spain, for instance. Political and cultural identities are related in significant part to respective nation states. We argue that this 'unifying energy' was an essential component of statehood in Europe and in the US. It is now largely lost. We explain why, and the significance of its displacement. © London School of Economics and Political Science 2017.

  12. Applying a Lifespan Developmental Perspective to Chronic Pain: Pediatrics to Geriatrics.

    PubMed

    Walco, Gary A; Krane, Elliot J; Schmader, Kenneth E; Weiner, Debra K

    2016-09-01

    An ideal taxonomy of chronic pain would be applicable to people of all ages. Developmental sciences focus on lifespan developmental approaches, and view the trajectory of processes in the life course from birth to death. In this article we provide a review of lifespan developmental models, describe normal developmental processes that affect pain processing, and identify deviations from those processes that lead to stable individual differences of clinical interest, specifically the development of chronic pain syndromes. The goals of this review were 1) to unify what are currently separate purviews of "pediatric pain," "adult pain," and "geriatric pain," and 2) to generate models so that specific elements of the chronic pain taxonomy might include important developmental considerations. A lifespan developmental model is applied to the forthcoming Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks-American Pain Society Pain Taxonomy to ascertain the degree to which general "adult" descriptions apply to pediatric and geriatric populations, or if age- or development-related considerations need to be invoked. Copyright © 2016. Published by Elsevier Inc.

  13. Analysis and Management of Animal Populations: Modeling, Estimation and Decision Making

    USGS Publications Warehouse

    Williams, B.K.; Nichols, J.D.; Conroy, M.J.

    2002-01-01

This book deals with the processes involved in making informed decisions about the management of animal populations. It covers the modeling of population responses to management actions, the estimation of quantities needed in the modeling effort, and the application of these estimates and models to the development of sound management decisions. The book synthesizes and integrates in a single volume the methods associated with these themes, as they apply to ecological assessment and conservation of animal populations. Key features: integrates population modeling, parameter estimation and decision-theoretic approaches to management in a single, cohesive framework; provides authoritative, state-of-the-art descriptions of quantitative approaches to modeling, estimation and decision-making; emphasizes the role of mathematical modeling in the conduct of science and management; utilizes a unifying biological context, consistent mathematical notation, and numerous biological examples.

  14. Factors Affecting Acceptance & Use of ReWIND: Validating the Extended Unified Theory of Acceptance and Use of Technology

    ERIC Educational Resources Information Center

    Nair, Pradeep Kumar; Ali, Faizan; Leong, Lim Chee

    2015-01-01

    Purpose: This study aims to explain the factors affecting students' acceptance and usage of a lecture capture system (LCS)--ReWIND--in a Malaysian university based on the extended unified theory of acceptance and use of technology (UTAUT2) model. Technological advances have become an important feature of universities' plans to improve the…

  15. Laminar Cortical Dynamics of Cognitive and Motor Working Memory, Sequence Learning and Performance: Toward a Unified Theory of How the Cerebral Cortex Works

    ERIC Educational Resources Information Center

    Grossberg, Stephen; Pearson, Lance R.

    2008-01-01

    How does the brain carry out working memory storage, categorization, and voluntary performance of event sequences? The LIST PARSE neural model proposes an answer that unifies the explanation of cognitive, neurophysiological, and anatomical data. It quantitatively simulates human cognitive data about immediate serial recall and free recall, and…

  16. Effects of Age on Auditory and Cognitive Processing: Implications for Hearing Aid Fitting and Audiologic Rehabilitation

    PubMed Central

    Pichora-Fuller, M. Kathleen; Singh, Gurjit

    2006-01-01

    Recent advances in research and clinical practice concerning aging and auditory communication have been driven by questions about age-related differences in peripheral hearing, central auditory processing, and cognitive processing. A “site-of-lesion” view based on anatomic levels inspired research to test competing hypotheses about the contributions of changes at these three levels of the nervous system. A “processing” view based on psychologic functions inspired research to test alternative hypotheses about how lower-level sensory processes and higher-level cognitive processes interact. In the present paper, we suggest that these two views can begin to be unified following the example set by the cognitive neuroscience of aging. The early pioneers of audiology anticipated such a unified view, but today, advances in science and technology make it both possible and necessary. Specifically, we argue that a synthesis of new knowledge concerning the functional neuroscience of auditory cognition is necessary to inform the design and fitting of digital signal processing in “intelligent” hearing devices, as well as to inform best practices for resituating hearing aid fitting in a broader context of audiologic rehabilitation. Long-standing approaches to rehabilitative audiology should be revitalized to emphasize the important role that training and therapy play in promoting compensatory brain reorganization as older adults acclimatize to new technologies. The purpose of the present paper is to provide an integrated framework for understanding how auditory and cognitive processing interact when older adults listen, comprehend, and communicate in realistic situations, to review relevant models and findings, and to suggest how new knowledge about age-related changes in audition and cognition may influence future developments in hearing aid fitting and audiologic rehabilitation. PMID:16528429

  17. Why trace and delay conditioning are sometimes (but not always) hippocampal dependent: A computational model

    PubMed Central

    Moustafa, Ahmed A.; Wufong, Ella; Servatius, Richard J.; Pang, Kevin C. H.; Gluck, Mark A.; Myers, Catherine E.

    2013-01-01

    A recurrent-network model provides a unified account of the hippocampal region in mediating the representation of temporal information in classical eyeblink conditioning. Much empirical research is consistent with a general conclusion that delay conditioning (in which the conditioned stimulus CS and unconditioned stimulus US overlap and co-terminate) is independent of the hippocampal system, while trace conditioning (in which the CS terminates before US onset) depends on the hippocampus. However, recent studies show that, under some circumstances, delay conditioning can be hippocampal-dependent and trace conditioning can be spared following hippocampal lesion. Here, we present an extension of our prior trial-level models of hippocampal function and stimulus representation that can explain these findings within a unified framework. Specifically, the current model includes adaptive recurrent collateral connections that aid in the representation of intra-trial temporal information. With this model, as in our prior models, we argue that the hippocampus is not specialized for conditioned response timing, but rather is a general-purpose system that learns to predict the next state of all stimuli given the current state of variables encoded by activity in recurrent collaterals. As such, the model correctly predicts that hippocampal involvement in classical conditioning should be critical not only when there is an intervening trace interval, but also when there is a long delay between CS onset and US onset. Our model simulates empirical data from many variants of classical conditioning, including delay and trace paradigms in which the length of the CS, the inter-stimulus interval, or the trace interval is varied. Finally, we discuss model limitations, future directions, and several novel empirical predictions of this temporal processing model of hippocampal function and learning. PMID:23178699
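
The model's central claim, that predicting events across a trace interval requires a recurrent temporal representation, can be illustrated with a deliberately simplified next-state predictor. This is a pedagogical sketch, not the authors' network: an idealized one-hot time index stands in for the recurrent-collateral activity the model actually learns.

```python
import numpy as np

# Toy trace-conditioning trial: rows are time steps, columns are [CS, US].
# The CS terminates before the US arrives, leaving an ambiguous "gap".
trial = np.array([[1, 0], [1, 0], [0, 0], [0, 0], [0, 1]], float)

def worst_prediction_error(features):
    """Fit a least-squares linear map from the current feature vector to the
    next stimulus state, and return its worst-case absolute error."""
    X, Y = features[:-1], trial[1:]
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.abs(X @ W - Y).max()

# 1) Stimulus-only input: the two identical gap states [0, 0] have different
#    successors, so no memoryless map can predict the US onset.
err_plain = worst_prediction_error(trial)

# 2) Append an idealized temporal code (a one-hot time index standing in for
#    recurrent-collateral activity): next-state prediction becomes solvable.
err_timed = worst_prediction_error(np.hstack([trial, np.eye(len(trial))]))
```

Here `err_plain` stays at 1 (the unpredictable US onset), while `err_timed` drops to numerical zero once elapsed time within the trial is represented, mirroring why hippocampal temporal coding matters for trace conditioning.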

  18. Lagrangian-Hamiltonian unified formalism for autonomous higher order dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2011-09-01

    The Lagrangian-Hamiltonian unified formalism of Skinner and Rusk was originally stated for autonomous dynamical systems in classical mechanics. It has been generalized for non-autonomous first-order mechanical systems, as well as for first-order and higher order field theories. However, a complete generalization to higher order mechanical systems is yet to be described. In this work, after reviewing the natural geometrical setting and the Lagrangian and Hamiltonian formalisms for higher order autonomous mechanical systems, we develop a complete generalization of the Lagrangian-Hamiltonian unified formalism for these kinds of systems, and we use it to analyze some physical models from this new point of view.

  19. Unified reduction principle for the evolution of mutation, migration, and recombination

    PubMed Central

    Altenberg, Lee; Liberman, Uri; Feldman, Marcus W.

    2017-01-01

    Modifier-gene models for the evolution of genetic information transmission between generations of organisms exhibit the reduction principle: Selection favors reduction in the rate of variation production in populations near equilibrium under a balance of constant viability selection and variation production. Whereas this outcome has been proven for a variety of genetic models, it has not been proven in general for multiallelic genetic models of mutation, migration, and recombination modification with arbitrary linkage between the modifier and major genes under viability selection. We show that the reduction principle holds for all of these cases by developing a unifying mathematical framework that characterizes all of these evolutionary models. PMID:28265103

  20. PERSPECTIVE: Physical aspects of cancer invasion

    NASA Astrophysics Data System (ADS)

    Guiot, Caterina; Pugno, Nicola; Delsanto, Pier Paolo; Deisboeck, Thomas S.

    2007-12-01

Invasiveness, one of the hallmarks of tumor progression, represents the tumor's ability to expand into the host tissue by means of several complex biochemical and biomechanical processes. Since certain aspects of the problem present a striking resemblance with well-known physical mechanisms, such as the mechanical insertion of a solid inclusion in an elastic material specimen (G Eaves 1973 The invasive growth of malignant tumours as a purely mechanical process J. Pathol. 109 233; C Guiot, N Pugno and P P Delsanto 2006 Elastomechanical model of tumor invasion Appl. Phys. Lett. 89 233901) or a water drop impinging on a surface (C Guiot, P P Delsanto and T S Deisboeck 2007 Morphological instability and cancer invasion: a 'splashing water drop' analogy Theor. Biol. Med. Model 4 4), we propose here an analogy between these physical processes and a cancer system's invasive branching into the surrounding tissue. Accounting for its solid and viscous properties, we then arrive, as a unifying model, at an analogy with a granular solid. While our model has been explicitly formulated for multicellular tumor spheroids in vitro, it should also contribute to a better understanding of tumor invasion in vivo.

  1. Do intuitive and deliberate judgments rely on two distinct neural systems? A case study in face processing

    PubMed Central

    Mega, Laura F.; Gigerenzer, Gerd; Volz, Kirsten G.

    2015-01-01

Arguably the most influential models of human decision-making today are based on the assumption that two separable systems – intuition and deliberation – underlie the judgments that people make. Our recent work is among the first to present neural evidence contrary to the predictions of these dual-systems accounts. We measured brain activations using functional magnetic resonance imaging while participants were specifically instructed to either intuitively or deliberately judge the authenticity of emotional facial expressions. Results from three different analyses revealed both common brain networks of activation across decision mode and differential activations as a function of strategy adherence. We take our results to contradict popular dual-systems accounts that propose a clear-cut dichotomy of the processing systems, and rather to support a unified model. According to this, intuitive and deliberate judgment processes rely on the same rules, though only the former are thought to be characterized by non-conscious processing. PMID:26379523

  2. Bayesian model of categorical effects in L1 and L2 speech perception

    NASA Astrophysics Data System (ADS)

    Kronrod, Yakov

    In this dissertation I present a model that captures categorical effects in both first language (L1) and second language (L2) speech perception. In L1 perception, categorical effects range between extremely strong for consonants to nearly continuous perception of vowels. I treat the problem of speech perception as a statistical inference problem and by quantifying categoricity I obtain a unified model of both strong and weak categorical effects. In this optimal inference mechanism, the listener uses their knowledge of categories and the acoustics of the signal to infer the intended productions of the speaker. The model splits up speech variability into meaningful category variance and perceptual noise variance. The ratio of these two variances, which I call Tau, directly correlates with the degree of categorical effects for a given phoneme or continuum. By fitting the model to behavioral data from different phonemes, I show how a single parametric quantitative variation can lead to the different degrees of categorical effects seen in perception experiments with different phonemes. In L2 perception, L1 categories have been shown to exert an effect on how L2 sounds are identified and how well the listener is able to discriminate them. Various models have been developed to relate the state of L1 categories with both the initial and eventual ability to process the L2. These models largely lacked a formalized metric to measure perceptual distance, a means of making a-priori predictions of behavior for a new contrast, and a way of describing non-discrete gradient effects. In the second part of my dissertation, I apply the same computational model that I used to unify L1 categorical effects to examining L2 perception. I show that we can use the model to make the same type of predictions as other SLA models, but also provide a quantitative framework while formalizing all measures of similarity and bias. 
Further, I show that, by using this model to examine L2 learners at different stages of development, we can track specific category parameters as they change over time, offering a window into the actual process of L2 category development.
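
Under the standard Gaussian assumptions that make this inference tractable, the posterior-mean percept is a precision-weighted compromise between the heard signal and the category mean, and the Tau parameter described above is the governing variance ratio. The sketch below covers only the single-category case (the dissertation's full model handles multiple competing categories):

```python
def perceived(stimulus, category_mean, tau):
    """Posterior-mean percept for a single Gaussian category (illustrative).
    tau = meaningful category variance / perceptual noise variance.
    Small tau pulls the percept toward the category mean (strong categorical
    effects, as with consonants); large tau leaves the percept close to the
    acoustic stimulus (near-continuous perception, as with vowels)."""
    return (tau * stimulus + category_mean) / (tau + 1.0)
```

For example, `perceived(2.0, 0.0, 0.1)` ≈ 0.18 (a strong pull toward the category mean), while `perceived(2.0, 0.0, 10.0)` ≈ 1.82 (nearly veridical), showing how a single parameter spans the continuum of categorical effects.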

  3. Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation

    NASA Astrophysics Data System (ADS)

    Anisenkov, A. V.

    2018-03-01

In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment performed at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and provide a high degree of their accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in the development of a unified description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and integrate the supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).

  4. A unified approach to computational drug discovery.

    PubMed

    Tseng, Chih-Yuan; Tuszynski, Jack

    2015-11-01

It has been reported that a slowdown in the development of new medical therapies is affecting clinical outcomes. The FDA has thus initiated the Critical Path Initiative project investigating better approaches. We review the current strategies in drug discovery and focus on the advantages of the maximum entropy method being introduced in this area. The maximum entropy principle is derived from statistical thermodynamics and has been demonstrated to be an inductive inference tool. We propose a unified approach to drug discovery that hinges on robust information processing using entropic inductive inference. Increasingly, applications of maximum entropy in drug discovery employ this unified approach and demonstrate the usefulness of the concept in the area of pharmaceutical sciences. Copyright © 2015. Published by Elsevier Ltd.

  5. Hilltop supernatural inflation and SUSY unified models

    NASA Astrophysics Data System (ADS)

    Kohri, Kazunori; Lim, C. S.; Lin, Chia-Min; Mimura, Yukihiro

    2014-01-01

In this paper, we consider high-scale (100 TeV) supersymmetry (SUSY) breaking and realize the idea of hilltop supernatural inflation in concrete particle physics models based on flipped SU(5) and Pati-Salam models in the framework of supersymmetric grand unified theories (SUSY GUTs). The inflaton can be a flat direction including the right-handed sneutrino, and the waterfall field is a GUT Higgs. The spectral index is ns = 0.96, which fits very well with recent data from the Planck satellite. There are neither thermal nor non-thermal gravitino problems. Non-thermal leptogenesis can result from the decay of the right-handed sneutrino, which plays (part of) the role of the inflaton.

  6. Standardized languages and notations for graphical modelling of patient care processes: a systematic review.

    PubMed

    Mincarone, Pierpaolo; Leo, Carlo Giacomo; Trujillo-Martín, Maria Del Mar; Manson, Jan; Guarino, Roberto; Ponzini, Giuseppe; Sabina, Saverio

    2018-04-01

The importance of working toward quality improvement in healthcare implies an increasing interest in analysing, understanding and optimizing process logic and sequences of activities embedded in healthcare processes. Their graphical representation promotes faster learning, higher retention and better compliance. The study identifies standardized graphical languages and notations applied to patient care processes and investigates their usefulness in the healthcare setting. Peer-reviewed literature up to 19 May 2016. Information complemented by a questionnaire sent to the authors of selected studies. Systematic review conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Five authors extracted results of selected studies. Ten articles met the inclusion criteria. One notation and one language for healthcare process modelling were identified as applied to patient care processes: Business Process Model and Notation and the Unified Modeling Language™. One of the authors of every selected study completed the questionnaire. Users' comprehensibility and facilitation of inter-professional analysis of processes were recognized, in the completed questionnaires, as major strengths of process modelling in healthcare. Both the notation and the language could increase the clarity of presentation thanks to their visual properties, the capacity to easily manage macro and micro scenarios, and the possibility of clearly and precisely representing the process logic. Both could increase the applicability of guidelines/pathways by representing complex scenarios through charts and algorithms, hence contributing to reducing unjustified practice variations which negatively impact quality of care and patient safety.

  7. Human mobility in a continuum approach.

    PubMed

    Simini, Filippo; Maritan, Amos; Néda, Zoltán

    2013-01-01

Human mobility is investigated using a continuum approach that allows one to calculate the probability of observing a trip to any arbitrary region, as well as the fluxes between any two regions. The description offers a general and unified framework in which previously proposed mobility models, such as the gravity model, the intervening opportunities model, and the recently introduced radiation model, naturally result as special cases. A new form of the radiation model is derived, and its validity is investigated using observational data on commuting trips obtained from the United States census data set, together with mobility fluxes extracted from mobile phone data collected in a western European country. The new modeling paradigm offered by this description suggests that the complex topological features observed in large mobility and transportation networks may be the result of a simple stochastic process taking place on an inhomogeneous landscape.
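
For context, the original radiation model that this framework recovers as a special case (Simini et al. 2012) predicts commuting fluxes from populations alone; the paper's own "new form" of the model is not reproduced here. A sketch of the classic parameter-free flux formula:

```python
def radiation_flux(T_i, m_i, n_j, s_ij):
    """Average flux T_ij from origin i to destination j under the
    parameter-free radiation model (Simini et al. 2012).
    T_i  : total number of trips (commuters) leaving origin i
    m_i  : population of origin i
    n_j  : population of destination j
    s_ij : total population inside the circle of radius r_ij centred on i,
           excluding the populations of i and j themselves"""
    return T_i * (m_i * n_j) / ((m_i + s_ij) * (m_i + n_j + s_ij))
```

For example, with `T_i = 1000` trips, `m_i = 100`, `n_j = 50` and `s_ij = 200`, the model predicts about 48 trips from i to j; note that no fitted parameters enter the formula, only populations.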

  8. Human Mobility in a Continuum Approach

    PubMed Central

    Simini, Filippo; Maritan, Amos; Néda, Zoltán

    2013-01-01

Human mobility is investigated using a continuum approach that allows one to calculate the probability of observing a trip to any arbitrary region, as well as the fluxes between any two regions. The description offers a general and unified framework in which previously proposed mobility models, such as the gravity model, the intervening opportunities model, and the recently introduced radiation model, naturally result as special cases. A new form of the radiation model is derived, and its validity is investigated using observational data on commuting trips obtained from the United States census data set, together with mobility fluxes extracted from mobile phone data collected in a western European country. The new modeling paradigm offered by this description suggests that the complex topological features observed in large mobility and transportation networks may be the result of a simple stochastic process taking place on an inhomogeneous landscape. PMID:23555885

  9. The intrapsychics of gender: a model of self-socialization.

    PubMed

    Tobin, Desiree D; Menon, Meenakshi; Menon, Madhavi; Spatta, Brooke C; Hodges, Ernest V E; Perry, David G

    2010-04-01

    This article outlines a model of the structure and the dynamics of gender cognition in childhood. The model incorporates 3 hypotheses featured in different contemporary theories of childhood gender cognition and unites them under a single theoretical framework. Adapted from Greenwald et al. (2002), the model distinguishes three constructs: gender identity, gender stereotypes, and attribute self-perceptions. The model specifies 3 causal processes among the constructs: Gender identity and stereotypes interactively influence attribute self-perceptions (stereotype emulation hypothesis); gender identity and attribute self-perceptions interactively influence gender stereotypes (stereotype construction hypothesis); and gender stereotypes and attribute self-perceptions interactively influence identity (identity construction hypothesis). The model resolves nagging ambiguities in terminology, organizes diverse hypotheses and empirical findings under a unifying conceptual umbrella, and stimulates many new research directions. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  10. 3D molecular models of whole HIV-1 virions generated with cellPACK

    PubMed Central

    Goodsell, David S.; Autin, Ludovic; Forli, Stefano; Sanner, Michel F.; Olson, Arthur J.

    2014-01-01

    As knowledge of individual biological processes grows, it becomes increasingly useful to frame new findings within their larger biological contexts in order to generate new systems-scale hypotheses. This report highlights two major iterations of a whole virus model of HIV-1, generated with the cellPACK software. cellPACK integrates structural and systems biology data with packing algorithms to assemble comprehensive 3D models of cell-scale structures in molecular detail. This report describes the biological data, modeling parameters and cellPACK methods used to specify and construct editable models for HIV-1. Anticipating that cellPACK interfaces under development will enable researchers from diverse backgrounds to critique and improve the biological models, we discuss how cellPACK can be used as a framework to unify different types of data across all scales of biology. PMID:25253262

  11. An investigation of difficulties experienced by students developing unified modelling language (UML) class and sequence diagrams

    NASA Astrophysics Data System (ADS)

    Sien, Ven Yu

    2011-12-01

Object-oriented analysis and design (OOAD) is not an easy subject to learn. There are many challenges confronting students when studying OOAD. Students have particular difficulty abstracting real-world problems within the context of OOAD. They are unable to effectively build object-oriented (OO) models from the problem domain because they essentially do not know "what" to model. This article investigates the difficulties and misconceptions undergraduate students have with analysing systems using unified modelling language analysis class and sequence diagrams. These models were chosen because they represent important static and dynamic aspects of the software system under development. The results of this study will help students produce effective OO models and help software engineering lecturers design learning materials and approaches for introductory OOAD courses.

  12. A new model for fluid velocity slip on a solid surface.

    PubMed

    Shu, Jian-Jun; Teo, Ji Bin Melvin; Chan, Weng Kong

    2016-10-12

    A general adsorption model is developed to describe the interactions between near-wall fluid molecules and solid surfaces. This model serves as a framework for the theoretical modelling of boundary slip phenomena. Based on this adsorption model, a new general model for the slip velocity of fluids on solid surfaces is introduced. The slip boundary condition at a fluid-solid interface has hitherto been considered separately for gases and liquids. In this paper, we show that the slip velocity in both gases and liquids may originate from dynamical adsorption processes at the interface. A unified analytical model that is valid for both gas-solid and liquid-solid slip boundary conditions is proposed based on surface science theory. Corroboration with experimental data from the literature shows that the proposed model provides an improved prediction over existing analytical models for gases at higher shear rates, and close agreement for liquid-solid interfaces in general.

  13. Free-form geometric modeling by integrating parametric and implicit PDEs.

    PubMed

    Du, Haixia; Qin, Hong

    2007-01-01

    Parametric PDE techniques, which use partial differential equations (PDEs) defined over a 2D or 3D parametric domain to model graphical objects and processes, can unify geometric attributes and functional constraints of the models. PDEs can also model implicit shapes defined by level sets of scalar intensity fields. In this paper, we present an approach that integrates parametric and implicit trivariate PDEs to define geometric solid models containing both geometric information and intensity distribution subject to flexible boundary conditions. The integrated formulation of second-order or fourth-order elliptic PDEs permits designers to manipulate PDE objects of complex geometry and/or arbitrary topology through direct sculpting and free-form modeling. We developed a PDE-based geometric modeling system for shape design and manipulation of PDE objects. The integration of implicit PDEs with parametric geometry offers more general and arbitrary shape blending and free-form modeling for objects with intensity attributes than pure geometric models.
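    To make the elliptic-PDE idea concrete, the sketch below solves Laplace's equation (a second-order elliptic PDE) over a 2D grid by Jacobi iteration, producing a smooth height field from fixed boundary curves. This is a generic illustration, not the authors' integrated parametric/implicit trivariate system:

```python
import numpy as np

def pde_surface(boundary, iters=2000):
    """Solve Laplace's equation on the grid interior by Jacobi iteration,
    keeping the boundary values fixed (Dirichlet conditions). The interior
    relaxes to a smooth, harmonic height field."""
    z = boundary.astype(float)
    interior = np.ones_like(z, dtype=bool)
    interior[0, :] = interior[-1, :] = interior[:, 0] = interior[:, -1] = False
    for _ in range(iters):
        # Average of the four neighbors (np.roll wraps around, but wrapped
        # rows/columns are never written back: they are not interior).
        avg = 0.25 * (np.roll(z, 1, 0) + np.roll(z, -1, 0)
                      + np.roll(z, 1, 1) + np.roll(z, -1, 1))
        z[interior] = avg[interior]
    return z

b = np.zeros((20, 20))
b[0, :] = 1.0                    # one boundary curve raised to height 1
surf = pde_surface(b)
```

    By the maximum principle, every interior height lies strictly between the lowest and highest boundary values, which is what makes PDE surfaces a robust blending tool.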

  14. Quantum Structure in Cognition and the Foundations of Human Reasoning

    NASA Astrophysics Data System (ADS)

    Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas

    2015-12-01

    Traditional cognitive science rests on a foundation of classical logic and probability theory. This foundation has been seriously challenged by several findings in experimental psychology on human decision making. Meanwhile, the formalism of quantum theory has provided an efficient resource for modeling these classically problematic situations. In this paper, we start from our successful quantum-theoretic approach to the modeling of concept combinations to formulate a unifying explanatory hypothesis, in which human reasoning is the superposition of two processes: a conceptual reasoning, whose nature is the emergence of new conceptuality, and a logical reasoning, founded on an algebraic calculus of the logical type. In most cognitive processes, however, the former reasoning prevails over the latter. In this perspective, the observed deviations from classical logical reasoning should not be interpreted as biases but, rather, as natural expressions of emergence in its deepest form.

  15. Unifying the concept of consciousness across the disciplines: A concept-based, cross-cultural approach

    NASA Astrophysics Data System (ADS)

    Jones, Peter N.

    The majority of studies concerning consciousness have examined and modeled the concept of consciousness in terms of particular lines of inquiry, a process that has circumscribed the general applicability of any results from such approaches. The purpose of this dissertation was to study consciousness from a concept-based, cross-cultural approach and to attempt to unify the concept across the cultures examined. The 4 cultures are the academic disciplines of philosophy, physics, psychology, and anthropology. Consciousness was examined in terms of how the concept is framed and where the major limitations in each line of inquiry occur. The rationale for examining consciousness as a concept across the 4 cultures was to determine whether there was any common component in each line's framing that could be used to unify the concept. The study found that experience itself was the primary unifying factor in each field's framing and that experience was treated as a nonreducible property within each line of inquiry. By taking experience itself (but not subjective experience) as a fundamental property, each culture's concept of consciousness becomes tractable. As such, this dissertation argues that experience should be taken as a fundamental property of the concept. The significance of this analysis is that by taking experience as a fundamental property, it becomes possible to unify the concept across the 4 cultures. This unification is presented as a unity thesis, a theory arguing for unification of the concept based on the fundamental property of experience. Following this theoretical examination, this paper discusses several key implications of the unity thesis, including its implications for the current status of altered states of consciousness and for the so-called hard and easy problems associated with the concept (at least within Occidental ontology).
    It is argued that the so-called hard problem does not exist when experience is taken as a fundamental property of ontological reality, and that altered states of consciousness are in fact better understood as access states of consciousness based on the unity thesis. The dissertation concludes with suggestions for further lines of research.

  16. Symplectic multi-particle tracking on GPUs

    NASA Astrophysics Data System (ADS)

    Liu, Zhicong; Qiang, Ji

    2018-05-01

    A symplectic multi-particle tracking model is implemented on Graphics Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) language. The symplectic tracking model can preserve phase-space structure and reduce non-physical effects in long-term simulation, which is important for beam property evaluation in particle accelerators. Though this model is computationally expensive, it is very suitable for parallelization and can be accelerated significantly by using GPUs. In this paper, we optimized the implementation of the symplectic tracking model on both single and multiple GPUs. Using a single GPU processor, the code achieves a factor of 2-10 speedup for a range of problem sizes compared with the time on a single state-of-the-art Central Processing Unit (CPU) node with similar power consumption and semiconductor technology. It also shows good scalability on a multi-GPU cluster at the Oak Ridge Leadership Computing Facility. In an application to beam dynamics simulation, the GPU implementation saves more than a factor of two in total computing time compared with the CPU implementation.
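    The core of such a tracking code is a symplectic map applied independently to each particle, which is what makes the problem embarrassingly parallel: one GPU thread per particle. Below is a minimal single-particle sketch of a second-order symplectic (drift-kick-drift) map, in NumPy rather than CUDA, with a generic linear focusing kick rather than the paper's accelerator lattice:

```python
import numpy as np

def track(q, p, n_turns, k=1.0, dt=0.1):
    """Drift-kick-drift (leapfrog) map: second-order accurate and exactly
    symplectic, so phase-space structure and long-term amplitude are
    preserved. q, p are arrays; every particle is independent, which is
    what maps naturally onto one GPU thread per particle."""
    for _ in range(n_turns):
        q = q + 0.5 * dt * p      # half drift
        p = p - dt * k * q        # kick: linear focusing force
        q = q + 0.5 * dt * p      # half drift
    return q, p

q, p = track(np.array([1.0]), np.array([0.0]), 100_000)
energy = 0.5 * p[0]**2 + 0.5 * q[0]**2   # bounded even after 1e5 turns
```

    A non-symplectic integrator of the same order (e.g. explicit Euler) would let the amplitude drift systematically over this many turns; the symplectic map keeps the energy error bounded at O(dt²).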

  17. High performance hybrid functional Petri net simulations of biological pathway models on CUDA.

    PubMed

    Chalkidis, Georgios; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Hybrid functional Petri nets are a widespread tool for representing and simulating biological models. Due to their potential for providing virtual drug-testing environments, biological simulations have a growing impact on pharmaceutical research. Continuous research advancements in biology and medicine lead to exponentially increasing simulation times, thus raising the demand for performance accelerations by efficient and inexpensive parallel computation solutions. Recent developments in the field of general-purpose computation on graphics processing units (GPGPU) have enabled the scientific community to port a variety of compute-intensive algorithms onto the graphics processing unit (GPU). This work presents the first scheme for mapping biological hybrid functional Petri net models, which can handle both discrete and continuous entities, onto compute unified device architecture (CUDA) enabled GPUs. GPU-accelerated simulations are observed to run up to 18 times faster than sequential implementations. Simulating cell boundary formation by Delta-Notch signaling on a CUDA-enabled GPU results in a speedup of approximately 7x for a model containing 1,600 cells.
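    A hybrid functional Petri net combines continuous transitions, whose firing speed is a function of the current marking, with discrete transitions that fire when enabled. The toy step below shows both entity types in one update; it is illustrative only and unrelated to the paper's CUDA mapping, in which many such independent updates run in parallel GPU threads:

```python
def hfpn_step(m, dt):
    """One step of a toy hybrid functional Petri net. A continuous
    transition moves mass from place A to B at a marking-dependent
    speed; a discrete transition consumes a token from D and one unit
    from B, producing a token in E, whenever it is enabled."""
    m = dict(m)
    speed = 0.5 * m["A"]                 # functional (marking-dependent) rate
    flow = min(speed * dt, m["A"])       # continuous firing over dt
    m["A"] -= flow
    m["B"] += flow
    if m["B"] >= 1.0 and m["D"] >= 1:    # discrete transition enabled?
        m["B"] -= 1.0
        m["D"] -= 1
        m["E"] += 1
    return m

m = {"A": 2.0, "B": 0.0, "D": 2, "E": 0}
for _ in range(100):
    m = hfpn_step(m, 0.1)
```

    Places A and B carry continuous amounts while D and E carry integer tokens, which is the mix of entity types the paper's scheme must schedule on the GPU.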

  18. Designer's unified cost model

    NASA Technical Reports Server (NTRS)

    Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.

    1992-01-01

    A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model, providing the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).

  19. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they remain at their core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously defined domain-specific architecture.

  20. Object-oriented integrated approach for the design of scalable ECG systems.

    PubMed

    Boskovic, Dusanka; Besic, Ingmar; Avdagic, Zikrija

    2009-01-01

    The paper presents the implementation of Object-Oriented (OO) integrated approaches to the design of scalable Electro-Cardio-Graph (ECG) systems. The purpose of this methodology is to preserve real-world structure and relations, with the aim of minimizing information loss during the modeling process, especially for Real-Time (RT) systems. We report on a case study of a design that integrates OO and RT methods with the Unified Modeling Language (UML) standard notation. OO methods identify objects in the real-world domain and use them as fundamental building blocks for the software system. The experience gained, based on the strongly defined semantics of the object model, is discussed, and related problems are analyzed.

  1. Toward a unifying constitutive relation for sediment transport across environments

    NASA Astrophysics Data System (ADS)

    Houssais, Morgane; Jerolmack, Douglas J.

    2017-01-01

    Landscape evolution models typically parse the environment into different process domains, each with its own sediment transport law: e.g., soil creep, landslides and debris flows, and river bed-load and suspended-sediment transport. Sediment transport in all environments, however, contains many of the same physical ingredients, albeit in varying proportions: grain entrainment due to a shear force that combines fluid flow, particle-particle friction, and gravity. We present a new take on the perspective originally advanced by Bagnold, which views the long profile of a hillslope-river-shelf system as a continuous gradient of decreasing granular friction dominance and increasing fluid drag dominance on transport capacity. Recent advances in understanding the behavior and regime transitions of dense granular systems suggest that the entire span of granular-to-fluid regimes may be accommodated by a single-phase rheology. This model predicts a material-flow effective friction (or viscosity) that changes with the shear rate and confining pressure. We present experimental results confirming that fluid-driven sediment transport follows this same rheology, for bed and suspended load. Surprisingly, below the apparent threshold of motion we observe that sediment particles creep, in a manner characteristic of glassy systems. We argue that this mechanism is relevant for both hillslopes and rivers. We discuss the possibilities of unifying sediment transport across environments and disciplines, and the potential consequences for modeling landscape evolution.
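    The single-phase rheology invoked here is commonly written as the μ(I) granular rheology, in which the effective friction coefficient depends on the inertial number I, a dimensionless shear rate at a given confining pressure. A sketch with typical illustrative coefficients (not values fitted in this study):

```python
import math

def mu_of_I(I, mu_s=0.38, mu_2=0.64, I0=0.3):
    """mu(I) rheology: effective friction rises from a static value mu_s
    toward a limiting value mu_2 as the inertial number I grows.
    Coefficients here are typical literature values, for illustration."""
    return mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)

def inertial_number(shear_rate, grain_d, pressure, density):
    """I = gamma_dot * d / sqrt(P / rho): ratio of the grain-rearrangement
    time to the macroscopic shear time."""
    return shear_rate * grain_d * math.sqrt(density / pressure)

# Sand-like grains (d = 1 mm, rho = 2500 kg/m^3) under 100 Pa confinement:
I = inertial_number(shear_rate=1.0, grain_d=0.001, pressure=100.0, density=2500.0)
mu = mu_of_I(I)
```

    Low I (slow shear or high confinement) recovers quasi-static, friction-dominated behavior; high I approaches the fluid-drag-dominated limit, which is the continuum the abstract describes along a hillslope-river-shelf profile.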

  2. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.

    PubMed

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology.
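    The waveform-relaxation idea can be illustrated on two linearly coupled "cells": each sweep re-integrates every cell over the whole interval using the other cell's waveform from the previous sweep, so only waveforms, not instantaneous states, need to be exchanged. A toy sketch (explicit Euler, Jacobi iteration), not NEST's implementation:

```python
import numpy as np

def waveform_relaxation(c=0.5, T=1.0, n=200, sweeps=8):
    """Jacobi waveform relaxation for two gap-junction-coupled cells,
    dx/dt = -x + c*y and dy/dt = -y + c*x, with x(0)=1, y(0)=0.
    Each sweep integrates each cell alone, coupled only to the other
    cell's waveform from the previous sweep."""
    dt = T / n
    x = np.ones(n + 1)                  # initial guess: constant waveforms
    y = np.zeros(n + 1)
    for _ in range(sweeps):
        x_new = np.empty(n + 1)
        y_new = np.empty(n + 1)
        x_new[0], y_new[0] = 1.0, 0.0
        for k in range(n):              # explicit Euler within each cell
            x_new[k + 1] = x_new[k] + dt * (-x_new[k] + c * y[k])
            y_new[k + 1] = y_new[k] + dt * (-y_new[k] + c * x[k])
        x, y = x_new, y_new
    return x, y

x, y = waveform_relaxation()
# Exact endpoints: x(1) = (e**-0.5 + e**-1.5)/2, y(1) = (e**-0.5 - e**-1.5)/2
```

    On a finite interval the sweeps converge superlinearly, which is why a handful of iterations per communication interval suffices and the delayed-communication strategy can be retained.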

  4. Differential processing: towards a unified model of direction and speed perception.

    PubMed

    Farrell-Whelan, Max; Brooks, Kevin R

    2013-11-01

    In two experiments, we demonstrate a misperception of the velocity of a random-dot stimulus moving in the presence of a static line oriented obliquely to the direction of dot motion. As shown in previous studies, the perceived direction of the dots is shifted away from the orientation of the static line, with the size of the shift varying as a function of line orientation relative to dot direction (the statically induced direction illusion, or 'SDI'). In addition, we report a novel effect: perceived speed also varies as a function of relative line orientation, decreasing systematically as the angle is reduced from 90° to 0°. We propose that these illusions both stem from the differential processing of object-relative and non-object-relative component velocities, with the latter being perceptually underestimated with respect to the former by a constant ratio. Although previous proposals regarding the SDI have not allowed quantitative accounts, we present a unified formal model of perceived velocity (both direction and speed) with the magnitude of this ratio as the only free parameter. The model was successful in accounting for the angular repulsion of motion direction across line orientations, and in predicting the systematic decrease in perceived speed as the line's angle was reduced. Although fitting for direction and speed produced different best-fit values of the ratio of underestimation of non-object-relative motion compared to object-relative motion (with the ratio for speed being larger than that for direction), this discrepancy may be due to differences in the psychophysical procedures for measuring direction and speed. Copyright © 2013 Elsevier B.V. All rights reserved.
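    The differential-processing account can be sketched by decomposing the dot velocity into a component parallel to the static line (non-object-relative, perceptually scaled by a ratio r < 1) and a perpendicular, object-relative component, then recombining. The value r = 0.7 below is an arbitrary illustration, not the paper's fitted parameter:

```python
import math

def perceived_velocity(speed, angle_deg, r=0.7):
    """Decompose dot velocity about the static line: the parallel
    (non-object-relative) component is perceptually scaled by r < 1,
    the perpendicular (object-relative) component is veridical; then
    recombine into perceived speed and direction (degrees from the line)."""
    a = math.radians(angle_deg)         # angle between dot motion and line
    v_par = r * speed * math.cos(a)     # underestimated component
    v_perp = speed * math.sin(a)        # veridical component
    return math.hypot(v_par, v_perp), math.degrees(math.atan2(v_perp, v_par))

s90, a90 = perceived_velocity(1.0, 90.0)   # motion perpendicular to the line
s30, a30 = perceived_velocity(1.0, 30.0)   # oblique motion
```

    The single ratio reproduces both effects at once: at 90° nothing changes, while at oblique angles the perceived direction is repelled away from the line and the perceived speed drops, approaching r times the true speed as the angle goes to 0°.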

  5. Improving Energy Efficiency in CNC Machining

    NASA Astrophysics Data System (ADS)

    Pavanaskar, Sushrut S.

    We present our work on analyzing and improving the energy efficiency of the multi-axis CNC milling process. Due to the differences in energy consumption behavior, we treat 3- and 5-axis CNC machines separately in our work. For 3-axis CNC machines, we first propose an energy model that estimates the energy requirement for machining a component on a specified 3-axis CNC milling machine. Our model makes machine-specific predictions of energy requirements while also considering the geometric aspects of the machining toolpath. Our model, and the associated software tool, facilitates direct comparison of various alternative toolpath strategies based on their energy-consumption performance. Further, we identify key factors in toolpath planning that affect energy consumption in CNC machining. We then use this knowledge to propose and demonstrate a novel toolpath planning strategy, inspired by research on digital micrography, a form of computational art, that may be used to generate new toolpaths that are inherently energy-efficient. For 5-axis CNC machines, the process planning problem consists of several sub-problems that researchers have traditionally solved separately to obtain an approximate solution. After illustrating the need to solve all sub-problems simultaneously for a truly optimal solution, we propose a unified formulation based on configuration space theory. We apply our formulation to solve a problem variant that retains key characteristics of the full problem but has lower dimensionality, allowing visualization in 2D. Given the complexity of the full 5-axis toolpath planning problem, our unified formulation represents an important step towards obtaining a truly optimal solution. With this work on the two types of CNC machines, we demonstrate that without changing the current infrastructure or business practices, machine-specific, geometry-based, customized toolpath planning can save energy in CNC machining.
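    The kind of geometry-dependent trade-off such a model captures can be caricatured in a few lines: energy grows with toolpath time (length divided by feed rate) plus a penalty for sharp direction changes that force the axes to decelerate. All constants below are invented for illustration; the actual model is machine-specific and fitted to measurements:

```python
import math

def toolpath_energy(points, feed=1000.0, p_cut=1200.0, k_turn=40.0):
    """Energy ~ power x machining time (total length / feed rate) plus a
    per-corner penalty proportional to the turn angle, since sharp turns
    force the axes to decelerate. Constants are invented for illustration."""
    energy = 0.0
    for a, b in zip(points, points[1:]):
        energy += p_cut * math.dist(a, b) / feed          # time-dependent term
    for a, b, c in zip(points, points[1:], points[2:]):
        v1 = (b[0] - a[0], b[1] - a[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        if n1 and n2:
            cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
            energy += k_turn * math.acos(max(-1.0, min(1.0, cosang)))
    return energy

zigzag = [(0, 0), (10, 0), (10, 1), (0, 1), (0, 2), (10, 2)]   # many 90° turns
smooth = [(0, 0), (10, 0), (20, 0), (30, 0), (40, 0), (50, 0)]  # straight path
```

    Under a model of this shape, two toolpaths covering comparable area can differ substantially in energy purely through their corner content, which is the geometric lever the proposed planning strategy exploits.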

  6. Qa-1/HLA-E-restricted regulatory CD8+ T cells and self-nonself discrimination: an essay on peripheral T-cell regulation.

    PubMed

    Jiang, Hong; Chess, Leonard

    2008-11-01

    By discriminating self from nonself and controlling the magnitude and class of immune responses, the immune system mounts effective immunity against virtually any foreign antigen but avoids harmful immune responses to self. These are two equally important, related but distinct processes, which function in concert to ensure optimal function of the immune system. Immunologically relevant clinical problems often occur because of failure of either process, especially the former. Currently, there is no unified conceptual framework to characterize the precise relationship between thymic negative selection and peripheral immune regulation, which is the basis for understanding self-nonself discrimination versus control of the magnitude and class of immune responses. In this article, we explore a novel hypothesis of how the immune system discriminates self from nonself in the periphery during adaptive immunity. This hypothesis permits rational analysis of various seemingly unrelated biomedical problems inherent in immunologic disorders that cannot be uniformly interpreted by any currently existing paradigm. The proposed hypothesis is based on a unified conceptual framework of the "avidity model of peripheral T-cell regulation" that we originally proposed and tested, in both basic and clinical immunology, to understand how the immune system achieves self-nonself discrimination in the periphery.

  7. A Graph-Embedding Approach to Hierarchical Visual Word Mergence.

    PubMed

    Wang, Lei; Liu, Lingqiao; Zhou, Luping

    2017-02-01

    Appropriately merging visual words is an effective dimension-reduction method for the bag-of-visual-words model in image classification. The approach of hierarchically merging visual words has been extensively employed, because it gives a fully determined merging hierarchy. Existing supervised hierarchical merging methods take different approaches and realize the merging process with various formulations. In this paper, we propose a unified hierarchical merging approach built upon the graph-embedding framework. Our approach is able to merge visual words for any scenario where a preferred structure and an undesired structure are defined, and, therefore, can effectively attend to all kinds of requirements for the word-merging process. In terms of computational efficiency, we show that our algorithm can seamlessly integrate a fast search strategy developed in our previous work and, thus, well maintain the state-of-the-art merging speed. To the best of our knowledge, the proposed approach is the first one that addresses hierarchical visual word mergence in such a flexible and unified manner. As demonstrated, it can maintain excellent image classification performance even after a significant dimension reduction, and outperforms all existing comparable visual word-merging methods. In a broad sense, our work provides an open platform for applying, evaluating, and developing new criteria for hierarchical word-merging tasks.
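    Hierarchical word merging in general proceeds by greedily merging the pair of words that best preserves some supervised criterion, recording the hierarchy as it goes. The sketch below uses a simple distributional criterion (cosine similarity of class-conditional counts) as a stand-in for the paper's graph-embedding objective:

```python
import numpy as np

def merge_words(counts, n_final):
    """Greedy hierarchical visual-word merging: repeatedly merge the two
    words whose class-conditional count vectors are most similar,
    recording each merge. The cosine criterion here is a generic
    stand-in for a supervised merging objective."""
    words = [np.asarray(c, dtype=float) for c in counts]
    groups = [[i] for i in range(len(words))]
    hierarchy = []
    while len(words) > n_final:
        best, pair = -1.0, None
        for i in range(len(words)):
            for j in range(i + 1, len(words)):
                sim = words[i] @ words[j] / (
                    np.linalg.norm(words[i]) * np.linalg.norm(words[j]))
                if sim > best:
                    best, pair = sim, (i, j)
        i, j = pair
        hierarchy.append((groups[i][:], groups[j][:]))   # record the merge
        words[i] = words[i] + words[j]                   # pooled word counts
        groups[i] = groups[i] + groups[j]
        del words[j], groups[j]
    return groups, hierarchy

# Four words, two classes: words 0,1 fire mostly for class 0; words 2,3 for class 1.
counts = [[9, 1], [8, 2], [1, 9], [2, 8]]
groups, hierarchy = merge_words(counts, 2)
```

    Because the full hierarchy is recorded, any intermediate vocabulary size can be read off after a single pass, which is the practical appeal of hierarchical (rather than flat) merging.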

  8. Estimating direct fatality impacts at wind farms: how far we’ve come, where we have yet to go

    USGS Publications Warehouse

    Huso, Manuela M.; Schwartz, Susan Savitt

    2013-01-01

    Measuring the potential impacts of wind farms on wildlife can be difficult and may require development of new statistical tools and models to accurately reflect the measurement process. This presentation reviews the recent history of approaches to estimating wildlife fatality under the unique conditions encountered at wind farms, their unifying themes and their potential shortcomings. Avenues of future research are suggested to continue to address the needs of resource managers and industry in understanding direct impacts of wind turbine-caused wildlife fatality.
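    Most fatality estimators share a common skeleton: divide the observed carcass count by an overall detection probability composed of searcher efficiency, carcass persistence, and searched-area coverage. A deliberately simplified sketch (published estimators also model search-interval timing, which is omitted here):

```python
def estimated_fatalities(carcasses_found, searcher_efficiency,
                         persistence_prob, area_fraction):
    """Detection-corrected estimate: observed carcasses divided by the
    probability that a fatality persists until a search, lies in the
    searched area, and is found by the searcher. A simplification of
    published estimators; all inputs below are made-up example values."""
    detection = searcher_efficiency * persistence_prob * area_fraction
    return carcasses_found / detection

est = estimated_fatalities(carcasses_found=12, searcher_efficiency=0.6,
                           persistence_prob=0.8, area_fraction=0.9)
```

    The sketch makes the core difficulty visible: when the combined detection probability is small, the estimate becomes highly sensitive to errors in any one of its factors, which motivates the statistical refinements the presentation reviews.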

  9. Indexing Anatomical Phrases in Neuro-Radiology Reports to the UMLS 2005AA

    PubMed Central

    Bashyam, Vijayaraghavan; Taira, Ricky K.

    2005-01-01

    This work describes a methodology to index anatomical phrases to the 2005AA release of the Unified Medical Language System (UMLS). A phrase chunking tool based on Natural Language Processing (NLP) was developed to identify semantically coherent phrases within medical reports. Using this phrase chunker, a set of 2,551 unique anatomical phrases was extracted from brain radiology reports. These phrases were mapped to the 2005AA release of the UMLS using a vector space model. Precision for the task of indexing unique phrases was 0.87. PMID:16778995
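    Vector-space indexing of this kind can be sketched as a cosine similarity between term-count vectors of the phrase and of candidate concept names. The example below is a minimal illustration with hypothetical concept strings; the actual UMLS mapping also normalizes terms and applies term weighting:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words count vectors."""
    num = sum(a[t] * b[t] for t in a)
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def index_phrase(phrase, concepts):
    """Map a phrase to the best-matching concept name by cosine
    similarity in term-count space. Minimal sketch of vector-space
    indexing; no normalization or weighting."""
    pv = Counter(phrase.lower().split())
    scored = [(cosine(pv, Counter(c.lower().split())), c) for c in concepts]
    return max(scored)[1]

concepts = ["frontal lobe", "temporal lobe", "lateral ventricle"]
best = index_phrase("left frontal lobe", concepts)
```

    Partial matches still score: "left frontal lobe" shares two of three terms with "frontal lobe", so it outranks "temporal lobe", which shares only one.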

  10. Myxobacteria Fruiting Body Formation

    NASA Astrophysics Data System (ADS)

    Jiang, Yi

    2006-03-01

    Myxobacteria are social bacteria that swarm and glide on surfaces, and feed cooperatively. When starved, tens of thousands of cells change their movement pattern from outward spreading to inward concentration; they form aggregates that become fruiting bodies, inside which cells differentiate into nonmotile, environmentally resistant spores. Traditionally, cell aggregation has been considered to imply chemotaxis, a long-range cell interaction mediated by diffusing chemicals. However, myxobacteria aggregation is the consequence of direct cell-contact interactions. I will review our recent efforts in modeling the fruiting body formation of Myxobacteria, using lattice gas cellular automata models that are based on local cell-cell contact signaling. These models have reproduced the individual phases in Myxobacteria development such as the rippling, streaming, early aggregation and the final sporulation; the models can be unified to simulate the whole developmental process of Myxobacteria.

  11. Technical Challenges and Opportunities of Centralizing Space Science Mission Operations (SSMO) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ido, Haisam; Burns, Rich

    2015-01-01

    The NASA Goddard Space Science Mission Operations project (SSMO) is performing a technical cost-benefit analysis for centralizing and consolidating operations of a diverse set of missions into a unified and integrated technical infrastructure. The presentation will focus on the notion of normalizing spacecraft operations processes, workflows, and tools. It will also show the processes of creating a standardized open architecture, creating common security models and implementations, interfaces, services, automations, notifications, alerts, logging, publish/subscribe, and middleware capabilities. The presentation will also discuss how to leverage traditional capabilities, along with virtualization, cloud computing services, control groups and containers, and possibly Big Data concepts.

  12. Eye growth and myopia development: Unifying theory and Matlab model.

    PubMed

    Hung, George K; Mahadas, Kausalendra; Mohammad, Faisal

    2016-03-01

    The aim of this article is to present an updated unifying theory of the mechanisms underlying eye growth and myopia development. A series of model simulation programs were developed to illustrate the mechanism of eye growth regulation and myopia development. Two fundamental processes are presumed to govern the relationship between physiological optics and eye growth: genetically pre-programmed signaling and blur feedback. Cornea/lens is considered to have only a genetically pre-programmed component, whereas eye growth is considered to have both a genetically pre-programmed and a blur feedback component. Moreover, based on the Incremental Retinal-Defocus Theory (IRDT), the rate of change of blur size provides the direction for blur-driven regulation. The various factors affecting eye growth are shown in 5 simulations: (1 - unregulated eye growth): blur feedback is rendered ineffective, as in the case of form deprivation, so there is only genetically pre-programmed eye growth, generally resulting in myopia; (2 - regulated eye growth): blur feedback regulation demonstrates the emmetropization process, with abnormally excessive or reduced eye growth leading to myopia and hyperopia, respectively; (3 - repeated near-far viewing): simulation of large-to-small change in blur size as seen in the accommodative stimulus/response function, and via IRDT as well as nearwork-induced transient myopia (NITM), leading to the development of myopia; (4 - neurochemical bulk flow and diffusion): release of dopamine from the inner plexiform layer of the retina, and the subsequent diffusion and relay of neurochemical cascade show that a decrease in dopamine results in a reduction of proteoglycan synthesis rate, which leads to myopia; (5 - Simulink model): model of genetically pre-programmed signaling and blur feedback components that allows for different input functions to simulate experimental manipulations that result in hyperopia, emmetropia, and myopia. 
These model simulation programs (available upon request) can provide a useful tutorial for the general scientist and serve as a quantitative tool for researchers in eye growth and myopia. Copyright © 2016 Elsevier Ltd. All rights reserved.
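    The interplay of the two presumed growth components can be caricatured in a short feedback loop: axial length grows at a genetically pre-programmed rate plus a blur-feedback term, and removing the feedback (as in form deprivation) leaves growth unregulated. All parameter values below are arbitrary illustrative choices, not the published model's:

```python
def grow_eye(steps=2000, length=22.0, target=24.0,
             genetic_rate=0.002, gain=0.05, feedback=True):
    """Axial length grows by a genetically pre-programmed increment plus,
    when blur feedback is active, a term proportional to defocus. With
    feedback the eye settles near the focal length (emmetropization);
    without it (form deprivation), growth continues unregulated."""
    for _ in range(steps):
        defocus = target - length                  # hyperopic blur while short
        growth = genetic_rate + (gain * defocus if feedback else 0.0)
        length += max(0.0, growth)                 # the eye cannot shrink
    return length

regulated = grow_eye(feedback=True)    # settles just past the focal length
deprived = grow_eye(feedback=False)    # grows past it: axial myopia
```

    This is the qualitative behavior of simulations (1) and (2) in the abstract: the regulated eye converges, while the deprivation case overshoots the focal length and ends myopic.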

  13. Data-Flow Based Model Analysis

    NASA Technical Reports Server (NTRS)

    Saad, Christian; Bauer, Bernhard

    2010-01-01

    The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases, and has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information is still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph allows one to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
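    The flow-based propagation at the heart of the approach amounts to iterating a transfer of facts along model-graph edges until a fixed point is reached. A minimal sketch (the actual framework attaches user-defined attributes and confluence functions to meta-model elements, which this toy version omits):

```python
def propagate(edges, seeds):
    """Minimal data-flow analysis on a model graph: each node's fact set
    is the union of its seed facts and its predecessors' facts, iterated
    to a fixed point. Cycles in the graph are handled naturally because
    set union is monotone, so the iteration always terminates."""
    nodes = {n for e in edges for n in e} | set(seeds)
    facts = {n: set(seeds.get(n, ())) for n in nodes}
    changed = True
    while changed:
        changed = False
        for src, dst in edges:
            new = facts[dst] | facts[src]
            if new != facts[dst]:
                facts[dst] = new
                changed = True
    return facts

edges = [("A", "B"), ("B", "C"), ("C", "B")]   # model graph with a cycle
facts = propagate(edges, {"A": {"init"}})
```

    A semantic constraint can then be phrased as a check over the converged fact sets, e.g. "every element of kind C must be reachable from an initialized element".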

  14. Generalized interferometry - I: theory for interstation correlations

    NASA Astrophysics Data System (ADS)

    Fichtner, Andreas; Stehly, Laurent; Ermert, Laura; Boehm, Christian

    2017-02-01

    We develop a general theory for interferometry by correlation that (i) properly accounts for heterogeneously distributed sources of continuous or transient nature, (ii) fully incorporates any type of linear and nonlinear processing, such as one-bit normalization, spectral whitening and phase-weighted stacking, (iii) operates for any type of medium, including 3-D elastic, heterogeneous and attenuating media, (iv) enables the exploitation of complete correlation waveforms, including seemingly unphysical arrivals, and (v) unifies the earthquake-based two-station method and ambient noise correlations. Our central theme is not to equate interferometry with Green function retrieval, and to extract information directly from processed interstation correlations, regardless of their relation to the Green function. We demonstrate that processing transforms the actual wavefield sources and actual wave propagation physics into effective sources and effective wave propagation. This transformation is uniquely determined by the processing applied to the observed data, and can be easily computed. The effective forward model, that links effective sources and propagation to synthetic interstation correlations, may not be perfect. A forward modelling error, induced by processing, describes the extent to which processed correlations can actually be interpreted as proper correlations, that is, as resulting from some effective source and some effective wave propagation. The magnitude of the forward modelling error is controlled by the processing scheme and the temporal variability of the sources. Applying adjoint techniques to the effective forward model, we derive finite-frequency Fréchet kernels for the sources of the wavefield and Earth structure, that should be inverted jointly. The structure kernels depend on the sources of the wavefield and the processing scheme applied to the raw data. 
Therefore, both must be taken into account correctly in order to make accurate inferences on Earth structure. Not making any restrictive assumptions on the nature of the wavefield sources, our theory can be applied to earthquake and ambient noise data, either separately or combined. This allows us (i) to locate earthquakes using interstation correlations and without knowledge of the origin time, (ii) to unify the earthquake-based two-station method and noise correlations without the need to exclude either of the two data types, and (iii) to eliminate the requirement to remove earthquake signals from noise recordings prior to the computation of correlation functions. In addition to the basic theory for acoustic wavefields, we present numerical examples for 2-D media, an extension to the most general viscoelastic case, and a method for the design of optimal processing schemes that eliminate the forward modelling error completely. This work is intended to provide a comprehensive theoretical foundation of full-waveform interferometry by correlation, and to suggest improvements to current passive monitoring methods.
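The nonlinear processing steps named in this abstract (one-bit normalization, spectral whitening) are easy to make concrete. The sketch below, a minimal NumPy illustration with function names of our own choosing (not the authors' implementation), shows how such operations transform two recordings before an FFT-based interstation correlation is formed:

```python
import numpy as np

def one_bit(trace):
    """One-bit normalization: keep only the sign of each sample."""
    return np.sign(trace)

def spectral_whiten(trace, eps=1e-10):
    """Spectral whitening: flatten the amplitude spectrum, keep the phase."""
    spec = np.fft.rfft(trace)
    return np.fft.irfft(spec / (np.abs(spec) + eps), n=len(trace))

def correlate(a, b):
    """Linear cross-correlation of two recordings via zero-padded FFTs.
    Lag 0 is at index 0; positive lags follow, negative lags wrap to the end."""
    n = len(a) + len(b) - 1
    return np.fft.irfft(np.fft.rfft(a, n) * np.conj(np.fft.rfft(b, n)), n)

# Toy "recordings": the same synthetic noise trace correlated with itself,
# once raw and once after one-bit normalization.
rng = np.random.default_rng(0)
noise = rng.normal(size=1024)
cc_raw = correlate(noise, noise)
cc_proc = correlate(one_bit(noise), one_bit(noise))
whitened = spectral_whiten(noise)
```

The point made by the theory is visible even here: `cc_proc` is the correlation of an *effective* wavefield (the sign sequence), not of the raw one, so interpreting it requires the processing-aware forward model rather than a Green-function argument.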

  15. Beyond the Unified Model

    NASA Astrophysics Data System (ADS)

    Frauendorf, S.

    2018-04-01

The key elements of the Unified Model are reviewed. The microscopic derivation of the Bohr Hamiltonian by means of adiabatic time-dependent mean field theory is presented. By checking against experimental data, the limitations of the Unified Model are delineated. The description of the strong coupling between the rotational and intrinsic degrees of freedom in the framework of the rotating mean field is presented from a conceptual point of view. The classification of rotational bands as configurations of rotating quasiparticles is introduced. The occurrence of uniform rotation about an axis that differs from the principal axes of the nuclear density distribution is discussed. The physics behind this tilted-axis rotation, unknown in molecular physics, is explained on a basic level. The new symmetries of the rotating mean field that arise from the various orientations of the angular momentum vector with respect to the triaxial nuclear density distribution, and their manifestation in the level sequence of rotational bands, are discussed. Resulting phenomena, such as transverse wobbling, rotational chirality, magnetic rotation and band termination, are discussed. Using the concept of spontaneous symmetry breaking, the microscopic underpinning of the rotational degrees of freedom is refined.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childs, Andrew M.; Center for Theoretical Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139; Leung, Debbie W.

    We present unified, systematic derivations of schemes in the two known measurement-based models of quantum computation. The first model (introduced by Raussendorf and Briegel [Phys. Rev. Lett. 86, 5188 (2001)]) uses a fixed entangled state, adaptive measurements on single qubits, and feedforward of the measurement results. The second model (proposed by Nielsen [Phys. Lett. A 308, 96 (2003)] and further simplified by Leung [Int. J. Quant. Inf. 2, 33 (2004)]) uses adaptive two-qubit measurements that can be applied to arbitrary pairs of qubits, and feedforward of the measurement results. The underlying principle of our derivations is a variant of teleportation introduced by Zhou, Leung, and Chuang [Phys. Rev. A 62, 052316 (2000)]. Our derivations unify these two measurement-based models of quantum computation and provide significantly simpler schemes.
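The teleportation variant of Zhou, Leung, and Chuang cited here includes one-bit teleportation: prepare an ancilla in |+⟩, apply a controlled-Z, and measure the input qubit in the X basis, which moves H|ψ⟩ onto the ancilla up to a Pauli correction. A small NumPy simulation (our own state-vector conventions and function names, not code from the paper) makes this concrete:

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard
X = np.array([[0.0, 1.0], [1.0, 0.0]])                  # Pauli X
CZ = np.diag([1.0, 1.0, 1.0, -1.0])                     # controlled-Z

def one_bit_teleport(psi, m):
    """One-bit teleportation of |psi> (qubit 1) onto an ancilla |+> (qubit 2).
    m is the X-basis measurement outcome on qubit 1; returns the corrected
    state of qubit 2, which equals H|psi> for either outcome."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    state = CZ @ np.kron(psi, plus)
    # Project qubit 1 onto the X-basis outcome: |+> for m=0, |-> for m=1.
    xm = np.array([1.0, 1.0 if m == 0 else -1.0]) / np.sqrt(2)
    q2 = xm @ state.reshape(2, 2)        # contract qubit 1, keep qubit 2
    q2 = q2 / np.linalg.norm(q2)
    return np.linalg.matrix_power(X, m) @ q2   # Pauli correction X^m

psi = np.array([0.6, 0.8])
target = H @ psi
out0 = one_bit_teleport(psi, 0)
out1 = one_bit_teleport(psi, 1)
```

Both measurement outcomes yield the same corrected state, which is exactly the feedforward structure that the measurement-based models build on.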

  17. Toward a unified approach to dose-response modeling in ecotoxicology.

    PubMed

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
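The commonly used models named in this abstract can be written in a shared parameterization, which is the essence of such a unified framework. The following sketch (our own hedged illustration, using the four-parameter forms familiar from ecotoxicology, not the paper's code) expresses the log-logistic and a Weibull model with the same parameter roles:

```python
import numpy as np

def log_logistic(x, b, c, d, e):
    """Four-parameter log-logistic dose-response curve.
    c = lower limit, d = upper limit, e = ED50, b = slope parameter."""
    return c + (d - c) / (1.0 + np.exp(b * (np.log(x) - np.log(e))))

def weibull1(x, b, c, d, e):
    """Four-parameter Weibull (type 1) curve under the same parameter roles."""
    return c + (d - c) * np.exp(-np.exp(b * (np.log(x) - np.log(e))))

# At the dose x = e, the log-logistic response is exactly halfway
# between the limits c and d -- the defining property of the ED50.
half = log_logistic(2.0, 1.0, 0.0, 1.0, 2.0)
```

Writing both models over the same `(b, c, d, e)` vocabulary is what lets findings be compared across studies despite the naming discrepancies the review points out.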

  18. Reduction of parameters in Finite Unified Theories and the MSSM

    NASA Astrophysics Data System (ADS)

    Heinemeyer, Sven; Mondragón, Myriam; Tracas, Nicholas; Zoupanos, George

    2018-02-01

    The method of reduction of couplings developed by W. Zimmermann, combined with supersymmetry, can lead to realistic quantum field theories in which the gauge and Yukawa sectors are related. It is the basis for finding all-loop Finite Unified Theories, in which the β-function vanishes to all orders in perturbation theory. It can also be applied to the Minimal Supersymmetric Standard Model, leading to a drastic reduction in the number of parameters. Both Finite Unified Theories and the reduced MSSM lead to successful predictions for the masses of the third generation of quarks and the Higgs boson, and also predict a heavy supersymmetric spectrum, consistent with the non-observation of supersymmetry so far.

  19. System monitoring and diagnosis with qualitative models

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1991-01-01

    A substantial foundation of tools for model-based reasoning with incomplete knowledge was developed: QSIM (a qualitative simulation program) and its extensions for qualitative simulation; Q2, Q3 and their successors for quantitative reasoning on a qualitative framework; and the CC (component-connection) and QPC (Qualitative Process Theory) model compilers for building QSIM QDE (qualitative differential equation) models starting from different ontological assumptions. Other model-compilers for QDE's, e.g., using bond graphs or compartmental models, have been developed elsewhere. These model-building tools will support automatic construction of qualitative models from physical specifications, and further research into selection of appropriate modeling viewpoints. For monitoring and diagnosis, plausible hypotheses are unified against observations to strengthen or refute the predicted behaviors. In MIMIC (Model Integration via Mesh Interpolation Coefficients), multiple hypothesized models of the system are tracked in parallel in order to reduce the 'missing model' problem. Each model begins as a qualitative model, and is unified with a priori quantitative knowledge and with the stream of incoming observational data. When the model/data unification yields a contradiction, the model is refuted. When there is no contradiction, the predictions of the model are progressively strengthened, for use in procedure planning and differential diagnosis. Only at a qualitative level of description can a finite set of models guarantee the complete coverage necessary for this performance. The results of this research are presented in several publications. Abstracts of these published papers are presented along with abstracts of papers representing work that was synergistic with the NASA grant but funded otherwise. 
These 28 papers include but are not limited to: 'Combined qualitative and numerical simulation with Q3'; 'Comparative analysis and qualitative integral representations'; 'Model-based monitoring of dynamic systems'; 'Numerical behavior envelopes for qualitative models'; 'Higher-order derivative constraints in qualitative simulation'; and 'Non-intersection of trajectories in qualitative phase space: a global constraint for qualitative simulation.'
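The track-and-refute idea behind MIMIC can be illustrated with a toy numeric stand-in (our own sketch; the real system compares *qualitative* predictions against data, and every name below is hypothetical): several candidate models are followed in parallel, and any model whose prediction contradicts an observation is dropped.

```python
# Toy illustration of tracking multiple hypothesized models in parallel:
# each model predicts a value at time t; a model is refuted the first time
# its prediction contradicts an observation beyond a tolerance.
def track(models, observations, tolerance=0.1):
    """models: dict name -> predict(t).  Returns the surviving models."""
    alive = dict(models)
    for t, obs in observations:
        for name, predict in list(alive.items()):
            if abs(predict(t) - obs) > tolerance:
                del alive[name]   # contradiction: refute this hypothesis
    return alive

models = {
    "linear": lambda t: 2.0 * t,
    "quadratic": lambda t: t * t,
}
obs = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
surviving = track(models, obs)
```

Keeping every not-yet-refuted model alive, rather than committing to a single best fit, is exactly how the parallel tracking reduces the "missing model" problem described above.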

  20. Model Based Document and Report Generation for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young

    2013-01-01

    As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration are enabled. We will also describe the basic architecture for the enterprise applications that support this approach.
