A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies
NASA Astrophysics Data System (ADS)
Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.
2018-06-01
We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition); the framework also provides a consistent means of including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw on an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
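A minimal mathematical sketch of the turbid-medium analogy mentioned above, under our own assumption of a Beer-Lambert-style attenuation form (the paper's actual formulation may differ): with leaf area density a(x), a projected-area function G and a deposition efficiency E along the particle path s, the encounter and deposition probabilities would read

```latex
% Hypothetical Beer-Lambert-style sketch (not taken from the paper):
% a(x): leaf area density, G: projected-area fraction, E: deposition efficiency
P_{\mathrm{enc}}(s) = 1 - \exp\!\Big(-\!\int_{0}^{s} G\, a\big(\mathbf{x}(s')\big)\,\mathrm{d}s'\Big),
\qquad
P_{\mathrm{dep}}(s) = 1 - \exp\!\Big(-\!\int_{0}^{s} E\, G\, a\big(\mathbf{x}(s')\big)\,\mathrm{d}s'\Big).
```

In this form, imperfect deposition (E < 1) only rescales the attenuation coefficient, which is the sense in which any deposition-efficiency sub-model can be slotted into the encounter probability.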
Rethinking modeling framework design: object modeling system 3.0
USDA-ARS's Scientific Manuscript database
The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...
Toward a consistent modeling framework to assess multi-sectoral climate impacts.
Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin
2018-02-13
Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis-which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios-we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.
We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...
High Spatial Resolution Multi-Organ Finite Element Modeling of Ventricular-Arterial Coupling
Shavik, Sheikh Mohammad; Jiang, Zhenxiang; Baek, Seungik; Lee, Lik Chuan
2018-01-01
While it has long been recognized that bi-directional interaction between the heart and the vasculature plays a critical role in the proper functioning of the cardiovascular system, a comprehensive study of this interaction has largely been hampered by the lack of a modeling framework capable of simultaneously accommodating high-resolution models of the heart and vasculature. Here, we address this issue and present a computational modeling framework that couples finite element (FE) models of the left ventricle (LV) and aorta to elucidate ventricular-arterial coupling in the systemic circulation. We show in a baseline simulation that the framework predictions of (1) the LV pressure-volume loop, (2) the aorta pressure-diameter relationship, and (3) the pressure waveforms of the aorta, LV, and left atrium (LA) over the cardiac cycle are consistent with physiological measurements found in healthy humans. To develop insight into ventricular-arterial interactions, the framework was then used to simulate how alterations in the geometrical or material parameters of the aorta affect the LV and vice versa. We show that changing the geometry and microstructure of the aorta model in the framework led to changes in the functional behaviors of both the LV and aorta that are consistent with experimental observations. On the other hand, changing the contractility and passive stiffness of the LV model in the framework also produced changes in both LV and aorta functional behaviors that are consistent with physiological principles. PMID:29551977
Generalized Multilevel Structural Equation Modeling
ERIC Educational Resources Information Center
Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew
2004-01-01
A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…
ERIC Educational Resources Information Center
King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette
2008-01-01
A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…
The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval
2006-07-01
Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses... the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function... In addition, the SD-based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework.
A complete categorization of multiscale models of infectious disease systems.
Garira, Winston
2017-12-01
Modelling of infectious disease systems has entered a new era in which disease modellers are increasingly turning to multiscale modelling to extend traditional modelling frameworks into new application areas and to achieve higher levels of detail and accuracy in characterizing infectious disease systems. In this paper we present a categorization framework for categorizing multiscale models of infectious disease systems. The categorization framework consists of five integration frameworks and five criteria. We use the categorization framework to give a complete categorization of host-level immuno-epidemiological models (HL-IEMs). This categorization framework is also shown to be applicable in categorizing other types of multiscale models of infectious diseases beyond HL-IEMs through modifying the initial categorization framework presented in this study. Categorization of multiscale models of infectious disease systems in this way is useful in bringing some order to the discussion on the structure of these multiscale models.
NASA Technical Reports Server (NTRS)
1977-01-01
The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.
A Conceptual Framework Curriculum Evaluation Electrical Engineering Education
ERIC Educational Resources Information Center
Imansari, Nurulita; Sutadji, Eddy
2017-01-01
This evaluation presents a conceptual framework that has been analyzed in the hope that it can help research related to curriculum evaluation. The evaluation model used was the CIPPO model, which consists of "context," "input," "process," "product," and "outcomes." On the dimension of the…
The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impac...
An Exploration of the Factors Influencing the Adoption of an IS Governance Framework
ERIC Educational Resources Information Center
Parker, Sharon L.
2013-01-01
This research explored IT governance framework adoption, leveraging established IS theories. It applied both the technology acceptance model (TAM) and the technology, organization, environment (TOE) models. The study consisted of developing a model utilizing TOE and TAM, deriving relevant hypotheses. Interviews with a group of practitioners…
Nakarmi, Ukash; Wang, Yanhua; Lyu, Jingyuan; Liang, Dong; Ying, Leslie
2017-11-01
While many low-rank and sparsity-based approaches have been developed for accelerated dynamic magnetic resonance imaging (dMRI), they all use low rankness or sparsity in the input space, overlooking the intrinsic nonlinear correlation in most dMRI data. In this paper, we propose a kernel-based framework to allow nonlinear manifold models in reconstruction from sub-Nyquist data. Within this framework, many existing algorithms can be extended to the kernel framework with nonlinear models. In particular, we have developed a novel algorithm with a kernel-based low-rank model generalizing the conventional low-rank formulation. The algorithm consists of manifold learning using a kernel, low-rank enforcement in feature space, and preimaging with data consistency. Extensive simulation and experiment results show that the proposed method surpasses the conventional low-rank-modeled approaches for dMRI.
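A toy numerical sketch (our own illustration, not the authors' code) of only the "low-rank enforcement in feature space" step: an RBF kernel matrix is built from toy frames lying near a nonlinear manifold, centred as in kernel PCA, and truncated to its leading eigen-components. The manifold-learning, preimaging and data-consistency steps described in the abstract are omitted, and the kernel choice, gamma, and toy data are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # pairwise squared distances -> Gaussian (RBF) kernel matrix
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# toy "dynamic" frames lying near a 1-D nonlinear manifold embedded in 2-D
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
frames = np.stack([np.cos(t), np.sin(2.0 * t)], axis=1)
frames += 0.05 * rng.standard_normal(frames.shape)

K = rbf_kernel(frames)

# centre the kernel matrix (standard kernel-PCA centring)
n = K.shape[0]
J = np.eye(n) - np.full((n, n), 1.0 / n)
Kc = J @ K @ J

# "low rank in feature space": keep only the r leading eigen-components
w, V = np.linalg.eigh(Kc)               # eigenvalues in ascending order
r = 3
K_lowrank = (V[:, -r:] * w[-r:]) @ V[:, -r:].T

print("fraction of kernel energy retained:", w[-r:].sum() / w[w > 0].sum())
```

In a full reconstruction, this truncated feature-space representation would be combined with a preimage step and a data-consistency constraint on the undersampled k-space samples.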
PACS/information systems interoperability using Enterprise Communication Framework.
alSafadi, Y; Lord, W P; Mankovich, N J
1998-06-01
Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).
The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...
Toward improved calibration of watershed models: multisite many objective measures of information
USDA-ARS's Scientific Manuscript database
This paper presents a computational framework for incorporation of disparate information from observed hydrologic responses at multiple locations into the calibration of watershed models. The framework consists of four components: (i) an a-priori characterization of system behavior; (ii) a formal an...
Agatha: Disentangling period signals from correlated noise in a periodogram framework
NASA Astrophysics Data System (ADS)
Feng, F.; Tuomi, M.; Jones, H. R. A.
2018-04-01
Agatha is a framework of periodograms to disentangle periodic signals from correlated noise and to solve the two-dimensional model selection problem: signal dimension and noise model dimension. These periodograms are calculated by applying likelihood maximization and marginalization and combined in a self-consistent way. Agatha can be used to select the optimal noise model and to test the consistency of signals in time and can be applied to time series analyses in other astronomical and scientific disciplines. An interactive web implementation of the software is also available at http://agatha.herts.ac.uk/.
Integration of RAM-SCB into the Space Weather Modeling Framework
Welling, Daniel; Toth, Gabor; Jordanova, Vania Koleva; ...
2018-02-07
Numerical simulations of the ring current are a challenging endeavor. They require a large set of inputs, including electric and magnetic fields and plasma sheet fluxes. Because the ring current broadly affects the magnetosphere-ionosphere system, the input set is dependent on the ring current region itself. This makes obtaining a set of inputs that are self-consistent with the ring current difficult. To overcome this challenge, researchers have begun coupling ring current models to global models of the magnetosphere-ionosphere system. This paper describes the coupling of the Ring current Atmosphere interaction Model with Self-Consistent Magnetic field (RAM-SCB) to the models within the Space Weather Modeling Framework. Full details on both previously introduced and new coupling mechanisms are defined. Finally, the impact of self-consistently including the ring current on the magnetosphere-ionosphere system is illustrated via a set of example simulations.
DOT National Transportation Integrated Search
2016-04-01
In this study, we developed an adaptive signal control (ASC) framework for connected vehicles (CVs) using an agent-based modeling technique. The proposed framework consists of two types of agents: 1) vehicle agents (VAs); and 2) signal controller agen...
The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework
Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob
2014-01-01
The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316
Ecosystem Services and Climate Change Considerations for ...
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water
XAL Application Framework and Bricks GUI Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelaia II, Tom
2007-01-01
The XAL [1] Application Framework is a framework for rapidly developing document based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.
A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare
ERIC Educational Resources Information Center
Yahav, Inbal
2010-01-01
In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…
Modelling Participatory Geographic Information System for Customary Land Conflict Resolution
NASA Astrophysics Data System (ADS)
Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.
2017-11-01
Since land contributes about 73% of most countries' Gross Domestic Product (GDP), attention to land rights has increased tremendously globally. Conflicts over land have therefore become part of the major problems associated with land administration. However, the conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants due to various factors. This study sought to develop a framework for using a Participatory Geographic Information System (PGIS) for customary land conflict resolution. The framework was modelled using the Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units, namely a Social Unit (SU), a Technical Unit (TU) and a Decision Making Unit (DMU). The name butterfly model was adopted for the framework based on its features and properties. The framework has therefore been recommended for adoption in land conflict resolution in customary areas.
Warren B. Cohen; Hans-Erik Andersen; Sean P. Healey; Gretchen G. Moisen; Todd A. Schroeder; Christopher W. Woodall; Grant M. Domke; Zhiqiang Yang; Robert E. Kennedy; Stephen V. Stehman; Curtis Woodcock; Jim Vogelmann; Zhe Zhu; Chengquan Huang
2015-01-01
We are developing a system that provides temporally consistent biomass estimates for national greenhouse gas inventory reporting to the United Nations Framework Convention on Climate Change. Our model-assisted estimation framework relies on remote sensing to scale from plot measurements to lidar strip samples, to Landsat time series-based maps. As a demonstration, new...
Communication: Introducing prescribed biases in out-of-equilibrium Markov models
NASA Astrophysics Data System (ADS)
Dixit, Purushottam D.
2018-03-01
Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems. However, many times their predictions do not agree with experiments. We need a systematic framework to update existing Markov models to make them consistent with constraints that are derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
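A hedged sketch of the variational problem described above (notation ours, not necessarily the paper's): among path distributions q over trajectories Γ, choose the one closest to the prior Markov model p in relative path entropy, subject to constraints derived from experiments; the solution reweights paths exponentially, with Lagrange multipliers fixed by the constraints.

```latex
% Minimum Kullback-Leibler (maximum relative path entropy) update, sketch:
\min_{q}\; D_{\mathrm{KL}}(q\,\|\,p)
\quad\text{s.t.}\quad
\mathbb{E}_{q}\!\left[f_i(\Gamma)\right] = \bar f_i,\;\; \textstyle\sum_{\Gamma} q(\Gamma) = 1
\;\;\Longrightarrow\;\;
q^{*}(\Gamma) \;\propto\; p(\Gamma)\,\exp\!\Big(-\sum_i \lambda_i\, f_i(\Gamma)\Big).
```

Here the f_i encode stationary-state or trajectory-based observables and the λ_i are chosen so that the constraints are satisfied by the updated model q*.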
ERIC Educational Resources Information Center
Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen
2013-01-01
This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and landuse change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
ERIC Educational Resources Information Center
Gynther, Karsten
2016-01-01
The research project has developed a design framework for an adaptive MOOC that complements the MOOC format with blended learning. The design framework consists of a design model and a series of learning design principles which can be used to design in-service courses for teacher professional development. The framework has been evaluated by…
Framework for non-coherent interface models at finite displacement jumps and finite strains
NASA Astrophysics Data System (ADS)
Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn
2016-05-01
This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, anisotropic effects are excluded. A novel, extended constitutive framework which is consistent with the above-mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration - or equivalently - the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
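To make the final statements concrete, a purely illustrative set of invariants (our own choice, not necessarily the paper's) for a hyperelastic interface energy depending on the displacement jump and the current tangent vectors a1, a2 of the interface could be

```latex
% Illustrative invariant-based interface energy (notation ours):
\Psi = \hat\Psi(I_1, I_2, I_3),\qquad
I_1 = \llbracket\mathbf{u}\rrbracket\cdot\llbracket\mathbf{u}\rrbracket,\quad
I_2 = \llbracket\mathbf{u}\rrbracket\cdot\frac{\mathbf{a}_1\times\mathbf{a}_2}{\lVert\mathbf{a}_1\times\mathbf{a}_2\rVert},\quad
I_3 = \lVert\mathbf{a}_1\times\mathbf{a}_2\rVert ,
```

i.e., jump magnitude, normal opening and interface area stretch; tractions, including the membrane-like and out-of-plane shear contributions mentioned above, then follow by differentiation of Ψ with respect to the jump and the tangent vectors.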
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Automatic Earth observation data service based on reusable geo-processing workflow
NASA Astrophysics Data System (ADS)
Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min
2008-12-01
A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. This framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node and BPEL engine. An abstract model designer is used to design the top level GPW model, model instantiation service is used to generate the concrete BPEL, and the BPEL execution engine is adopted. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
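A minimal sketch of components 1 and 2 above (our own toy example, not from the presentation): a two-variable rainfall/runoff factor graph in which the joint distribution is the product of local factors and the posterior over the hidden variable follows from the basic rules of probability, here by brute-force enumeration.

```python
# toy factor graph: rain -> high_runoff, both binary
p_rain = {0: 0.7, 1: 0.3}                       # prior factor on rain
p_runoff_given_rain = {                          # conditional factor, keyed by (rain, runoff)
    (0, 0): 0.9, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.8,
}

def joint(rain, runoff):
    # joint distribution = product of the local factors
    return p_rain[rain] * p_runoff_given_rain[(rain, runoff)]

# assimilate an observation: high runoff was measured (runoff = 1)
evidence = 1
unnormalised = {rain: joint(rain, evidence) for rain in (0, 1)}
Z = sum(unnormalised.values())                   # marginal likelihood of the observation
posterior = {rain: v / Z for rain, v in unnormalised.items()}
print("P(rain | high runoff) =", round(posterior[1], 3))
```

In a realistic hydrological model the same pattern holds with many more nodes (parameters, states, observations), and the brute-force enumeration is replaced by the distributed, graph-exploiting inference algorithms referred to in point 3.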
A Framework for Developing the Structure of Public Health Economic Models.
Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P
2016-01-01
A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of Public Health economic models, supporting efficient allocation of scarce resources. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Heartbeat-based error diagnosis framework for distributed embedded systems
NASA Astrophysics Data System (ADS)
Mishra, Swagat; Khilar, Pabitra Mohan
2012-01-01
Distributed Embedded Systems have significant applications in automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real time system. We use heartbeat monitoring, check pointing and model based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosis and shutting down of faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
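A minimal, self-contained sketch of the heartbeat-monitoring idea (hypothetical node names and timeout; this is not the authors' simulation model): every node periodically reports a heartbeat, and a monitor flags as suspected faulty any node whose last heartbeat is older than a timeout, after which the corresponding actuator could be shut down.

```python
import time

class HeartbeatMonitor:
    def __init__(self, timeout_s=1.0):
        self.timeout_s = timeout_s
        self.last_seen = {}          # node id -> timestamp of last heartbeat

    def heartbeat(self, node_id):
        self.last_seen[node_id] = time.monotonic()

    def faulty_nodes(self):
        now = time.monotonic()
        return [n for n, t in self.last_seen.items()
                if now - t > self.timeout_s]

# toy usage: "brake_ctrl" stops sending heartbeats and is diagnosed as suspect
monitor = HeartbeatMonitor(timeout_s=0.2)
monitor.heartbeat("steer_ctrl")
monitor.heartbeat("brake_ctrl")
time.sleep(0.1)
monitor.heartbeat("steer_ctrl")      # brake_ctrl misses its heartbeat
time.sleep(0.15)
monitor.heartbeat("steer_ctrl")
print("suspected faulty:", monitor.faulty_nodes())   # -> ['brake_ctrl']
```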
Modelling multimedia teleservices with OSI upper layers framework: Short paper
NASA Astrophysics Data System (ADS)
Widya, I.; Vanrijssen, E.; Michiels, E.
The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices. It puts emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model which intends to coordinate the development of application-oriented services and protocols in a consistent and modular way. It enables the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are moreover implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.
Distributed software framework and continuous integration in hydroinformatics systems
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao
2017-08-01
When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for joint regulation of water quantity and water quality of a group of lakes in Wuhan, China is established.
Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors.
Qu, Chen; Bi, Du-Yan; Sui, Ping; Chao, Ai-Nong; Wang, Yun-Fei
2017-09-22
The CMOS (Complementary Metal-Oxide-Semiconductor) is a new type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation fields, and so on. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion problems, and so on. In view of this, we propose a novel dehazing approach based on a locally consistent Markov random field (MRF) framework. The neighboring clique in the traditional MRF is extended to a non-neighboring clique, defined on locally consistent blocks based on two clues, where both the atmospheric light and the transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction of the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power, thus solving inadequate detail recovery effectively and alleviating color distortion. Moreover, the locally consistent MRF framework can obtain details while maintaining better results for dehazing, which effectively improves the quality of images captured by the CMOS image sensor. Experimental results verified that the proposed method has the combined advantages of detail recovery and color preservation.
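For background, the standard atmospheric scattering model that most single-image dehazing methods (including MRF-based ones) build on is shown below; the paper's specific local-consistency priors are not reproduced here, and the notation is ours.

```latex
% Atmospheric scattering (haze) model and radiance recovery:
I(\mathbf{x}) = J(\mathbf{x})\,t(\mathbf{x}) + A\bigl(1 - t(\mathbf{x})\bigr),
\qquad
\hat J(\mathbf{x}) = \frac{I(\mathbf{x}) - A}{\max\bigl(t(\mathbf{x}),\, t_0\bigr)} + A,
```

with observed image I, scene radiance J, transmission map t, global atmospheric light A, and a lower bound t0 that prevents noise amplification where the haze is dense; the MRF framework above essentially regularizes the joint estimation of A and t over locally consistent blocks.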
Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey
2014-04-15
In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.
Consistent searches for SMEFT effects in non-resonant dijet events
Alte, Stefan; Konig, Matthias; Shepherd, William
2018-01-19
Here, we investigate the bounds which can be placed on generic new-physics contributions to dijet production at the LHC using the framework of the Standard Model Effective Field Theory, deriving the first consistently-treated EFT bounds from non-resonant high-energy data. We recast an analysis searching for quark compositeness, equivalent to treating the SM with one higher-dimensional operator as a complete UV model. In order to reach consistent, model-independent EFT conclusions, it is necessary to truncate the EFT effects consistently at order $1/\Lambda^2$ and to include the possibility of multiple operators simultaneously contributing to the observables, neither of which has been done in previous searches of this nature. Furthermore, it is important to give consistent error estimates for the theoretical predictions of the signal model, particularly in the region of phase space where the probed energy is approaching the cutoff scale of the EFT. There are two linear combinations of operators which contribute to dijet production in the SMEFT with distinct angular behavior; we identify those linear combinations and determine the ability of LHC searches to constrain them simultaneously. Consistently treating the EFT generically leads to weakened bounds on new-physics parameters. These constraints will be a useful input to future global analyses in the SMEFT framework, and the techniques used here to consistently search for EFT effects are directly applicable to other off-resonance signals.
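A schematic of the consistent $O(1/\Lambda^2)$ truncation referred to above (notation ours): with Wilson coefficients c_i of dimension-six operators, an observable is expanded as

```latex
\sigma \;\simeq\; \sigma_{\mathrm{SM}}
\;+\; \sum_i \frac{c_i}{\Lambda^{2}}\,\sigma_i^{\mathrm{int}}
\;+\; \mathcal{O}\!\left(\frac{1}{\Lambda^{4}}\right),
```

and the quadratic (dimension-six squared) pieces are dropped, since they are of the same order in $1/\Lambda$ as neglected dimension-eight interference; the size of the residual $O(1/\Lambda^4)$ terms also provides a natural theory-error estimate in the region where the probed energy approaches the EFT cutoff.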
NASA Astrophysics Data System (ADS)
Bartels, A.; Bartel, T.; Canadija, M.; Mosler, J.
2015-09-01
This paper deals with the thermomechanical coupling in dissipative materials. The focus lies on finite strain plasticity theory and the temperature increase resulting from plastic deformation. For this type of problem, two fundamentally different modeling approaches can be found in the literature: (a) models based on thermodynamical considerations and (b) models based on the so-called Taylor-Quinney factor. While a naive straightforward implementation of thermodynamically consistent approaches usually leads to an over-prediction of the temperature increase due to plastic deformation, models relying on the Taylor-Quinney factor often violate fundamental physical principles such as the first and the second law of thermodynamics. In this paper, a thermodynamically consistent framework is elaborated which indeed allows the realistic prediction of the temperature evolution. In contrast to previously proposed frameworks, it is based on a fully three-dimensional, finite strain setting and it naturally covers coupled isotropic and kinematic hardening - also based on non-associative evolution equations. Considering a variationally consistent description based on incremental energy minimization, it is shown that the aforementioned problem (thermodynamical consistency and a realistic temperature prediction) is essentially equivalent to correctly defining the decomposition of the total energy into stored and dissipative parts. Interestingly, this decomposition shows strong analogies to the Taylor-Quinney factor. In this respect, the Taylor-Quinney factor can be well motivated from a physical point of view. Furthermore, certain intervals for this factor can be derived in order to guarantee that fundamental physical principles are fulfilled a priori. Representative examples demonstrate the predictive capabilities of the final constitutive modeling framework.
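For orientation, the classical Taylor-Quinney relation referred to above can be written in a simplified form (notation ours) as

```latex
\rho\, c\, \dot T \;=\; \beta\, \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}}^{\mathrm{p}},
\qquad 0 \le \beta \le 1,
```

where β is the Taylor-Quinney factor, i.e., the fraction of plastic power converted into heat rather than stored in the microstructure; in the variational framework above, the admissible range of this split follows from the thermodynamically consistent decomposition of the total energy into stored and dissipative parts rather than being prescribed ad hoc.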
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David; Agarwal, Deborah A.; Sun, Xin
2011-09-01
The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.
Fletcher, Alexander G; Osborne, James M; Maini, Philip K; Gavaghan, David J
2013-11-01
The dynamic behaviour of epithelial cell sheets plays a central role during development, growth, disease and wound healing. These processes occur as a result of cell adhesion, migration, division, differentiation and death, and involve multiple processes acting at the cellular and molecular level. Computational models offer a useful means by which to investigate and test hypotheses about these processes, and have played a key role in the study of cell-cell interactions. However, the necessarily complex nature of such models means that it is difficult to make accurate comparison between different models, since it is often impossible to distinguish between differences in behaviour that are due to the underlying model assumptions, and those due to differences in the in silico implementation of the model. In this work, an approach is described for the implementation of vertex dynamics models, a discrete approach that represents each cell by a polygon (or polyhedron) whose vertices may move in response to forces. The implementation is undertaken in a consistent manner within a single open source computational framework, Chaste, which comprises fully tested, industrial-grade software that has been developed using an agile approach. This framework allows one to easily change assumptions regarding force generation and cell rearrangement processes within these models. The versatility and generality of this framework is illustrated using a number of biological examples. In each case we provide full details of all technical aspects of our model implementations, and in some cases provide extensions to make the models more generally applicable. Copyright © 2013 Elsevier Ltd. All rights reserved.
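A toy sketch of one vertex dynamics relaxation step (our own illustration; not Chaste's API or force laws): a single polygonal cell with a commonly used area-plus-perimeter energy, whose vertices are moved down the numerical gradient of that energy (overdamped dynamics).

```python
import numpy as np

def cell_energy(verts, target_area=1.0, k_area=1.0, gamma_perim=0.1):
    # shoelace area and perimeter of a polygon given as an (n, 2) array of vertices
    x, y = verts[:, 0], verts[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    perim = np.linalg.norm(verts - np.roll(verts, -1, axis=0), axis=1).sum()
    return 0.5 * k_area * (area - target_area) ** 2 + 0.5 * gamma_perim * perim ** 2

def step(verts, dt=0.01, eps=1e-6):
    # overdamped update: move each vertex against the numerical energy gradient
    grad = np.zeros_like(verts)
    for i in range(verts.shape[0]):
        for j in range(2):
            bumped = verts.copy()
            bumped[i, j] += eps
            grad[i, j] = (cell_energy(bumped) - cell_energy(verts)) / eps
    return verts - dt * grad

# start from a slightly-too-large square cell and relax it
verts = np.array([[0.0, 0.0], [1.2, 0.0], [1.2, 1.2], [0.0, 1.2]])
for _ in range(500):
    verts = step(verts)
print("relaxed energy:", cell_energy(verts))
```

A framework such as the one described above would let the energy (force law), the cell-rearrangement rules and the time-stepping scheme be swapped independently, which is what makes comparisons between model assumptions meaningful.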
A paradigm shift toward a consistent modeling framework to assess climate impacts
NASA Astrophysics Data System (ADS)
Monier, E.; Paltsev, S.; Sokolov, A. P.; Fant, C.; Chen, H.; Gao, X.; Schlosser, C. A.; Scott, J. R.; Dutkiewicz, S.; Ejaz, Q.; Couzo, E. A.; Prinn, R. G.; Haigh, M.
2017-12-01
Estimates of physical and economic impacts of future climate change are subject to substantial challenges. To enrich the currently popular approaches of assessing climate impacts by evaluating a damage function or by multi-model comparisons based on the Representative Concentration Pathways (RCPs), we focus here on integrating impacts into a self-consistent coupled human and Earth system modeling framework that includes modules that represent multiple physical impacts. In a sample application we show that this framework is capable of investigating the physical impacts of climate change and socio-economic stressors. The projected climate impacts vary dramatically across the globe in a set of scenarios with global mean warming ranging between 2.4°C and 3.6°C above pre-industrial by 2100. Unabated emissions lead to substantial sea level rise, acidification that impacts the base of the oceanic food chain, air pollution that exceeds health standards by tenfold, water stress that impacts an additional 1 to 2 billion people globally and agricultural productivity that decreases substantially in many parts of the world. We compare the outcomes from these forward-looking scenarios against the common goal described by the target-driven scenario of 2°C, which results in much smaller impacts. It is challenging for large internationally coordinated exercises to respond quickly to new policy targets. We propose that a paradigm shift toward a self-consistent modeling framework to assess climate impacts is needed to produce information relevant to evolving global climate policy and mitigation strategies in a timely way.
BioASF: a framework for automatically generating executable pathway models specified in BioPAX.
Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap
2016-06-15
Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework
NASA Astrophysics Data System (ADS)
Hermawan; Hastarista, Fika
2016-01-01
The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) has been developed to collaborate ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of Model Driven Architecture (MDA) artifacts derived from various UML models and Software Requirement Specification (SRS) documents. This modeling is used to develop Enterprise Resource Planning (ERP). Because ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach is used as an advanced method to transform MDA into application-module components efficiently and accurately. Finally, use of the CAF yielded good fulfilment of the needs of all stakeholders involved in the overall stages of the Rational Unified Process (RUP), as well as high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik.
An active monitoring method for flood events
NASA Astrophysics Data System (ADS)
Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya
2018-07-01
Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To achieve this purpose, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that 1) the proposed active service framework is efficient for timely and automated flood monitoring; 2) the active model, SMDSA, provides a quantitative calculation method for moving flood monitoring from manual intervention to automatic computation; and 3) as much preliminary work as possible should be done to take full advantage of the active service framework and the active model.
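A minimal publish-subscribe sketch of the active-warning idea (hypothetical names and threshold; the actual component described above is the OGC Sensor Event Service): observations are published to a broker, and any subscriber whose filter condition is met, e.g. a water-level threshold, is notified, which would in turn trigger the planning component.

```python
class EventBroker:
    def __init__(self):
        self.subscriptions = []          # list of (filter_fn, callback)

    def subscribe(self, filter_fn, callback):
        self.subscriptions.append((filter_fn, callback))

    def publish(self, observation):
        # notify every subscriber whose filter matches the observation
        for filter_fn, callback in self.subscriptions:
            if filter_fn(observation):
                callback(observation)

def start_flood_monitoring(obs):
    # stand-in for invoking the planning component (e.g., a Sensor Planning Service)
    print(f"FLOOD WARNING at {obs['station']}: level {obs['level_m']} m -> start monitoring scheme")

broker = EventBroker()
broker.subscribe(lambda o: o["level_m"] > 21.5, start_flood_monitoring)

# simulated gauge observations
broker.publish({"station": "Liangzi Lake", "level_m": 20.9})   # below threshold: no action
broker.publish({"station": "Liangzi Lake", "level_m": 21.8})   # triggers the warning
```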
Gondek, John C; Gensemer, Robert W; Claytor, Carrie A; Canton, Steven P; Gorsuch, Joseph W
2018-06-01
Acceptance of the Biotic Ligand Model (BLM) to derive aquatic life criteria, for metals in general and copper in particular, is growing amongst regulatory agencies worldwide. Thus, it is important to ensure that water quality data are used appropriately and consistently in deriving such criteria. Here we present a suggested BLM implementation framework (hereafter referred to as "the Framework") to help guide the decision-making process when designing sampling and analysis programs for use of the BLM to derive water quality criteria applied on a site-specific basis. Such a framework will help inform stakeholders on the requirements needed to derive BLM-based criteria, and thus, ensure the appropriate types and amount of data are being collected and interpreted. The Framework was developed for calculating BLM-based criteria when data are available from multiple sampling locations on a stream. The Framework aspires to promote consistency when applying the BLM across datasets of disparate water quality, data quantity, and spatial and temporal representativeness, and is meant to be flexible to maximize applicability over a wide range of scenarios. Therefore, the Framework allows for a certain level of interpretation and adjustment to address the issues unique to each dataset. This article is protected by copyright. All rights reserved.
Hierarchical Bayesian Modeling of Fluid-Induced Seismicity
NASA Astrophysics Data System (ADS)
Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.
2017-11-01
In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We tested the Basel 2006 fluid-induced seismic case study to prove that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
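As a hedged illustration of the rate model just described (the proportional form follows the abstract; the exact parameterization and priors used by the authors may differ), the fluid-induced rate and the resulting counting process can be written as

$$\lambda(t) = \mu + a\,\dot{V}(t), \qquad N(t_1, t_2) \sim \mathrm{Poisson}\!\left(\int_{t_1}^{t_2} \lambda(t)\, dt\right),$$

where $\dot{V}(t)$ is the fluid injection rate, $\mu$ a background rate, and $a$ a productivity parameter. In the hierarchical treatment, $(\mu, a)$ are themselves random variables with hyperpriors, and Bayesian updating of these parameters as injection and seismicity data accumulate yields the probabilistic short-term forecast.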
Bagraith, Karl; Chardon, Lydia; King, Robert John
2010-11-01
Although there are widely accepted and utilized models and frameworks for nondirective counseling (NDC), there is little in the way of tools or instruments designed to assist in determining whether or not a specific episode of counseling is consistent with the stated model or framework. The Counseling Progress and Depth Rating Instrument (CPDRI) was developed to evaluate counselor integrity in the use of Egan's skilled helper model in online counseling. The instrument was found to have sound internal consistency, good interrater reliability, and good face and convergent validity. The CPDRI is, therefore, proposed as a useful tool to facilitate investigation of the degree to which counselors adhere to and apply a widely used approach to NDC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patra, Anirban; Wen, Wei; Martinez Saez, Enrique
2016-02-05
It is essential to understand the deformation behavior of Fe-Cr-Al alloys in order to be able to develop models for predicting their mechanical response under varied loading conditions. Interaction of dislocations with the radiation-induced defects governs the crystallographic deformation mechanisms. A crystal plasticity framework is employed to model these mechanisms in Fe-Cr-Al alloys. This work builds on a previously developed defect density-based crystal plasticity model for bcc metals and alloys, with necessary modifications made to account for the defect substructure observed in Fe-Cr-Al alloys. The model is implemented in a Visco-Plastic Self Consistent (VPSC) framework to predict the mechanical behavior under quasi-static loading.
Sohl, Terry L.; Sleeter, Benjamin M.; Zhu, Zhiliang; Sayler, Kristi L.; Bennett, Stacie; Bouchard, Michelle; Reker, Ryan R.; Hawbaker, Todd J.; Wein, Anne M.; Liu, Shuguang; Kanengieter, Ronald L.; Acevedo, William
2012-01-01
Changes in land use, land cover, disturbance regimes, and land management have considerable influence on carbon and greenhouse gas (GHG) fluxes within ecosystems. Through targeted land-use and land-management activities, ecosystems can be managed to enhance carbon sequestration and mitigate fluxes of other GHGs. National-scale, comprehensive analyses of carbon sequestration potential by ecosystem are needed, with a consistent, nationally applicable land-use and land-cover (LULC) modeling framework a key component of such analyses. The U.S. Geological Survey has initiated a project to analyze current and projected future GHG fluxes by ecosystem and quantify potential mitigation strategies. We have developed a unique LULC modeling framework to support this work. Downscaled scenarios consistent with IPCC Special Report on Emissions Scenarios (SRES) were constructed for U.S. ecoregions, and the FORE-SCE model was used to spatially map the scenarios. Results for a prototype demonstrate our ability to model LULC change and inform a biogeochemical modeling framework for analysis of subsequent GHG fluxes. The methodology was then successfully used to model LULC change for four IPCC SRES scenarios for an ecoregion in the Great Plains. The scenario-based LULC projections are now being used to analyze potential GHG impacts of LULC change across the U.S.
From microscopic taxation and redistribution models to macroscopic income distributions
NASA Astrophysics Data System (ADS)
Bertotti, Maria Letizia; Modanese, Giovanni
2011-10-01
We present here a general framework, expressed by a system of nonlinear differential equations, suitable for the modeling of taxation and redistribution in a closed society. This framework allows one to describe the evolution of income distribution over the population and to explain the emergence of collective features based on knowledge of the individual interactions. By making different choices of the framework parameters, we construct different models, whose long-time behavior is then investigated. Asymptotic stationary distributions are found, which enjoy similar properties as those observed in empirical distributions. In particular, they exhibit power law tails of Pareto type and their Lorenz curves and Gini indices are consistent with some real world ones.
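For concreteness, a hedged sketch of the kind of closed-society system meant here (schematic only; the specific interaction coefficients and taxation/redistribution rules are the modelling choices investigated in the paper) is

$$\frac{dx_i}{dt} = \sum_{j,k=1}^{n} C^{\,i}_{jk}\, x_j x_k \;-\; x_i \sum_{j,k=1}^{n} C^{\,j}_{ik}\, x_k, \qquad i = 1, \dots, n,$$

where $x_i$ is the fraction of individuals in income class $i$ and the coefficients $C^{\,i}_{jk}$ encode the pairwise exchange, taxation and redistribution rules; the paired gain and loss terms guarantee $\sum_i \dot{x}_i = 0$, i.e., a closed population.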
Bayesian Model Averaging for Propensity Score Analysis
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
A generic biokinetic model for noble gases with application to radon.
Leggett, Rich; Marsh, James; Gregoratto, Demetrio; Blanchardon, Eric
2013-06-01
To facilitate the estimation of radiation doses from intake of radionuclides, the International Commission on Radiological Protection (ICRP) publishes dose coefficients (dose per unit intake) based on reference biokinetic and dosimetric models. The ICRP generally has not provided biokinetic models or dose coefficients for intake of noble gases, but plans to provide such information for (222)Rn and other important radioisotopes of noble gases in a forthcoming series of reports on occupational intake of radionuclides (OIR). This paper proposes a generic biokinetic model framework for noble gases and develops parameter values for radon. The framework is tailored to applications in radiation protection and is consistent with a physiologically based biokinetic modelling scheme adopted for the OIR series. Parameter values for a noble gas are based largely on a blood flow model and physical laws governing transfer of a non-reactive and soluble gas between materials. Model predictions for radon are shown to be consistent with results of controlled studies of its biokinetics in human subjects.
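A minimal sketch of the blood-flow-limited transfer underlying such a model (a standard physiological form, offered here as an assumption rather than the ICRP parameterization) is, for each tissue compartment $t$,

$$V_t \frac{dC_t}{dt} = Q_t \left( C_{a} - \frac{C_t}{P_{t:b}} \right),$$

where $C_t$ is the gas concentration in the tissue, $C_a$ the arterial blood concentration, $Q_t$ the blood flow to the tissue, $V_t$ its volume, and $P_{t:b}$ the tissue:blood partition coefficient expressing the solubility of the non-reactive gas; venous return and lung exchange close the circulation.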
Modeling Environment for Total Risk-2E
MENTOR-2E uses an integrated, mechanistically consistent source-to-dose-to-response modeling framework to quantify inhalation exposure and doses resulting from emergency events. It is an implementation of the MENTOR system that is focused towards modeling of the impacts of rele...
A generic biogeochemical module for earth system models
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.
2013-06-01
Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate changes. Integration of these processes into earth system models (e.g., the Community Land Model, CLM), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) the computational cost of simulating biogeochemical processes in land models is prohibitive due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into the CLM model. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
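To make the "generic algorithm plus reaction database" idea concrete, the following sketch (Python, with hypothetical pool names and rate laws; not the CLM implementation) assembles the right-hand side dC/dt = S r(C) from a small reaction database and hands it to a stiff solver, so that adding a process only means adding a database entry:

```python
# Sketch: build dC/dt = S @ r(C) from a reaction "database" and integrate it.
# Pool names and rate laws are illustrative assumptions, not the CLM-CN/CLM-P code.
import numpy as np
from scipy.integrate import solve_ivp

# Reaction database: each entry gives stoichiometry over pools and a rate law.
pools = ["litter_C", "soil_C", "CO2"]
reactions = [
    {"stoich": {"litter_C": -1.0, "soil_C": +0.7, "CO2": +0.3},
     "rate": lambda C: 0.05 * C[0]},          # litter decomposition (first order)
    {"stoich": {"soil_C": -1.0, "CO2": +1.0},
     "rate": lambda C: 0.002 * C[1]},         # soil C mineralization
]

# Stoichiometric matrix assembled automatically from the database.
S = np.zeros((len(pools), len(reactions)))
for j, rxn in enumerate(reactions):
    for pool, coeff in rxn["stoich"].items():
        S[pools.index(pool), j] = coeff

def dCdt(t, C):
    r = np.array([rxn["rate"](C) for rxn in reactions])
    return S @ r

sol = solve_ivp(dCdt, (0.0, 365.0), y0=[100.0, 1000.0, 0.0], method="LSODA")
print(dict(zip(pools, sol.y[:, -1])))
```

A stiff integrator such as LSODA is used here because, as noted above, biogeochemical rates can differ by orders of magnitude.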
NASA Astrophysics Data System (ADS)
Nowak, W.; Koch, J.
2014-12-01
Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both sensible and indispensable. We investigate the questions of what constitutes a meaningful level of model complexity and how to obtain an efficient model framework that is still physically and statistically consistent. In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly-resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to which degree the desired model predictions are sensitive to simplifications often found in the literature. With this we identify that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically-based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
Students' Models of Curve Fitting: A Models and Modeling Perspective
ERIC Educational Resources Information Center
Gupta, Shweta
2010-01-01
The Models and Modeling Perspective (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…
MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)
A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...
Patient-reported outcomes in insomnia: development of a conceptual framework and endpoint model.
Kleinman, Leah; Buysse, Daniel J; Harding, Gale; Lichstein, Kenneth; Kalsekar, Anupama; Roth, Thomas
2013-01-01
This article describes qualitative research conducted with patients with clinical diagnoses of insomnia and focuses on the development of a conceptual framework and endpoint model that identifies a hierarchy and interrelationships of potential outcomes in insomnia research. Focus groups were convened to discuss how patients experience insomnia and to generate items for patient-reported questionnaires on insomnia and associated daytime consequences. The focus group results produced two conceptual frameworks: one for sleep and one for daytime impairment. Each conceptual framework consists of hypothesized domains and items in each domain based on patient language taken from the focus groups. These item pools may ultimately serve as a basis to develop new questionnaires to assess insomnia.
Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.
Gomez, Christophe; Hartung, Niklas
2018-01-01
Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
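As a hedged sketch of the two formalisms (standard forms from the metastatic-modelling literature; notation here is illustrative), the stochastic description treats emission times as an inhomogeneous Poisson process with intensity given by an emission law $\beta(\cdot)$ evaluated at the primary tumour size $X_p(t)$, while the deterministic description propagates a size-structured metastasis density $\rho(t,x)$:

$$T_1, T_2, \dots \sim \text{Poisson process with intensity } \beta\big(X_p(t)\big), \qquad \frac{\partial \rho}{\partial t} + \frac{\partial}{\partial x}\big(g(x)\,\rho\big) = 0,$$

with a boundary condition at the smallest size $x_0$ of the form $g(x_0)\,\rho(t, x_0) = \beta\big(X_p(t)\big) + \int_{x_0}^{\infty} \beta(x)\,\rho(t,x)\,dx$, the integral term representing secondary emission by existing metastases; roughly speaking, the crosslink is that the expectation of the stochastic emission process recovers the deterministic density equations.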
Tachyon logamediate inflation on the brane
NASA Astrophysics Data System (ADS)
Kamali, Vahid; Nik, Elahe Navaee
2017-07-01
According to a Barrow solution for the scale factor of the universe, the main properties of the tachyon inflation model in the framework of the RSII braneworld are studied. Within this framework the basic slow-roll parameters are calculated analytically. We compare this inflationary scenario to the latest observational data. The predicted spectral index and the tensor-to-scalar fluctuation ratio are in excellent agreement with those of Planck 2015. The current predictions are consistent with those of viable inflationary models.
NASA Astrophysics Data System (ADS)
Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin
2016-04-01
The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications such as site-specific (e.g., critical facility) hazard analysis, ground motions obtained from GMPEs need to be adjusted/corrected to the particular site or site condition under investigation. This study presents a complete framework for developing a response spectral GMPE within which the issue of adjusting ground motions is addressed in a manner consistent with the linear system framework. The present approach is a two-step process: the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (Drvto) of ground motion. In the second step the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to that described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled RESORCE-2012 database covering Europe, the Middle East and the Mediterranean region.
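The RVT combination step can be sketched with the standard relations below (generic RVT, stated as an assumption; the authors' particular duration and peak-factor models may differ). For an oscillator with transfer function $H(f)$ applied to the empirical FAS $A(f)$, the spectral moments, root-mean-square response, and peak response are

$$m_k = 2 \int_0^{\infty} (2\pi f)^k\, \lvert A(f)\, H(f) \rvert^2\, df, \qquad a_{\mathrm{rms}} = \sqrt{\frac{m_0}{D_{\mathrm{rvto}}}}, \qquad S_a = \mathrm{pf}(m_0, m_2, m_4, D_{\mathrm{rvto}})\; a_{\mathrm{rms}},$$

where $D_{\mathrm{rvto}}$ is the RVT-optimized duration from the second empirical model and pf is a peak factor (e.g., of the Cartwright-Longuet-Higgins type); evaluating this at each oscillator frequency yields the full response spectrum from the two component models.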
AGAMA: Action-based galaxy modeling framework
NASA Astrophysics Data System (ADS)
Vasiliev, Eugene
2018-05-01
The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).
A Framework and Toolkit for the Construction of Multimodal Learning Interfaces
1998-04-29
human communication modalities in the context of a broad class of applications, specifically those that support state manipulation via parameterized actions. The multimodal semantic model is also the basis for a flexible, domain independent, incrementally trainable multimodal interpretation algorithm based on a connectionist network. The second major contribution is an application framework consisting of reusable components and a modular, distributed system architecture. Multimodal application developers can assemble the components in the framework into a new application,
Modeling Environment for Total Risk-4M
MENTOR-4M uses an integrated, mechanistically consistent, source-to-dose modeling framework to quantify simultaneous exposures and doses of individuals and populations to multiple contaminants. It is an implementation of the MENTOR system for exposures to Multiple contaminants fr...
Designing Online Management Education Courses Using the Community of Inquiry Framework
ERIC Educational Resources Information Center
Weyant, Lee E.
2013-01-01
Online learning has grown as a program delivery option for many colleges and programs of business. The Community of Inquiry (CoI) framework, consisting of three interrelated elements (social presence, cognitive presence, and teaching presence), provides a model to guide business faculty in their online course design. The course design of an online…
ERIC Educational Resources Information Center
Lyon, Edward G.
2011-01-01
This paper describes the Assessment Practices Framework and how I used it to study a high school Chemistry teacher as she designed, implemented, and learned from a chemistry lab report. The framework consists of exploring three teacher-centered components of classroom assessment (assessment beliefs, practices, and reflection) and analyzing…
An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis
NASA Technical Reports Server (NTRS)
Bregon, Anibal; Daigle, Matthew J.; Roychoudhury, Indranil
2012-01-01
Diagnosis and prognosis are necessary tasks for system reconfiguration and fault-adaptive control in complex systems. Diagnosis consists of detection, isolation and identification of faults, while prognosis consists of prediction of the remaining useful life of systems. This paper presents a novel integrated framework for model-based distributed diagnosis and prognosis, where system decomposition is used to enable the diagnosis and prognosis tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis problems. We illustrate our approach using a simulated four-wheeled rover for different fault scenarios. Our experiments show that our approach correctly performs distributed fault diagnosis and prognosis in an efficient and robust manner.
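As a rough, hypothetical illustration of the distributed idea (simplified residual-threshold logic and a linear-trend prognosis; not the authors' submodel construction), each submodel can perform its own local detection, isolation, and remaining-useful-life estimation:

```python
# Sketch of distributed diagnosis/prognosis over decomposed submodels.
# Thresholds, fault signatures, and the linear-trend prognosis are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Submodel:
    name: str
    fault_signatures: dict          # fault name -> expected sign of residual (+1/-1)
    threshold: float = 0.1
    history: list = field(default_factory=list)

    def diagnose(self, residual: float):
        """Detect and isolate faults from the local residual."""
        self.history.append(residual)
        if abs(residual) <= self.threshold:
            return None                                # consistent: no fault detected
        sign = 1 if residual > 0 else -1
        return [f for f, s in self.fault_signatures.items() if s == sign]

    def prognose(self, failure_level: float, dt: float = 1.0):
        """Crude remaining-useful-life estimate by extrapolating the residual trend."""
        if len(self.history) < 2:
            return float("inf")
        rate = (self.history[-1] - self.history[0]) / (dt * (len(self.history) - 1))
        if rate <= 0:
            return float("inf")
        return (failure_level - self.history[-1]) / rate

wheel = Submodel("wheel_motor", {"friction_increase": +1, "sensor_bias": -1})
print(wheel.diagnose(0.05), wheel.diagnose(0.25), wheel.prognose(failure_level=1.0))
```

Because each submodel only sees its own residual and fault signatures, detection, isolation, and prediction can run in parallel across the decomposed system, which is the efficiency argument made above.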
Dash, Bibek
2018-04-26
The present work deals with a density functional theory (DFT) study of porous organic framework materials containing - groups for CO2 capture. In this study, first-principles calculations were performed for CO2 adsorption using N-containing covalent organic framework (COF) models. Ab initio and DFT-based methods were used to characterize the N-containing porous model system based on their interaction energies upon complexing with CO2 and nitrogen gas. Binding energies (BEs) of CO2 and N2 molecules with the polymer framework were calculated with DFT methods. Hybrid B3LYP and second-order MP2 methods combined with the Pople 6-31G(d,p) and correlation-consistent basis sets cc-pVDZ, cc-pVTZ and aug-cc-pVDZ were used to calculate BEs. The effect of linker groups in the designed covalent organic framework model system on the CO2 and N2 interactions was studied using quantum calculations.
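The binding energies mentioned above follow the usual supermolecular definition (stated here as the standard convention rather than a detail taken from the paper), for example for CO2:

$$\mathrm{BE} = E_{\mathrm{COF \cdot CO_2}} - E_{\mathrm{COF}} - E_{\mathrm{CO_2}},$$

where the terms are the total energies of the adsorption complex, the isolated framework model, and the isolated gas molecule; with finite basis sets such as those listed, a counterpoise correction is commonly applied for basis set superposition error, and a more negative BE indicates stronger adsorption.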
Effects of agricultural conservation practices on N loads in the Mississippi-Atchafalya River Basin
USDA-ARS?s Scientific Manuscript database
A modeling framework consisting of a farm-scale model, Agricultural Policy Environmental Extender (APEX); a watershedscale model, Soil and Water Assessment Tool (SWAT); and databases was used in the Conservation Effects Assessment Project to quantify the environmental benefits of conservation practi...
A nursing-specific model of EPR documentation: organizational and professional requirements.
von Krogh, Gunn; Nåden, Dagfinn
2008-01-01
To present the Norwegian documentation KPO model (quality assurance, problem solving, and caring) and the requirements and multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, is developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, is developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. A systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, Nursing Intervention Classification, and Nursing Outcome Classification. The model can be used as a tool in cooperation with vendors to ensure the interests of the nursing profession are met when developing EPR solutions in healthcare. The model can provide clinicians with a framework for documentation in step with legal and organizational requirements and at the same time retain the ability to record all aspects of clinical nursing.
Modeling Environment for Total Risk-1A
MENTOR-1A uses an integrated, mechanistically consistent source-to-dose modeling framework to quantify inhalation exposure and dose for individuals and/or populations due to co-occurring air pollutants. It uses the "One Atmosphere" concept to characterize simultaneous exposures t...
ERIC Educational Resources Information Center
Chen, Chung-Yang; Chang, Huiju; Hsu, Wen-Chin; Sheen, Gwo-Ji
2017-01-01
This paper proposes a training model for raters, with the goal to improve the intra- and inter-consistency of evaluation quality for higher education curricula. The model, termed the learning, behaviour and reaction (LBR) circular training model, is an interdisciplinary application from the business and organisational training domain. The…
Using the SIOP Model for Effective Content Teaching with Second and Foreign Language Learners
ERIC Educational Resources Information Center
Kareva, Veronika; Echevarria, Jana
2013-01-01
In this paper we present a comprehensive model of instruction for providing consistent, high quality teaching to L2 students. This model, the SIOP Model (Sheltered Instruction Observation Protocol), provides an explicit framework for organizing instructional practices to optimize the effectiveness of teaching second and foreign language learners.…
78 FR 26269 - Connect America Fund; High-Cost Universal Service Support
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
... the model platform, which is the basic framework for the model consisting of key assumptions about the... combination of competitive bidding and a new forward-looking model of the cost of constructing modern multi-purpose networks.'' Using the cost model to ``estimate the support necessary to serve areas where costs...
NASA Technical Reports Server (NTRS)
Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard, C.; Foster, J.; Hall, D.; Riggs, G.
2006-01-01
Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.
HyDE Framework for Stochastic and Hybrid Model-Based Diagnosis
NASA Technical Reports Server (NTRS)
Narasimhan, Sriram; Brownston, Lee
2012-01-01
Hybrid Diagnosis Engine (HyDE) is a general framework for stochastic and hybrid model-based diagnosis that offers flexibility to the diagnosis application designer. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. Several alternative algorithms are available for the various steps in diagnostic reasoning. This approach is extensible, with support for the addition of new modeling paradigms as well as diagnostic reasoning algorithms for existing or new modeling paradigms. HyDE is a general framework for stochastic hybrid model-based diagnosis of discrete faults; that is, spontaneous changes in operating modes of components. HyDE combines ideas from consistency-based and stochastic approaches to model-based diagnosis using discrete and continuous models to create a flexible and extensible architecture for stochastic and hybrid diagnosis. HyDE supports the use of multiple paradigms and is extensible to support new paradigms. HyDE generates candidate diagnoses and checks them for consistency with the observations. It uses hybrid models built by the users and sensor data from the system to deduce the state of the system over time, including changes in state indicative of faults. At each time step when observations are available, HyDE checks each existing candidate for continued consistency with the new observations. If the candidate is consistent, it continues to remain in the candidate set. If it is not consistent, then the information about the inconsistency is used to generate successor candidates while discarding the candidate that was inconsistent. The models used by HyDE are similar to simulation models. They describe the expected behavior of the system under nominal and fault conditions. The model can be constructed in modular and hierarchical fashion by building component/subsystem models (which may themselves contain component/subsystem models) and linking them through shared variables/parameters. The component model is expressed as operating modes of the component and conditions for transitions between these various modes. Faults are modeled as transitions whose conditions for transitions are unknown (and have to be inferred through the reasoning process). Finally, the behavior of the components is expressed as a set of variables/parameters and relations governing the interaction between the variables. The hybrid nature of the systems being modeled is captured by a combination of the above transitional model and behavioral model. Stochasticity is captured as probabilities associated with transitions (indicating the likelihood of that transition being taken), as well as noise on the sensed variables.
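The candidate-tracking loop described above can be compressed into the following schematic (plain Python with placeholder callables; not the actual HyDE API or model representation):

```python
# Schematic consistency-based candidate tracking in the spirit of the description above.
# The model, observation handling, and successor generation are illustrative assumptions.
def track_candidates(candidates, observations, predict, consistent, successors):
    """Advance a set of candidate diagnoses over a stream of timestamped observations.

    predict(candidate, t)            -> predicted sensor values at time t
    consistent(predicted, observed)  -> True if the candidate explains the observation
    successors(candidate, observed)  -> new candidates (e.g., fault transitions) that
                                        explain the mismatch
    """
    for t, observed in observations:
        next_candidates = []
        for cand in candidates:
            predicted = predict(cand, t)
            if consistent(predicted, observed):
                next_candidates.append(cand)                         # keep consistent candidate
            else:
                next_candidates.extend(successors(cand, observed))   # branch on possible faults
        candidates = next_candidates
    return candidates
```

In a stochastic setting, each candidate would additionally carry a probability updated from transition likelihoods and sensor noise, with low-probability candidates pruned.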
The Dual-Factor Model of Mental Health: Further Study of the Determinants of Group Differences
ERIC Educational Resources Information Center
Lyons, Michael D.; Huebner, E. Scott; Hills, Kimberly J.; Shinkareva, Svetlana V.
2012-01-01
Consistent with a positive psychology framework, this study examined the contributions of personality, environmental, and perceived social support variables in classifying adolescents using Greenspoon and Saklofske's Dual-Factor model of mental health. This model incorporates information about positive subjective well-being (SWB), along with…
A framework for scalable parameter estimation of gene circuit models using structural information.
Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin
2013-07-01
Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
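The decomposition idea can be illustrated with a toy two-species cascade (hypothetical data, rate laws, and parameter names; not the software at the link above): each gene product's rate equation is fitted separately, with the upstream regulator supplied as an interpolated data trajectory rather than solved jointly:

```python
# Sketch: estimate parameters of one rate equation at a time by treating the
# upstream regulator as a known (interpolated) input. Data and model are toy assumptions.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import interp1d
from scipy.optimize import minimize

t_obs = np.linspace(0, 10, 21)
x_obs = 1.0 - np.exp(-0.5 * t_obs)                   # measured regulator (toy data)
y_obs = 0.8 * (1 - np.exp(-0.3 * t_obs))             # measured target (toy data)
x_interp = interp1d(t_obs, x_obs, fill_value="extrapolate")

def simulate_y(params):
    k_syn, k_deg = params
    rhs = lambda t, y: k_syn * x_interp(t) - k_deg * y   # dy/dt uses x only via the data
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), [0.0], t_eval=t_obs)
    return sol.y[0]

def loss(params):
    return np.sum((simulate_y(params) - y_obs) ** 2)

fit = minimize(loss, x0=[0.1, 0.1], bounds=[(1e-6, 10), (1e-6, 10)], method="L-BFGS-B")
print("estimated k_syn, k_deg:", fit.x)
```

Because the target equation never needs the full coupled system, the same fit can be repeated independently (and in parallel) for every gene in the circuit, which is the source of the scalability claimed above.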
Wen, Wei; Capolungo, Laurent; Patra, Anirban; ...
2017-02-23
In this work, a physics-based thermal creep model is developed based on the understanding of the microstructure in Fe-Cr alloys. This model is associated with a transition state theory based framework that considers the distribution of internal stresses at the sub-material-point level. The thermally activated dislocation glide and climb mechanisms are coupled in the obstacle-bypass processes for both dislocation and precipitate-type barriers. A kinetic law is proposed to track the evolution of dislocation densities in the subgrain interior and in the cell wall. The predicted results show that this model, embedded in the visco-plastic self-consistent (VPSC) framework, captures well the creep behavior in the primary and steady-state stages under various loading conditions. We also discuss the roles of the mechanisms involved.
Left Ventricular Endocardium Tracking by Fusion of Biomechanical and Deformable Models
Gu, Jason
2014-01-01
This paper presents a framework for tracking the left ventricular (LV) endocardium through a 2D echocardiography image sequence. The framework is based on fusion of a biomechanical (BM) model of the heart with a parametric deformable model. The BM model constitutive equation consists of passive and active strain energy functions. The deformations of the LV are obtained by solving the constitutive equations using ABAQUS FEM in each frame of the cardiac cycle. The strain energy functions are defined in two user subroutines for the active and passive phases. An average fusion technique is used to fuse the BM and deformable model contours. Experiments are conducted to verify the detected contours, and the results are evaluated by comparing them to a created gold standard. The results and the evaluation show that the framework has tremendous potential to track and segment the LV through the whole cardiac cycle. PMID:24587814
Knowledge Extraction from Atomically Resolved Images.
Vlcek, Lukas; Maksov, Artem; Pan, Minghu; Vasudevan, Rama K; Kalinin, Sergei V
2017-10-24
Tremendous strides in experimental capabilities of scanning transmission electron microscopy and scanning tunneling microscopy (STM) over the past 30 years have made atomically resolved imaging routine. However, consistent integration and use of atomically resolved data with generative models is unavailable, so information on local thermodynamics and other microscopic driving forces encoded in the observed atomic configurations remains hidden. Here, we present a framework based on statistical distance minimization to consistently utilize the information available from atomic configurations obtained from an atomically resolved image and extract meaningful physical interaction parameters. We illustrate the applicability of the framework on an STM image of a FeSe_xTe_{1-x} superconductor, with the segregation of the chalcogen atoms investigated using a nonideal interacting solid solution model. This universal method makes full use of the microscopic degrees of freedom sampled in an atomically resolved image and can be extended via Bayesian inference toward unbiased model selection with uncertainty quantification.
Data free inference with processed data products
Chowdhary, K.; Najm, H. N.
2014-07-12
Here, we consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the data is unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate consistent data with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors, to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.
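A minimal sketch of the procedure (a toy one-parameter model with Gaussian summary statistics; the maximum-entropy and approximate Bayesian computation machinery of the paper is reduced here to its simplest form) regenerates data sets consistent with a reported mean and standard error, infers the parameter from each, and pools the resulting posteriors:

```python
# Sketch: regenerate data sets consistent with a reported mean +/- standard error,
# infer the parameter from each synthetic set, then pool (average) the posteriors.
# The Gaussian forms and the toy model y = theta are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
reported_mean, reported_sem, n_points = 2.3, 0.2, 10    # published summary statistics
theta_grid = np.linspace(0.0, 5.0, 501)

pooled = np.zeros_like(theta_grid)
n_replicates = 200
for _ in range(n_replicates):
    # Maximum-entropy-consistent synthetic data: Gaussian with the reported moments.
    sigma = reported_sem * np.sqrt(n_points)            # per-point spread implied by the SEM
    data = rng.normal(reported_mean, sigma, size=n_points)
    # Posterior over theta for this synthetic data set (flat prior, Gaussian likelihood).
    loglike = -0.5 * np.sum((data[:, None] - theta_grid[None, :]) ** 2, axis=0) / sigma**2
    post = np.exp(loglike - loglike.max())
    pooled += post / np.trapz(post, theta_grid)

pooled /= n_replicates
print("pooled posterior mean:", np.trapz(theta_grid * pooled, theta_grid))
```

Averaging the per-replicate posteriors gives a single density on the parameter that reflects both the reported statistics and the uncertainty from not having the raw data.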
Modeling of RF/MHD coupling using NIMROD, GENRAY, and the Integrated Plasma Simulator
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Schnack, D. D.; Sovinec, C. R.; Hegna, C. C.; Callen, J. D.; Ebrahimi, F.; Kruger, S. E.; Carlsson, J.; Held, E. D.; Ji, J.-Y.; Harvey, R. W.; Smirnov, A. P.
2009-05-01
We summarize ongoing theoretical/numerical work relevant to the development of a self-consistent framework for the inclusion of RF effects in fluid simulations; specifically considering resistive tearing mode stabilization in tokamak (DIII-D-like) geometry via ECCD. Relatively simple (though non-self-consistent) models for the RF-induced currents are incorporated into the fluid equations, markedly reducing the width of the nonlinearly saturated magnetic islands generated by tearing modes. We report our progress toward the self-consistent modeling of these RF-induced currents. The initial interfacing of the NIMROD* code with the GENRAY/CQL3D** codes (calculating RF propagation and energy/momentum deposition) via the Integrated Plasma Simulator (IPS) framework*** is explained, equilibration of RF-induced currents over the plasma flux surfaces is investigated, and studies exploring the efficient reduction of saturated island widths through time modulation and spatial localization of the ECCD are presented. *[Sovinec et al., JCP 195, 355 (2004)] **[www.compxco.com] ***[This research and the IPS development are both part of the SWIM project. Funded by U.S. DoE.]
An algebra of discrete event processes
NASA Technical Reports Server (NTRS)
Heymann, Michael; Meyer, George
1991-01-01
This report deals with an algebraic framework for modeling and control of discrete event processes. The report consists of two parts. The first part is introductory, and consists of a tutorial survey of the theory of concurrency in the spirit of Hoare's CSP, and an examination of the suitability of such an algebraic framework for dealing with various aspects of discrete event control. To this end a new concurrency operator is introduced and it is shown how the resulting framework can be applied. It is further shown that a suitable theory that deals with the new concurrency operator must be developed. In the second part of the report the formal algebra of discrete event control is developed. At the present time the second part of the report is still an incomplete and occasionally tentative working paper.
A climate robust integrated modelling framework for regional impact assessment of climate change
NASA Astrophysics Data System (ADS)
Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet
2013-04-01
Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on time step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. Special focus in the project was on the role of uncertainty. How valid is the information that is generated by this modelling framework? What are the most important uncertainties of the input data, how do they affect the results of the model chain and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands and what kind of additional information is needed for adequate support on decision making.
Understanding HIV disclosure: A review and application of the Disclosure Processes Model
Chaudoir, Stephenie R.; Fisher, Jeffrey D.; Simoni, Jane M.
2014-01-01
HIV disclosure is a critical component of HIV/AIDS prevention and treatment efforts, yet the field lacks a comprehensive theoretical framework with which to study how HIV-positive individuals make decisions about disclosing their serostatus and how these decisions affect them. Recent theorizing in the context of the Disclosure Processes Model has suggested that the disclosure process consists of antecedent goals, the disclosure event itself, mediating processes and outcomes, and a feedback loop. In this paper, we apply this new theoretical framework to HIV disclosure in order to review the current state of the literature, identify gaps in existing research, and highlight the implications of the framework for future work in this area. PMID:21514708
David M. Bell; Andrew N. Gray
2015-01-01
Models of forest succession provide an appealing conceptual framework for understanding forest dynamics, but uncertainty in the degree to which patterns are regionally consistent might limit the application of successional theory in forest management. Remeasurements of forest inventory networks provide an opportunity to assess this consistency, improving our...
Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos
2017-11-09
Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L2 regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO2 on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin
2017-04-04
While there are a number of frameworks that focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health-care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain Theoretical Domains Framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the chi-square goodness-of-fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
Computational Model for Ethnographically Informed Systems Design
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; James, Anne; Shah, Nazaraf; Terken, Jacques
This paper presents a computational model for ethnographically informed systems design that can support complex and distributed cooperative activities. The model is based on an ethnographic framework consisting of three important dimensions (distributed coordination, awareness of work, and plans and procedures) and the BDI (Belief, Desire and Intention) model of intelligent agents. The ethnographic framework is used to conduct ethnographic analysis and to organise ethnographically driven information into the three dimensions, whereas the BDI model allows such information to be mapped onto the underlying concepts of multi-agent systems. The advantage of this model is that it is built upon an adaptation of existing mature and well-understood techniques. Through the use of this model, we also address the cognitive aspects of systems design.
Kitson, Nicole A; Price, Morgan; Lau, Francis Y; Showler, Grey
2013-10-17
Medication errors are a common type of preventable errors in health care causing unnecessary patient harm, hospitalization, and even fatality. Improving communication between providers and between providers and patients is a key aspect of decreasing medication errors and improving patient safety. Medication management requires extensive collaboration and communication across roles and care settings, which can reduce (or contribute to) medication-related errors. Medication management involves key recurrent activities (determine need, prescribe, dispense, administer, and monitor/evaluate) with information communicated within and between each. Despite its importance, there is a lack of conceptual models that explore medication communication specifically across roles and settings. This research seeks to address that gap. The Circle of Care Modeling (CCM) approach was used to build a model of medication communication activities across the circle of care. CCM positions the patient in the centre of his or her own healthcare system; providers and other roles are then modeled around the patient as a web of relationships. Recurrent medication communication activities were mapped to the medication management framework. The research occurred in three iterations, to test and revise the model: Iteration 1 consisted of a literature review and internal team discussion, Iteration 2 consisted of interviews, observation, and a discussion group at a Community Health Centre, and Iteration 3 consisted of interviews and a discussion group in the larger community. Each iteration provided further detail to the Circle of Care medication communication model. Specific medication communication activities were mapped along each communication pathway between roles and to the medication management framework. We could not map all medication communication activities to the medication management framework; we added Coordinate as a separate and distinct recurrent activity. We saw many examples of coordination activities, for instance, Medical Office Assistants acting as a liaison between pharmacists and family physicians to clarify prescription details. Through the use of CCM we were able to unearth tacitly held knowledge to expand our understanding of medication communication. Drawing out the coordination activities could be a missing piece for us to better understand how to streamline and improve multi-step communication processes with a goal of improving patient safety.
Modeling electrokinetic flows by consistent implicit incompressible smoothed particle hydrodynamics
Pan, Wenxiao; Kim, Kyungjoo; Perego, Mauro; ...
2017-01-03
In this paper, we present a consistent implicit incompressible smoothed particle hydrodynamics (I²SPH) discretization of Navier–Stokes, Poisson–Boltzmann, and advection–diffusion equations subject to Dirichlet or Robin boundary conditions. It is applied to model various two- and three-dimensional electrokinetic flows in simple or complex geometries. The accuracy and convergence of the consistent I²SPH are examined via comparison with analytical solutions, grid-based numerical solutions, or empirical models. Lastly, the new method provides a framework to explore broader applications of SPH in microfluidics and complex fluids with charged objects, such as colloids and biomolecules, in arbitrary complex geometries.
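As a point of reference for readers unfamiliar with SPH, the short Python sketch below evaluates a field at a point using the standard cubic-spline smoothing kernel. It is a generic SPH interpolation example, not the authors' I²SPH discretization, and the particle data, smoothing length, and other parameter values are purely illustrative.

import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic-spline SPH kernel W(r, h); r is particle separation, h the smoothing length."""
    sigma = 1.0 / (np.pi * h**3)                     # 3D normalization constant
    q = np.asarray(r, dtype=float) / h
    w = np.where(q < 1.0, 1.0 - 1.5*q**2 + 0.75*q**3,
        np.where(q < 2.0, 0.25*(2.0 - q)**3, 0.0))
    return sigma * w

# Kernel-weighted (SPH) estimate of a field at a point from neighbouring particles
rng = np.random.default_rng(0)
positions = rng.random((100, 3))                     # neighbour particle positions
values = np.sin(positions[:, 0])                     # field samples carried by the particles
point = np.array([0.5, 0.5, 0.5])
h, mass, density = 0.2, 1.0/100, 1.0                 # toy smoothing length, particle mass, density
r = np.linalg.norm(positions - point, axis=1)
estimate = np.sum(values * (mass/density) * cubic_spline_kernel(r, h))
print(estimate)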
Characterizing the Influence of Hemispheric Transport on Regional Air Pollution
Expansion of the coupled WRF-CMAQ modeling system to hemispheric scales is pursued to enable the development of a robust modeling framework in which the interactions between atmospheric processes occurring at various spatial and temporal scales can be examined in a consistent man...
Although the literature is replete with QSAR models developed for many toxic effects caused by reversible chemical interactions, the development of QSARs for the toxic effects of reactive chemicals lacks a consistent approach. While limitations exist, an appropriate starting-point...
Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil
2014-01-23
Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consisted of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466
A framework and a measurement instrument for sustainability of work practices in long-term care
2011-01-01
Background In health care, many organizations are working on quality improvement and/or innovation of their care practices. Although the effectiveness of improvement processes has been studied extensively, little attention has been given to sustainability of the changed work practices after implementation. The objective of this study is to develop a theoretical framework and measurement instrument for sustainability. To this end sustainability is conceptualized with two dimensions: routinization and institutionalization. Methods The exploratory methodological design consisted of three phases: a) framework development; b) instrument development; and c) field testing in former improvement teams in a quality improvement program for health care (N teams = 63, N individual = 112). Data were not collected until at least one year had passed after implementation. Underlying constructs and their interrelations were explored using Structural Equation Modeling and Principal Component Analyses. Internal consistency was computed with Cronbach's alpha coefficient. A long and a short version of the instrument are proposed. Results The χ²-difference test of the −2 log-likelihood estimates demonstrated that the hierarchical two-factor model with routinization and institutionalization as separate constructs showed a better fit than the one-factor model (p < .01). Secondly, construct validity of the instrument was strong as indicated by the high factor loadings of the items. Finally, the internal consistency of the subscales was good. Conclusions The theoretical framework offers a valuable starting point for the analysis of sustainability on the level of actual changed work practices. Even though the two dimensions routinization and institutionalization are related, they are clearly distinguishable and each has distinct value in the discussion of sustainability. Finally, the subscales conformed to psychometric properties defined in literature. The instrument can be used in the evaluation of improvement projects. PMID:22087884
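For readers who want to reproduce the internal-consistency check reported above, a minimal sketch of Cronbach's alpha on a synthetic items-by-respondents matrix follows; the data and the function name are ours, not the study's instrument.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                   # number of items in the (sub)scale
    item_vars = items.var(axis=0, ddof=1).sum()          # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)            # variance of the summed scale
    return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(112, 1))                       # one latent trait, 112 respondents
scores = latent + 0.5 * rng.normal(size=(112, 6))        # six correlated items
print(round(cronbach_alpha(scores), 2))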
A framework for longitudinal data analysis via shape regression
NASA Astrophysics Data System (ADS)
Fishbaugh, James; Durrleman, Stanley; Piven, Joseph; Gerig, Guido
2012-02-01
Traditional longitudinal analysis begins by extracting desired clinical measurements, such as volume or head circumference, from discrete imaging data. Typically, the continuous evolution of a scalar measurement is estimated by choosing a 1D regression model, such as kernel regression or fitting a polynomial of fixed degree. This type of analysis not only leads to separate models for each measurement, but there is no clear anatomical or biological interpretation to aid in the selection of the appropriate paradigm. In this paper, we propose a consistent framework for the analysis of longitudinal data by estimating the continuous evolution of shape over time as twice differentiable flows of deformations. In contrast to 1D regression models, one model is chosen to realistically capture the growth of anatomical structures. From the continuous evolution of shape, we can simply extract any clinical measurements of interest. We demonstrate on real anatomical surfaces that volume extracted from a continuous shape evolution is consistent with a 1D regression performed on the discrete measurements. We further show how the visualization of shape progression can aid in the search for significant measurements. Finally, we present an example on a shape complex of the brain (left hemisphere, right hemisphere, cerebellum) that demonstrates a potential clinical application for our framework.
Group sparse multiview patch alignment framework with view consistency for image classification.
Gui, Jie; Tao, Dacheng; Sun, Zhenan; Luo, Yong; You, Xinge; Tang, Yuan Yan
2014-07-01
No single feature can satisfactorily characterize the semantic concepts of an image. Multiview learning aims to unify different kinds of features to produce a consensual and efficient representation. This paper redefines part optimization in the patch alignment framework (PAF) and develops a group sparse multiview patch alignment framework (GSM-PAF). The new part optimization considers not only the complementary properties of different views, but also view consistency. In particular, view consistency models the correlations between all possible combinations of any two kinds of view. In contrast to conventional dimensionality reduction algorithms that perform feature extraction and feature selection independently, GSM-PAF enjoys joint feature extraction and feature selection by exploiting the ℓ2,1-norm on the projection matrix to achieve row sparsity, which leads to the simultaneous selection of relevant features and learning transformation, and thus makes the algorithm more discriminative. Experiments on two real-world image data sets demonstrate the effectiveness of GSM-PAF for image classification.
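The row-sparsity penalty referred to above has a compact numerical form; the small sketch below (with a synthetic projection matrix) shows the ℓ2,1-norm computation that GSM-PAF is described as exploiting.

import numpy as np

def l21_norm(W):
    """ℓ2,1-norm: sum over rows of the row-wise Euclidean norms.
    Penalizing it drives entire rows of W to zero, i.e. joint feature selection."""
    return np.linalg.norm(W, axis=1).sum()

rng = np.random.default_rng(1)
W = rng.normal(size=(10, 3))
W[::2] = 0.0                      # a row-sparse projection: half of the features unused
print(l21_norm(W))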
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.
2013-11-01
Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate changes. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive efforts are required to modify modeling structures and to rewrite computer programs to incorporate new or updated processes as new knowledge is being generated, (2) computational cost is prohibitively expensive to simulate biogeochemical processes in land models due to large variations in the rates of biogeochemical processes, and (3) various mathematical representations of biogeochemical processes exist to incorporate different aspects of fundamental mechanisms, but systematic evaluation of the different mathematical representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module, Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and reaction database so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database consists of processes of nutrient flow through the terrestrial ecosystems in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM. To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
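The sketch below illustrates, in schematic Python rather than the NGBGC code itself, the general idea of assembling ODE right-hand sides from a reaction database so that new processes can be added without hand-writing the equations; the pools, reactions, and rate constants are invented for illustration.

import numpy as np
from scipy.integrate import solve_ivp

pools = ["litter_C", "soil_C", "CO2"]
# Reaction database: each entry holds a stoichiometry vector and a rate law.
reactions = [
    {"stoich": {"litter_C": -1, "soil_C": +0.7, "CO2": +0.3},   # litter decomposition
     "rate": lambda y: 0.05 * y["litter_C"]},
    {"stoich": {"soil_C": -1, "CO2": +1},                        # soil C mineralization
     "rate": lambda y: 0.01 * y["soil_C"]},
]

def rhs(t, y_vec):
    y = dict(zip(pools, y_vec))
    dydt = np.zeros(len(pools))
    for rxn in reactions:                      # assemble dy/dt generically from the database
        r = rxn["rate"](y)
        for pool, coeff in rxn["stoich"].items():
            dydt[pools.index(pool)] += coeff * r
    return dydt

sol = solve_ivp(rhs, (0.0, 365.0), [100.0, 500.0, 0.0], rtol=1e-8)
print(sol.y[:, -1])   # pool sizes after one year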
Genetic Algorithm-Based Model Order Reduction of Aeroservoelastic Systems with Consistent States
NASA Technical Reports Server (NTRS)
Zhu, Jin; Wang, Yi; Pant, Kapil; Suh, Peter M.; Brenner, Martin J.
2017-01-01
This paper presents a model order reduction framework to construct linear parameter-varying reduced-order models of flexible aircraft for aeroservoelasticity analysis and control synthesis in broad two-dimensional flight parameter space. Genetic algorithms are used to automatically determine physical states for reduction and to generate reduced-order models at grid points within parameter space while minimizing the trial-and-error process. In addition, balanced truncation for unstable systems is used in conjunction with the congruence transformation technique to achieve locally optimal realization and weak fulfillment of state consistency across the entire parameter space. Therefore, aeroservoelasticity reduced-order models at any flight condition can be obtained simply through model interpolation. The methodology is applied to the pitch-plant model of the X-56A Multi-Use Technology Testbed currently being tested at NASA Armstrong Flight Research Center for flutter suppression and gust load alleviation. The present studies indicate that the reduced-order model with more than 12× reduction in the number of states relative to the original model is able to accurately predict system response among all input-output channels. The genetic-algorithm-guided approach exceeds manual and empirical state selection in terms of efficiency and accuracy. The interpolated aeroservoelasticity reduced order models exhibit smooth pole transition and continuously varying gains along a set of prescribed flight conditions, which verifies consistent state representation obtained by congruence transformation. The present model order reduction framework can be used by control engineers for robust aeroservoelasticity controller synthesis and novel vehicle design.
Page, M. P. A.; Norris, D.
2009-01-01
We briefly review the considerable evidence for a common ordering mechanism underlying both immediate serial recall (ISR) tasks (e.g. digit span, non-word repetition) and the learning of phonological word forms. In addition, we discuss how recent work on the Hebb repetition effect is consistent with the idea that learning in this task is itself a laboratory analogue of the sequence-learning component of phonological word-form learning. In this light, we present a unifying modelling framework that seeks to account for ISR and Hebb repetition effects, while being extensible to word-form learning. Because word-form learning is performed in the service of later word recognition, our modelling framework also subsumes a mechanism for word recognition from continuous speech. Simulations of a computational implementation of the modelling framework are presented and are shown to be in accordance with data from the Hebb repetition paradigm. PMID:19933143
An evaluation of bias in propensity score-adjusted non-linear regression models.
Wan, Fei; Mitra, Nandita
2018-03-01
Propensity score methods are commonly used to adjust for observed confounding when estimating the conditional treatment effect in observational studies. One popular method, covariate adjustment of the propensity score in a regression model, has been empirically shown to be biased in non-linear models. However, no compelling underlying theoretical reason has been presented. We propose a new framework to investigate bias and consistency of propensity score-adjusted treatment effects in non-linear models that uses a simple geometric approach to forge a link between the consistency of the propensity score estimator and the collapsibility of non-linear models. Under this framework, we demonstrate that adjustment of the propensity score in an outcome model results in the decomposition of observed covariates into the propensity score and a remainder term. Omission of this remainder term from a non-collapsible regression model leads to biased estimates of the conditional odds ratio and conditional hazard ratio, but not for the conditional rate ratio. We further show, via simulation studies, that the bias in these propensity score-adjusted estimators increases with larger treatment effect size, larger covariate effects, and increasing dissimilarity between the coefficients of the covariates in the treatment model versus the outcome model.
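A hedged simulation sketch of the phenomenon described above: a binary outcome is generated from a logistic model with one confounder, the outcome model then adjusts for the estimated propensity score, and the treatment coefficient can be compared with the true conditional log-odds ratio. This is a generic illustration of covariate adjustment of the propensity score, not the authors' geometric framework; all parameter values are ours.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                                   # observed confounder
t = rng.binomial(1, 1 / (1 + np.exp(-0.8 * x)))          # treatment assignment model
beta_t, beta_x = 1.0, 1.5                                # true conditional log-odds ratio = 1.0
p_out = 1 / (1 + np.exp(-(-1.0 + beta_t * t + beta_x * x)))
y = rng.binomial(1, p_out)

# Estimate the propensity score, then adjust for it (instead of x) in the outcome model
ps = sm.Logit(t, sm.add_constant(x)).fit(disp=0).predict(sm.add_constant(x))
fit = sm.Logit(y, sm.add_constant(np.column_stack([t, ps]))).fit(disp=0)
print("true conditional log-OR:", beta_t, " PS-adjusted estimate:", round(fit.params[1], 3))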
NASA Astrophysics Data System (ADS)
Chavali, Raghu Vamsi Krishna
The large-scale deployment of PV technology is very sensitive to the material and process costs. There are several potential candidates among p-n heterojunction (HJ) solar cells competing for higher efficiencies at lower material and process costs. These systems are, however, generally complex, involve diverse materials, and are not well understood. The direct translation of classical p-n homojunction theory to p-n HJ cells may not always be self-consistent and can lead, therefore, to misinterpretation of experimental results. Ultimately, this translation may not be useful for modeling and characterization of these solar cells. Hence, there is a strong need to redefine/reinterpret the modeling/characterization methodologies for HJ solar cells to produce a self-consistent framework for optimizing HJ solar cell designs. Towards this goal, we explore the physics and interpret characterization experiments of p-n HJs using silicon HJ (HIT) solar cells. We will: (1) identify the key HJ properties that affect the cell efficiency; (2) analyze the dependence of key HJ properties on the carrier transport under light and dark conditions; (3) provide a self-consistent multi-probe approach to extract the HJ parameters using several characterization techniques including dark I-V, light I-V, C-V, impedance spectroscopy, and Suns-Voc; (4) propose design guidelines to address the HJ bottlenecks of HIT cells; and (5) develop a process-to-module modeling framework to establish the module performance limits. The guidelines resulting from this multi-scale and self-consistent framework can be used to improve performance of HIT cells as well as other HJ-based solar cells.
NASA Astrophysics Data System (ADS)
Tucker, Gregory E.; Lancaster, Stephen T.; Gasparini, Nicole M.; Bras, Rafael L.; Rybarczyk, Scott M.
2001-10-01
We describe a new set of data structures and algorithms for dynamic terrain modeling using triangulated irregular networks (TINs). The framework provides an efficient method for storing, accessing, and updating a Delaunay triangulation and its associated Voronoi diagram. The basic data structure consists of three interconnected data objects: triangles, nodes, and directed edges. Encapsulating each of these geometric elements within a data object makes it possible to essentially decouple the TIN representation from the modeling applications that make use of it. Both the triangulation and its corresponding Voronoi diagram can be rapidly retrieved or updated, making these methods well suited to adaptive remeshing schemes. We develop a set of algorithms for defining drainage networks and identifying closed depressions (e.g., lakes) for hydrologic and geomorphic modeling applications. We also outline simple numerical algorithms for solving network routing and 2D transport equations within the TIN framework. The methods are illustrated with two example applications: a landscape evolution model and a distributed rainfall-runoff model.
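A schematic Python sketch of the three interconnected data objects described above (triangles, nodes, and directed edges). The field names and the half-edge-style linkage are illustrative assumptions, not the authors' exact implementation.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    x: float
    y: float
    z: float                                  # elevation
    edge: Optional["Edge"] = None             # one outgoing directed edge (entry into the edge ring)

@dataclass
class Edge:                                   # directed (half) edge
    origin: Node
    dest: Node
    twin: Optional["Edge"] = None             # the oppositely directed edge
    left_triangle: Optional["Triangle"] = None

@dataclass
class Triangle:
    vertices: List[Node] = field(default_factory=list)                    # three nodes, counter-clockwise
    neighbors: List[Optional["Triangle"]] = field(default_factory=list)   # adjacent triangles

# Decoupling the TIN representation this way lets remeshing update triangles and edges locally,
# while drainage-network algorithms traverse only nodes and directed edges.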
Integrated modeling applications for tokamak experiments with OMFIT
NASA Astrophysics Data System (ADS)
Meneghini, O.; Smith, S. P.; Lao, L. L.; Izacard, O.; Ren, Q.; Park, J. M.; Candy, J.; Wang, Z.; Luna, C. J.; Izzo, V. A.; Grierson, B. A.; Snyder, P. B.; Holland, C.; Penna, J.; Lu, G.; Raum, P.; McCubbin, A.; Orlov, D. M.; Belli, E. A.; Ferraro, N. M.; Prater, R.; Osborne, T. H.; Turnbull, A. D.; Staebler, G. M.
2015-08-01
One Modeling Framework for Integrated Tasks (OMFIT) is a comprehensive integrated modeling framework which has been developed to enable physics codes to interact in complicated workflows, and support scientists at all stages of the modeling cycle. The OMFIT development follows a unique bottom-up approach, where the framework design and capabilities organically evolve to support progressive integration of the components that are required to accomplish physics goals of increasing complexity. OMFIT provides a workflow for easily generating full kinetic equilibrium reconstructions that are constrained by magnetic and motional Stark effect measurements, and kinetic profile information that includes fast-ion pressure modeled by a transport code. It was found that magnetic measurements can be used to quantify the amount of anomalous fast-ion diffusion that is present in DIII-D discharges, and provide an estimate that is consistent with what would be needed for transport simulations to match the measured neutron rates. OMFIT was used to streamline edge-stability analyses, and evaluate the effect of resonant magnetic perturbation (RMP) on the pedestal stability, which was found to be consistent with the experimental observations. The development of a five-dimensional numerical fluid model for estimating the effects of the interaction between magnetohydrodynamic (MHD) and microturbulence, and its systematic verification against analytic models was also supported by the framework. OMFIT was used for optimizing an innovative high-harmonic fast wave system proposed for DIII-D. For a parallel refractive index n∥ > 3, the conditions for strong electron-Landau damping were found to be independent of launched n∥ and poloidal angle. OMFIT has been the platform of choice for developing a neural-network based approach to efficiently perform a non-linear multivariate regression of local transport fluxes as a function of local dimensionless parameters. Transport predictions for thousands of DIII-D discharges showed excellent agreement with the power balance calculations across the whole plasma radius and over a broad range of operating regimes. Concerning predictive transport simulations, the framework made possible the design and automation of a workflow that enables self-consistent predictions of kinetic profiles and the plasma equilibrium. It is found that the feedback between the transport fluxes and plasma equilibrium can significantly affect the kinetic profile predictions. Such a rich set of results provides tangible evidence of how bottom-up approaches can potentially provide a fast track to integrated modeling solutions that are functional, cost-effective, and in sync with the research effort of the community.
Nilsen, Vegard; Wyller, John
2016-01-01
Dose-response models are essential to quantitative microbial risk assessment (QMRA), providing a link between levels of human exposure to pathogens and the probability of negative health outcomes. In drinking water studies, the class of semi-mechanistic models known as single-hit models, such as the exponential and the exact beta-Poisson, has seen widespread use. In this work, an attempt is made to carefully develop the general mathematical single-hit framework while explicitly accounting for variation in (1) host susceptibility and (2) pathogen infectivity. This allows a precise interpretation of the so-called single-hit probability and precise identification of a set of statistical independence assumptions that are sufficient to arrive at single-hit models. Further analysis of the model framework is facilitated by formulating the single-hit models compactly using probability generating and moment generating functions. Among the more practically relevant conclusions drawn are: (1) for any dose distribution, variation in host susceptibility always reduces the single-hit risk compared to a constant host susceptibility (assuming equal mean susceptibilities), (2) the model-consistent representation of complete host immunity is formally demonstrated to be a simple scaling of the response, (3) the model-consistent expression for the total risk from repeated exposures deviates (gives lower risk) from the conventional expression used in applications, and (4) a model-consistent expression for the mean per-exposure dose that produces the correct total risk from repeated exposures is developed. © 2016 Society for Risk Analysis.
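For concreteness, the two single-hit dose-response forms named above have simple closed forms; the sketch below implements the exponential model and the widely used approximation to the beta-Poisson model, with illustrative parameter values.

import numpy as np

def exponential_dr(dose, r):
    """Exponential single-hit model: each organism independently has hit probability r."""
    return 1.0 - np.exp(-r * dose)

def beta_poisson_approx(dose, alpha, beta):
    """Commonly used approximation to the exact beta-Poisson single-hit model."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.logspace(-1, 4, 6)                 # mean doses per exposure (organisms)
print(exponential_dr(doses, r=0.005))
print(beta_poisson_approx(doses, alpha=0.2, beta=50.0))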
NASA Astrophysics Data System (ADS)
Burlatsky, S. F.; Gummalla, M.; O'Neill, J.; Atrazhev, V. V.; Varyukhin, A. N.; Dmitriev, D. V.; Erikhman, N. S.
2012-10-01
Under typical polymer electrolyte membrane fuel cell (PEMFC) operating conditions, part of the membrane electrode assembly is subjected to humidity cycling due to variation of inlet gas RH and/or flow rate. Cyclic membrane hydration/dehydration would cause cyclic swelling/shrinking of the unconstrained membrane. In a constrained membrane, it causes cyclic stress resulting in mechanical failure in the area adjacent to the gas inlet. A mathematical modeling framework for prediction of the lifetime of a PEMFC membrane subjected to hydration cycling is developed in this paper. The model predicts membrane lifetime as a function of RH cycling amplitude and membrane mechanical properties. The modeling framework consists of three model components: a fuel cell RH distribution model, a hydration/dehydration induced stress model that predicts stress distribution in the membrane, and a damage accrual model that predicts membrane lifetime. Short descriptions of the model components along with the overall framework are presented in the paper. The model was used for lifetime prediction of a GORE-SELECT membrane.
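The damage-accrual component can be illustrated with a deliberately simplified sketch: a hypothetical power-law fatigue curve combined with Miner-type linear damage accumulation over RH cycles. This is not the authors' model; the fatigue-curve form, its parameters, and the RH-to-stress mapping are assumptions for illustration only.

import numpy as np

def cycles_to_failure(stress_amplitude, A=5.0e3, m=2.5):
    """Hypothetical power-law fatigue curve N_f = A * sigma^(-m) (parameters illustrative)."""
    return A * stress_amplitude ** (-m)

def lifetime_in_cycles(stress_per_cycle):
    """Miner-type linear damage accumulation: failure when the summed damage reaches 1."""
    damage = np.cumsum(1.0 / cycles_to_failure(np.asarray(stress_per_cycle, dtype=float)))
    first_failure = np.argmax(damage >= 1.0)
    return first_failure + 1 if damage[-1] >= 1.0 else None   # None: no failure in the given history

rh_amplitude = np.full(20000, 0.4)            # uniform RH cycling amplitude (fraction)
stress = 2.0 * rh_amplitude                   # toy linear mapping from RH swing to membrane stress
print(lifetime_in_cycles(stress))             # predicted number of cycles to failure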
Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
A Classroom Entry and Exit Game of Supply with Price-Taking Firms
ERIC Educational Resources Information Center
Cheung, Stephen L.
2005-01-01
The author describes a classroom game demonstrating the process of adjustment to long-run equilibrium in a market consisting of price-taking firms. This game unites and extends key insights from several simpler games in a framework more consistent with the standard textbook model of a competitive industry. Because firms have increasing marginal…
Consistency and Inconsistency in A Level Students' Understandings of a Number of Chemical Reactions.
ERIC Educational Resources Information Center
Kwen, Boo Hong
1996-01-01
Explores A level students' conceptions of some common chemical reactions. Findings indicate that students apply frameworks consistently across groups of events that they perceive to be similar. What was found to be lacking was the scientists' view of all the reactions being regarded as realizations of the same underlying conceptual model. Contains…
ERIC Educational Resources Information Center
Rikkerink, Marleen; Verbeeten, Henk; Simons, Robert-Jan; Ritzen, Henk
2016-01-01
This study presents the development process of a new model of educational innovation, that involves the use of digital technologies. The model is based on a broad theoretical framework together with research involving this long-term case study. The backbone of the model consists of a fundamental revision of a multi-level Organizational Learning…
Managing changes in the enterprise architecture modelling context
NASA Astrophysics Data System (ADS)
Khanh Dam, Hoa; Lê, Lam-Son; Ghose, Aditya
2016-07-01
Enterprise architecture (EA) models the whole enterprise in various aspects regarding both business processes and information technology resources. As the organisation grows, the architecture of its systems and processes must also evolve to meet the demands of the business environment. Evolving an EA model may involve making changes to various components across different levels of the EA. As a result, an important issue before making a change to an EA model is assessing the ripple effect of the change, i.e. change impact analysis. Another critical issue is change propagation: given a set of primary changes that have been made to the EA model, what additional secondary changes are needed to maintain consistency across multiple levels of the EA. There has been however limited work on supporting the maintenance and evolution of EA models. This article proposes an EA description language, namely ChangeAwareHierarchicalEA, integrated with an evolution framework to support both change impact analysis and change propagation within an EA model. The core part of our framework is a technique for computing the impact of a change and a new method for generating interactive repair plans from Alloy consistency rules that constrain the EA model.
Modelling Framework and Assistive Device for Peripheral Intravenous Injections
NASA Astrophysics Data System (ADS)
Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar
2016-02-01
Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and did not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.
A Cosserat crystal plasticity and phase field theory for grain boundary migration
NASA Astrophysics Data System (ADS)
Ask, Anna; Forest, Samuel; Appolaire, Benoit; Ammar, Kais; Salman, Oguz Umut
2018-06-01
The microstructure evolution due to thermomechanical treatment of metals can largely be described by viscoplastic deformation, nucleation and grain growth. These processes take place over different length and time scales which present significant challenges when formulating simulation models. In particular, no overall unified field framework exists to model concurrent viscoplastic deformation and recrystallization and grain growth in metal polycrystals. In this work a thermodynamically consistent diffuse interface framework incorporating crystal viscoplasticity and grain boundary migration is elaborated. The Kobayashi-Warren-Carter (KWC) phase field model is extended to incorporate the full mechanical coupling with material and lattice rotations and evolution of dislocation densities. The Cosserat crystal plasticity theory is shown to be the appropriate framework to formulate the coupling between phase field and mechanics with proper distinction between bulk and grain boundary behaviour.
ERIC Educational Resources Information Center
Rijmen, Frank; Jeon, Minjeong; von Davier, Matthias; Rabe-Hesketh, Sophia
2014-01-01
Second-order item response theory models have been used for assessments consisting of several domains, such as content areas. We extend the second-order model to a third-order model for assessments that include subdomains nested in domains. Using a graphical model framework, it is shown how the model does not suffer from the curse of…
Schiavazzi, Daniele E.; Baretta, Alessia; Pennati, Giancarlo; Hsia, Tain-Yen; Marsden, Alison L.
2017-01-01
Summary Computational models of cardiovascular physiology can inform clinical decision-making, providing a physically consistent framework to assess vascular pressures and flow distributions, and aiding in treatment planning. In particular, lumped parameter network (LPN) models that make an analogy to electrical circuits offer a fast and surprisingly realistic method to reproduce the circulatory physiology. The complexity of LPN models can vary significantly to account, for example, for cardiac and valve function, respiration, autoregulation, and time-dependent hemodynamics. More complex models provide insight into detailed physiological mechanisms, but their utility is maximized if one can quickly identify patient specific parameters. The clinical utility of LPN models with many parameters will be greatly enhanced by automated parameter identification, particularly if parameter tuning can match non-invasively obtained clinical data. We present a framework for automated tuning of 0D lumped model parameters to match clinical data. We demonstrate the utility of this framework through application to single ventricle pediatric patients with Norwood physiology. Through a combination of local identifiability, Bayesian estimation and maximum a posteriori simplex optimization, we show the ability to automatically determine physiologically consistent point estimates of the parameters and to quantify uncertainty induced by errors and assumptions in the collected clinical data. We show that multi-level estimation, that is, updating the parameter prior information through sub-model analysis, can lead to a significant reduction in the parameter marginal posterior variance. We first consider virtual patient conditions, with clinical targets generated through model solutions, and second application to a cohort of four single-ventricle patients with Norwood physiology. PMID:27155892
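A minimal example of the electrical-circuit analogy underlying LPN models: a two-element Windkessel (one resistance, one compliance) driven by a toy inflow waveform. This is a generic textbook circuit, not the authors' patient-specific network, and the parameter values are illustrative.

import numpy as np
from scipy.integrate import solve_ivp

R, C = 1.0, 1.5            # distal resistance (mmHg·s/mL) and compliance (mL/mmHg), illustrative
T = 0.8                    # cardiac period (s)

def inflow(t):
    """Half-sine systolic inflow, zero in diastole (toy waveform, mL/s)."""
    tc = t % T
    return 300.0 * np.sin(np.pi * tc / 0.3) if tc < 0.3 else 0.0

def windkessel(t, p):
    # Two-element Windkessel: C dP/dt = Q_in(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

sol = solve_ivp(windkessel, (0.0, 10 * T), [80.0], max_step=1e-3)
last_beat = sol.y[0][sol.t > 9 * T]
print("pressure range over the last beat:", last_beat.min(), last_beat.max())

In a tuning framework like the one described above, R and C would be the unknown parameters adjusted until simulated pressures and flows match clinical targets.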
NASA Astrophysics Data System (ADS)
Luscher, Darby J.; Bronkhorst, Curt A.; Alleman, Coleman N.; Addessio, Francis L.
2013-09-01
A physically consistent framework for combining pressure-volume-temperature equations of state with crystal plasticity models is developed for the application of modeling the response of single and polycrystals under shock conditions. The particular model is developed for copper, thus the approach focuses on crystals of cubic symmetry although many of the concepts in the approach are applicable to crystals of lower symmetry. We employ a multiplicative decomposition of the deformation gradient into isochoric elastic, thermoelastic dilation, and plastic parts leading to a definition of isochoric elastic Green-Lagrange strain. This finite deformation kinematic decomposition enables a decomposition of Helmholtz free-energy into terms reflecting dilatational thermoelasticity, strain energy due to long-range isochoric elastic deformation of the lattice and a term reflecting energy stored in short range elastic lattice deformation due to evolving defect structures. A model for the single crystal response of copper is implemented consistent with the framework into a three-dimensional Lagrangian finite element code. Simulations exhibit favorable agreement with single and bicrystal experimental data for shock pressures ranging from 3 to 110 GPa.
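In equation form, the kinematic split and free-energy decomposition described above can be summarized as follows; the ordering of the factors and the symbols are our notational assumptions, intended only to make the structure explicit.

\mathbf{F} = \bar{\mathbf{F}}^{e}\,\mathbf{F}^{\theta}\,\mathbf{F}^{p},
\qquad \det\bar{\mathbf{F}}^{e} = 1,
\qquad \bar{\mathbf{E}}^{e} = \tfrac{1}{2}\left(\bar{\mathbf{F}}^{e\,\mathrm{T}}\bar{\mathbf{F}}^{e} - \mathbf{I}\right),

\psi = \psi_{\mathrm{EOS}}\!\left(J^{\theta}, T\right) + \psi_{\mathrm{iso}}\!\left(\bar{\mathbf{E}}^{e}\right) + \psi_{\mathrm{defect}}(\xi),

where \bar{\mathbf{F}}^{e} is the isochoric elastic part, \mathbf{F}^{\theta} the thermoelastic dilation, \mathbf{F}^{p} the plastic part, J^{\theta} = \det\mathbf{F}^{\theta}, T the temperature, and \xi internal variables describing the evolving defect structure.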
A constitutive model for magnetostriction based on thermodynamic framework
NASA Astrophysics Data System (ADS)
Ho, Kwangsoo
2016-08-01
This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
Threat driven modeling framework using petri nets for e-learning system.
Khamparia, Aditya; Pandey, Babita
2016-01-01
Vulnerabilities at various levels are a main cause of security risks in e-learning systems. This paper presents a modified threat-driven modeling framework to identify, after risk assessment, the threats that require mitigation and how to mitigate them. Aspect-oriented stochastic Petri nets are used to model those threat mitigations. The paper includes security metrics based on vulnerabilities present in the e-learning system. The Common Vulnerability Scoring System, designed to provide a normalized method for rating vulnerabilities, is used as the basis for the metric definitions and calculations. A case study is also presented that shows the need for and feasibility of using aspect-oriented stochastic Petri net models for threat modeling, which improves the reliability, consistency, and robustness of the e-learning system.
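To make the stochastic Petri net idea concrete, here is a minimal race-condition firing loop in Python; the places, transitions, and rates are invented for illustration and are not the e-learning threat model of the paper.

import random

# Places hold tokens; a transition fires after an exponentially distributed delay
# whenever all of its input places are marked.
places = {"vulnerability_present": 1, "threat_detected": 0, "threat_mitigated": 0}
transitions = [
    {"name": "detect",   "rate": 2.0, "inputs": ["vulnerability_present"], "outputs": ["threat_detected"]},
    {"name": "mitigate", "rate": 1.0, "inputs": ["threat_detected"],       "outputs": ["threat_mitigated"]},
]

random.seed(0)
t = 0.0
while True:
    enabled = [tr for tr in transitions if all(places[p] > 0 for p in tr["inputs"])]
    if not enabled:
        break
    # Race between enabled transitions: sample exponential delays, fire the earliest
    delays = [(random.expovariate(tr["rate"]), tr) for tr in enabled]
    dt, tr = min(delays, key=lambda d: d[0])
    t += dt
    for p in tr["inputs"]:
        places[p] -= 1
    for p in tr["outputs"]:
        places[p] += 1
    print(f"t={t:.2f}: fired {tr['name']}, marking={places}")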
Abbass-Dick, Jennifer; Dennis, Cindy-Lee
Targeting mothers and fathers in breast-feeding promotion programs is recommended as research has found that father's support positively impacts breast-feeding duration and exclusivity. Breast-feeding coparenting refers to the manner in which parents work together to achieve their breast-feeding goals. The Breast-feeding Coparenting Framework was developed on the basis of diverse coparenting models and research related to father's involvement with breast-feeding. This framework consists of 5 components: joint breast-feeding goal setting, shared breast-feeding responsibility, proactive breast-feeding support, father's/partner's parental-child interactions, and productive communication and problem solving. This framework may be of value to policy makers and program providers working to improve breast-feeding outcomes.
NASA Astrophysics Data System (ADS)
Fuhrmann, Tamar; Schneider, Bertrand; Blikstein, Paulo
2018-05-01
The Bifocal Modelling Framework (BMF) is an approach for science learning which links students' physical experimentation with computer modelling in real time, focusing on the comparison of the two media. In this paper, we explore how a Bifocal Modelling implementation supported learning outcomes related to both content and metamodeling knowledge, focusing on the role of designing models. Our study consisted of three conditions implemented with a total of 69 9th grade high-school students. The first and second classes were assigned two implementation modes of BMF: with and without a model design module. The third condition, employed as a control, consisted of a class that received instruction in the school's traditional approach. Our results indicate that students participating in both BMF implementations demonstrated improved content knowledge and a better understanding of metamodeling. However, only the 'BMF-with-design' group improved significantly in both content and metamodeling knowledge. Our qualitative analyses indicate that both BMF groups designed detailed models that included scientific explanations. However only students who engaged in the model design component: (1) completed a detailed model displaying molecular interaction; and (2) developed a critical perspective about models. We discuss the implications of those results for teaching scientific science concepts and metamodeling knowledge.
Knebl, M R; Yang, Z-L; Hutchison, K; Maidment, D R
2005-06-01
This paper develops a framework for regional scale flood modeling that integrates NEXRAD Level III rainfall, GIS, and a hydrological model (HEC-HMS/RAS). The San Antonio River Basin (about 4000 square miles, 10,000 km2) in Central Texas, USA, is the domain of the study because it is a region subject to frequent occurrences of severe flash flooding. A major flood in the summer of 2002 is chosen as a case to examine the modeling framework. The model consists of a rainfall-runoff model (HEC-HMS) that converts precipitation excess to overland flow and channel runoff, as well as a hydraulic model (HEC-RAS) that models unsteady state flow through the river channel network based on the HEC-HMS-derived hydrographs. HEC-HMS is run on a 4 x 4 km grid in the domain, a resolution consistent with the resolution of NEXRAD rainfall taken from the local river authority. Watershed parameters are calibrated manually to produce a good simulation of discharge at 12 subbasins. With the calibrated discharge, HEC-RAS is capable of producing floodplain polygons that are comparable to the satellite imagery. The modeling framework presented in this study incorporates a portion of the recently developed GIS tool named Map to Map that has been created on a local scale and extends it to a regional scale. The results of this research will benefit future modeling efforts by providing a tool for hydrological forecasts of flooding on a regional scale. While designed for the San Antonio River Basin, this regional scale model may be used as a prototype for model applications in other areas of the country.
NASA Astrophysics Data System (ADS)
Lyu, Pin; Chen, Wenli; Li, Hui; Shen, Lian
2017-11-01
In recent studies, Yang, Meneveau & Shen (Physics of Fluids, 2014; Renewable Energy, 2014) developed a hybrid numerical framework for simulation of offshore wind farm. The framework consists of simulation of nonlinear surface waves using a high-order spectral method, large-eddy simulation of wind turbulence on a wave-surface-fitted curvilinear grid, and an actuator disk model for wind turbines. In the present study, several more precise wind turbine models, including the actuator line model, actuator disk model with rotation, and nacelle model, are introduced into the computation. Besides offshore wind turbines on fixed piles, the new computational framework has the capability to investigate the interaction among wind, waves, and floating wind turbines. In this study, onshore, offshore fixed pile, and offshore floating wind farms are compared in terms of flow field statistics and wind turbine power extraction rate. The authors gratefully acknowledge financial support from China Scholarship Council (No. 201606120186) and the Institute on the Environment of University of Minnesota.
NASA Astrophysics Data System (ADS)
Sarofim, M. C.; Martinich, J.; Waldhoff, S.; DeAngelo, B. J.; McFarland, J.; Jantarasami, L.; Shouse, K.; Crimmins, A.; Li, J.
2014-12-01
The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the physical impacts, economic damages, and risks from climate change. The primary goal of this framework is to estimate the degree to which climate change impacts and damages in the United States are avoided or reduced in the 21st century under multiple greenhouse gas (GHG) emissions mitigation scenarios. The first phase of the CIRA project is a modeling exercise that included two integrated assessment models and 15 sectoral models encompassing five broad impacts sectors: water resources, electric power, infrastructure, human health, and ecosystems. Three consistent socioeconomic and climate scenarios are used to analyze the benefits of global GHG mitigation targets: a reference scenario and two policy scenarios with total radiative forcing targets in 2100 of 4.5 W/m2 and 3.7 W/m2. In this exercise, the implications of key uncertainties are explored, including climate sensitivity, climate model, natural variability, and model structures and parameters. This presentation describes the motivations and goals of the CIRA project; the design and academic contribution of the first CIRA modeling exercise; and briefly summarizes several papers published in a special issue of Climatic Change. The results across impact sectors show that GHG mitigation provides benefits to the United States that increase over time, the effects of climate change can be strongly influenced by near-term policy choices, adaptation can reduce net damages, and impacts exhibit spatial and temporal patterns that may inform mitigation and adaptation policy discussions.
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, which requires careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis and establish optimum measurement wavelengths, as well as to identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
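The retrieval principle behind such simulations can be sketched with the basic two-wavelength (on/off line) differential-absorption relation; the numbers, the cross-section value, and the function below are hypothetical illustrations and are not part of the LaRC framework.

import numpy as np

def differential_optical_depth(p_on, p_off, p_on_ref, p_off_ref):
    """Two-way differential optical depth from on/off-line return powers,
    each normalized by its transmitted (reference) energy."""
    return 0.5 * np.log((p_off / p_off_ref) / (p_on / p_on_ref))

# Toy numbers (illustrative only): stronger absorption on-line than off-line
dod = differential_optical_depth(p_on=0.45, p_off=1.0, p_on_ref=1.0, p_off_ref=1.0)

# With a known differential absorption cross-section and dry-air column,
# the column-averaged mixing ratio scales linearly with this optical depth.
sigma_diff, column_air = 5.0e-23, 2.1e25      # cm^2 and molecules/cm^2, hypothetical values
xco2_ppm = dod / (sigma_diff * column_air) * 1e6
print(round(dod, 3), round(xco2_ppm, 1))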
Two classes of ODE models with switch-like behavior.
Just, Winfried; Korb, Mason; Elbert, Ben; Young, Todd
2013-12-01
In cases where the same real-world system can be modeled both by an ODE system ⅅ and a Boolean system 𝔹, it is of interest to identify conditions under which the two systems will be consistent, that is, will make qualitatively equivalent predictions. In this note we introduce two broad classes of relatively simple models that provide a convenient framework for studying such questions. In contrast to the widely known class of Glass networks, the right-hand sides of our ODEs are Lipschitz-continuous. We prove that if 𝔹 has certain structures, consistency between ⅅ and 𝔹 is implied by sufficient separation of time scales in one class of our models. Namely, if the trajectories of 𝔹 are "one-stepping" then we prove a strong form of consistency and if 𝔹 has a certain monotonicity property then there is a weaker consistency between ⅅ and 𝔹. These results appear to point to more general structure properties that favor consistency between ODE and Boolean models.
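A toy example of the general situation (not the model classes introduced in the paper): a two-gene mutual-repression system written both as a Lipschitz-continuous ODE with a steep Hill-type switch and as the corresponding Boolean update, which here agree on the fixed point.

import numpy as np
from scipy.integrate import solve_ivp

def hill_repression(x, theta=0.5, n=20):
    """Steep but Lipschitz-continuous switch (Hill function), unlike Glass-type step nonlinearities."""
    return theta**n / (theta**n + x**n)

def ode(t, y):
    x1, x2 = y
    return [hill_repression(x2) - x1,      # x1 is repressed by x2
            hill_repression(x1) - x2]      # x2 is repressed by x1

def boolean_step(state):
    b1, b2 = state
    return (not b2, not b1)                # Boolean abstraction of the mutual repression

sol = solve_ivp(ode, (0, 50), [0.9, 0.1], rtol=1e-8)
print("ODE steady state:", np.round(sol.y[:, -1], 2))              # expected near (1, 0)
print("Boolean fixed point from (True, False):", boolean_step((True, False)))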
NASA Technical Reports Server (NTRS)
Agena, S. M.; Pusey, M. L.; Bogle, I. D.
1999-01-01
A thermodynamic framework (UNIQUAC model with temperature dependent parameters) is applied to model the salt-induced protein crystallization equilibrium, i.e., protein solubility. The framework introduces a term for the solubility product describing protein transfer between the liquid and solid phase and a term for the solution behavior describing deviation from ideal solution. Protein solubility is modeled as a function of salt concentration and temperature for a four-component system consisting of a protein, pseudo solvent (water and buffer), cation, and anion (salt). Two different systems, lysozyme with sodium chloride and concanavalin A with ammonium sulfate, are investigated. Comparison of the modeled and experimental protein solubility data results in an average root mean square deviation of 5.8%, demonstrating that the model closely follows the experimental behavior. Model calculations and model parameters are reviewed to examine the model and protein crystallization process. Copyright 1999 John Wiley & Sons, Inc.
Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol
The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.
Modeling spray drift and runoff-related inputs of pesticides to receiving water.
Zhang, Xuyang; Luo, Yuzhou; Goh, Kean S
2018-03-01
Pesticides move to surface water via various pathways including surface runoff, spray drift and subsurface flow. Little is known about the relative contributions of surface runoff and spray drift in agricultural watersheds. This study develops a modeling framework to address the contribution of spray drift to the total loadings of pesticides in receiving water bodies. The modeling framework consists of a GIS module for identifying drift potential, the AgDRIFT model for simulating spray drift, and the Soil and Water Assessment Tool (SWAT) for simulating various hydrological and landscape processes including surface runoff and transport of pesticides. The modeling framework was applied to the Orestimba Creek Watershed, California. Monitoring data collected from daily samples were used for model evaluation. Pesticide mass deposition on the Orestimba Creek ranged from 0.08 to 6.09% of applied mass. Monitoring data suggest that surface runoff was the major pathway for pesticides entering water bodies, accounting for 76% of the annual loading; the remaining 24% came from spray drift. The results from the modeling framework showed 81 and 19%, respectively, for runoff and spray drift. Spray drift contributed over half of the mass loading during summer months. The slightly lower spray drift contribution as predicted by the modeling framework was mainly due to SWAT's under-prediction of pesticide mass loading during summer and over-prediction of the loading during winter. Although model simulations were associated with various sources of uncertainties, the overall performance of the modeling framework was satisfactory as evaluated by multiple statistics: for simulation of daily flow, the Nash-Sutcliffe Efficiency Coefficient (NSE) ranged from 0.61 to 0.74 and the percent bias (PBIAS) < 28%; for daily pesticide loading, NSE = 0.18 and PBIAS = -1.6%. This modeling framework will be useful for assessing the relative exposure from pesticides related to spray drift and runoff in receiving waters and the design of management practices for mitigating pesticide exposure within a watershed. Published by Elsevier Ltd.
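The two evaluation statistics quoted above have standard definitions; the short sketch below computes them for a toy observed/simulated pair, using the sign convention common in hydrologic model evaluation.

import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate model underestimation with this convention."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = np.array([1.2, 3.4, 2.1, 5.6, 4.3])
sim = np.array([1.0, 3.8, 2.0, 5.0, 4.6])
print(round(nse(obs, sim), 2), round(pbias(obs, sim), 1))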
NASA Astrophysics Data System (ADS)
Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta
2018-05-01
Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high level and forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
ERIC Educational Resources Information Center
Patz, Richard J.; Junker, Brian W.; Johnson, Matthew S.; Mariano, Louis T.
2002-01-01
Discusses the hierarchical rater model (HRM) of R. Patz (1996) and shows how it can be used to scale examinees and items, model aspects of consensus among raters, and model individual rater severity and consistency effects. Also shows how the HRM fits into the generalizability theory framework. Compares the HRM to the conventional item response…
Model-based software process improvement
NASA Technical Reports Server (NTRS)
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
NL(q) Theory: A Neural Control Framework with Global Asymptotic Stability Criteria.
Vandewalle, Joos; De Moor, Bart L.R.; Suykens, Johan A.K.
1997-06-01
In this paper a framework for model-based neural control design is presented, consisting of nonlinear state space models and controllers parametrized by multilayer feedforward neural networks. The models and closed-loop systems are transformed into so-called NL(q) system form. NL(q) systems represent a large class of nonlinear dynamical systems consisting of q layers with alternating linear and static nonlinear operators that satisfy a sector condition. For such NL(q) systems, sufficient conditions for global asymptotic stability, input/output stability (dissipativity with finite L2-gain) and robust stability and performance are presented. The stability criteria are expressed as linear matrix inequalities. In the analysis problem it is shown how stability of a given controller can be checked. In the synthesis problem two methods for neural control design are discussed. In the first method Narendra's dynamic backpropagation for tracking on a set of specific reference inputs is modified with an NL(q) stability constraint in order to ensure, e.g., closed-loop stability. In the second method control design is done without tracking on specific reference inputs, but based on the input/output stability criteria themselves, within a standard plant framework as is done, for example, in H-infinity control theory and mu theory. Copyright 1997 Elsevier Science Ltd.
Adaptive invasive species distribution models: A framework for modeling incipient invasions
Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.
2015-01-01
The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.
Crystallization of isotactic polypropylene in different shear regimes
NASA Astrophysics Data System (ADS)
Spina, Roberto; Spekowius, Marcel; Hopmann, Christian
2017-10-01
The investigation of the shear-induced crystallization of isotactic polypropylene under isothermal conditions in different shear regimes is the aim of the present research. A multiscale framework is developed and implemented to compute the nucleation and growth of spherulites, based on material parameters needed to connect crystallization kinetics to the molecular material properties. The framework consists of a macro-model based on the Finite Element Method linked to a micro-model based on Cellular Automata. The main results are the evolution of the crystallization degree and spherulite space filling as a function of imposed temperature and shear rate.
Pagan, Darren C.; Miller, Matthew P.
2014-01-01
A forward modeling diffraction framework is introduced and employed to identify slip system activity in high-energy diffraction microscopy (HEDM) experiments. In the framework, diffraction simulations are conducted on virtual mosaic crystals with orientation gradients consistent with Nye’s model of heterogeneous single slip. Simulated diffraction peaks are then compared against experimental measurements to identify slip system activity. Simulation results compared against diffraction data measured in situ from a silicon single-crystal specimen plastically deformed under single-slip conditions indicate that slip system activity can be identified during HEDM experiments. PMID:24904242
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
A KPI framework for process-based benchmarking of hospital information systems.
Jahn, Franziska; Winter, Alfred
2011-01-01
Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest benchmarking HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.
Capital update factor: a new era approaches.
Grimaldi, P L
1993-02-01
The Health Care Financing Administration (HCFA) has constructed a preliminary model of a new capital update method which is consistent with the framework being developed to refine the update method for PPS operating costs. HCFA's eventual goal is to develop a single update framework for operating and capital costs. Initial results suggest that adopting the new capital update method would reduce capital payments substantially, which might intensify creditors' concerns about extending loans to hospitals.
ERIC Educational Resources Information Center
Burroughs, J. A.; And Others
This paper extends previous numerical results of the Flexible Equal Employment Opportunity (FEEO) model, a goal programming model (developed by A. Charnes, W. W. Cooper, K. A. Lewis, and R. J. Niehaus) consisting of Markoff transition elements imbedded in a goal programming framework with priorities that allow for element alteration to provide the…
ERIC Educational Resources Information Center
St. Clair, Robert
The concept of a speech community is investigated within the theoretical frameworks of sociology and linguistics, and it is concluded that the collective competence models of Ferdinand de Saussure and Noam Chomsky are inadequate. They fail in that they are limited as linguistic models which have consistently overlooked the sociological importance…
ERIC Educational Resources Information Center
Spector, Barbara S.; Burkett, Ruth; Leard, Cyndy
2012-01-01
This paper introduces a model for using informal science education venues as contexts within which to teach the nature of science. The model was initially developed to enable university education students to teach science in elementary schools so as to be consistent with "National Science Education Standards" (NSES) (1996) and "A Framework for…
The SLH framework for modeling quantum input-output networks
Combes, Joshua; Kerckhoff, Joseph; Sarovar, Mohan
2017-09-04
Many emerging quantum technologies demand precise engineering and control over networks consisting of quantum mechanical degrees of freedom connected by propagating electromagnetic fields, or quantum input-output networks. Here we review recent progress in theory and experiment related to such quantum input-output networks, with a focus on the SLH framework, a powerful modeling framework for networked quantum systems that is naturally endowed with properties such as modularity and hierarchy. We begin by explaining the physical approximations required to represent any individual node of a network, e.g. atoms in a cavity or a mechanical oscillator, and its coupling to quantum fields by an operator triple (S, L, H). Then we explain how these nodes can be composed into a network with arbitrary connectivity, including coherent feedback channels, using algebraic rules, and how to derive the dynamics of network components and output fields. The second part of the review discusses several extensions to the basic SLH framework that expand its modeling capabilities, and the prospects for modeling integrated implementations of quantum input-output networks. In addition to summarizing major results and recent literature, we discuss the potential applications and limitations of the SLH framework and quantum input-output networks, with the intention of providing context to a reader unfamiliar with the field.
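The algebraic composition rules mentioned above include the series (cascade) product for connecting two nodes. A sketch of its commonly cited form (the Gough-James rule) is reproduced below for orientation; the review's own notation and conventions may differ.

```latex
% Cascade connection of two SLH nodes G_1 = (S_1, L_1, H_1) and G_2 = (S_2, L_2, H_2),
% with the output field of G_1 driving the input of G_2:
\[
  G_2 \triangleleft G_1 \;=\;
  \Bigl( S_2 S_1,\;\; L_2 + S_2 L_1,\;\;
         H_1 + H_2 + \mathrm{Im}\bigl\{ L_2^{\dagger} S_2 L_1 \bigr\} \Bigr).
\]
```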
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waldhoff, Stephanie T.; Martinich, Jeremy; Sarofim, Marcus
2015-07-01
The Climate Change Impacts and Risk Analysis (CIRA) modeling exercise is a unique contribution to the scientific literature on climate change impacts, economic damages, and risk analysis that brings together multiple, national-scale models of impacts and damages in an integrated and consistent fashion to estimate climate change impacts, damages, and the benefits of greenhouse gas (GHG) mitigation actions in the United States. The CIRA project uses three consistent socioeconomic, emissions, and climate scenarios across all models to estimate the benefits of GHG mitigation policies: a Business As Usual (BAU) and two policy scenarios with radiative forcing (RF) stabilization targets of 4.5 W/m2 and 3.7 W/m2 in 2100. CIRA was also designed to specifically examine the sensitivity of results to uncertainties around climate sensitivity and differences in model structure. The goals of the CIRA project are to 1) build a multi-model framework to produce estimates of multiple risks and impacts in the U.S., 2) determine to what degree risks and damages across sectors may be lowered from a BAU to policy scenarios, 3) evaluate key sources of uncertainty along the causal chain, and 4) provide information for multiple audiences and clearly communicate the risks and damages of climate change and the potential benefits of mitigation. This paper describes the motivations, goals, and design of the CIRA modeling exercise and introduces the subsequent papers in this special issue.
Modeling and Advanced Control for Sustainable Process Systems
This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-insp...
Economic impacts of hurricanes on forest owners
Jeffrey P. Prestemon; Thomas P. Holmes
2010-01-01
We present a conceptual model of the economic impacts of hurricanes on timber producers and consumers, offer a framework indicating how welfare impacts can be estimated using econometric estimates of timber price dynamics, and illustrate the advantages of using a welfare theoretic model, which includes (1) welfare estimates that are consistent with neo-classical...
Pouch, Alison M; Aly, Ahmed H; Lai, Eric K; Yushkevich, Natalie; Stoffers, Rutger H; Gorman, Joseph H; Cheung, Albert T; Gorman, Joseph H; Gorman, Robert C; Yushkevich, Paul A
2017-09-01
Transesophageal echocardiography is the primary imaging modality for preoperative assessment of mitral valves with ischemic mitral regurgitation (IMR). While there are well known echocardiographic insights into the 3D morphology of mitral valves with IMR, such as annular dilation and leaflet tethering, less is understood about how quantification of valve dynamics can inform surgical treatment of IMR or predict short-term recurrence of the disease. As a step towards filling this knowledge gap, we present a novel framework for 4D segmentation and geometric modeling of the mitral valve in real-time 3D echocardiography (rt-3DE). The framework integrates multi-atlas label fusion and template-based medial modeling to generate quantitatively descriptive models of valve dynamics. The novelty of this work is that temporal consistency in the rt-3DE segmentations is enforced during both the segmentation and modeling stages with the use of groupwise label fusion and Kalman filtering. The algorithm is evaluated on rt-3DE data series from 10 patients: five with normal mitral valve morphology and five with severe IMR. In these 10 data series that total 207 individual 3DE images, each 3DE segmentation is validated against manual tracing and temporal consistency between segmentations is demonstrated. The ultimate goal is to generate accurate and consistent representations of valve dynamics that can both visually and quantitatively provide insight into normal and pathological valve function.
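As a rough illustration of the temporal-consistency idea, the sketch below applies a minimal scalar Kalman filter to a noisy per-frame measurement. The random-walk state model, the variances, and the example values are assumptions for illustration only and do not reproduce the multi-atlas segmentation pipeline described in the paper.

```python
import numpy as np

def kalman_smooth_1d(z, q=1e-3, r=1e-2):
    """Minimal scalar Kalman filter: smooths a noisy per-frame measurement z
    under a random-walk state model (process variance q, measurement variance r)."""
    x, p = z[0], 1.0          # initial state estimate and variance
    out = [x]
    for zk in z[1:]:
        p = p + q             # predict (random walk)
        k = p / (p + r)       # Kalman gain
        x = x + k * (zk - x)  # update with the new measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# Hypothetical per-frame annular diameter measurements (mm), not patient data.
frames = np.array([32.1, 32.9, 31.7, 33.4, 32.6, 31.9, 32.8])
print(kalman_smooth_1d(frames))
```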
Giannoumis, G Anthony
2016-01-01
Research has yet to provide an interdisciplinary framework for examining ICT accessibility as it relates to Universal Design. This article assesses the conceptualizations and interdisciplinarity of ICT accessibility and Universal Design research. This article uses a grounded theory approach to pose a multilevel framework for Universal Design. The macro level consists of scholarship that examines the context of Universal Design, and is typified by legal and sociological studies that investigate social norms and environments. The meso level consists of scholarship that examines activity in Universal Design as an approach to removing barriers to use and participation; it is typified by studies of computer science and engineering that investigate the use of technology as a mechanism of participation. The micro level consists of scholarship that examines individuals and groups in Universal Design as an approach to understanding human characteristics, and is typified by studies of human factors and psychology. This article argues that the multilevel framework for Universal Design may help remove the artificial separation between disciplines concerned with ICT accessibility and promote more fruitful research and development.
Cameron, Kenzie A
2009-03-01
To provide a brief overview of 15 selected persuasion theories and models, and to present examples of their use in health communication research. The theories are categorized as message effects models, attitude-behavior approaches, cognitive processing theories and models, consistency theories, inoculation theory, and functional approaches. As it is often the intent of a practitioner to shape, reinforce, or change a patient's behavior, familiarity with theories of persuasion may lead to the development of novel communication approaches with existing patients. This article serves as an introductory primer to theories of persuasion with applications to health communication research. Understanding key constructs and general formulations of persuasive theories may allow practitioners to employ useful theoretical frameworks when interacting with patients.
Dynamical dark matter: A new framework for dark-matter physics
NASA Astrophysics Data System (ADS)
Dienes, Keith R.; Thomas, Brooks
2013-05-01
Although much remains unknown about the dark matter of the universe, one property is normally considered sacrosanct: dark matter must be stable well beyond cosmological time scales. However, a new framework for dark-matter physics has recently been proposed which challenges this assumption. In the "dynamical dark matter" (DDM) framework, the dark sector consists of a vast ensemble of individual dark-matter components with differing masses, lifetimes, and cosmological abundances. Moreover, the usual requirement of stability is replaced by a delicate balancing between lifetimes and cosmological abundances across the ensemble as a whole. As a result, it is possible for the DDM ensemble to remain consistent with all experimental and observational bounds on dark matter while nevertheless giving rise to collective behaviors which transcend those normally associated with traditional dark-matter candidates. These include a new, non-trivial dark-matter equation of state as well as potentially distinctive signatures in collider and direct-detection experiments. In this review article, we provide a self-contained introduction to the DDM framework and summarize some of the work which has recently been done in this area. We also present an explicit model within the DDM framework, and outline a number of ideas for future investigation.
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation would undermine its calculation efficiency. An alternative method, the multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when a highly nonlinear dynamic process is involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into a MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model in building surrogate elements can largely cut down the simulation time without sacrificing accuracy.
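As a one-dimensional illustration of the surrogate idea, the sketch below fits a Legendre polynomial to hypothetical input-response pairs of the kind a detailed FE run would normally supply. The tanh response, the noise level, and the polynomial degree are assumptions for illustration, not values from the paper.

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical training pairs: a normalized excitation amplitude vs. a force response
# that a detailed FE model would normally provide.
rng = np.random.default_rng(0)
x_train = np.linspace(-1.0, 1.0, 50)
y_train = np.tanh(3.0 * x_train) + 0.02 * rng.standard_normal(50)

coeffs = legendre.legfit(x_train, y_train, deg=7)   # fit Legendre coefficients

def surrogate(x):
    """Cheap stand-in for the FE element's input-response map."""
    return legendre.legval(x, coeffs)

print(surrogate(np.array([-0.5, 0.0, 0.5])))
```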
Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T
2017-10-01
In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.
Takano, Wataru; Kusajima, Ikuo; Nakamura, Yoshihiko
2016-08-01
It is desirable for robots to be able to linguistically understand human actions during human-robot interactions. Previous research has developed frameworks for encoding human full body motion into model parameters and for classifying motion into specific categories. For full understanding, the motion categories need to be connected to natural language such that the robots can interpret human motions as linguistic expressions. This paper proposes a novel framework for integrating observation of human motion with that of natural language. This framework consists of two models: the first model statistically learns the relations between motions and their relevant words, and the second statistically learns sentence structures as word n-grams. Integration of these two models allows robots to generate sentences from human motions by searching for words relevant to the motion using the first model and then arranging these words in the appropriate order using the second model. This yields the sentences most likely to be generated from the motion. The proposed framework was tested on human full body motion measured by an optical motion capture system. In this test, descriptive sentences were manually attached to the motions, and the validity of the system was demonstrated. Copyright © 2016 Elsevier Ltd. All rights reserved.
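A minimal sketch of how the two models might be combined is given below. The words, relevance scores, and bigram probabilities are all hypothetical, and the greedy search stands in for whatever decoding procedure the authors actually use.

```python
# Hypothetical relevance scores P(word | motion) from the first model, and bigram
# probabilities P(next | prev) from the second; neither comes from the paper's data.
relevance = {"person": 0.9, "walks": 0.8, "forward": 0.7, "jumps": 0.2}
bigram = {("<s>", "person"): 0.9, ("person", "walks"): 0.8, ("person", "jumps"): 0.2,
          ("walks", "forward"): 0.7, ("jumps", "forward"): 0.3}

def generate(length=3):
    """Greedy search: at each step pick the word maximizing relevance * bigram score."""
    sentence, prev = [], "<s>"
    for _ in range(length):
        candidates = {w: relevance.get(w, 0.0) * p
                      for (p_prev, w), p in bigram.items() if p_prev == prev}
        if not candidates:
            break
        prev = max(candidates, key=candidates.get)
        sentence.append(prev)
    return " ".join(sentence)

print(generate())  # e.g. "person walks forward"
```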
Moorkanikkara, Srinivas Nageswaran; Blankschtein, Daniel
2010-12-21
How does one design a surfactant mixture using a set of available surfactants such that it exhibits a desired adsorption kinetics behavior? The traditional approach used to address this design problem involves conducting trial-and-error experiments with specific surfactant mixtures. This approach is typically time-consuming and resource-intensive and becomes increasingly challenging when the number of surfactants that can be mixed increases. In this article, we propose a new theoretical framework to identify a surfactant mixture that most closely meets a desired adsorption kinetics behavior. Specifically, the new theoretical framework involves (a) formulating the surfactant mixture design problem as an optimization problem using an adsorption kinetics model and (b) solving the optimization problem using a commercial optimization package. The proposed framework aims to identify the surfactant mixture that most closely satisfies the desired adsorption kinetics behavior subject to the predictive capabilities of the chosen adsorption kinetics model. Experiments can then be conducted at the identified surfactant mixture condition to validate the predictions. We demonstrate the reliability and effectiveness of the proposed theoretical framework through a realistic case study by identifying a nonionic surfactant mixture consisting of up to four alkyl poly(ethylene oxide) surfactants (C(10)E(4), C(12)E(5), C(12)E(6), and C(10)E(8)) such that it most closely exhibits a desired dynamic surface tension (DST) profile. Specifically, we use the Mulqueen-Stebe-Blankschtein (MSB) adsorption kinetics model (Mulqueen, M.; Stebe, K. J.; Blankschtein, D. Langmuir 2001, 17, 5196-5207) to formulate the optimization problem as well as the SNOPT commercial optimization solver to identify a surfactant mixture consisting of these four surfactants that most closely exhibits the desired DST profile. Finally, we compare the experimental DST profile measured at the surfactant mixture condition identified by the new theoretical framework with the desired DST profile and find good agreement between the two profiles.
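To make the optimization formulation concrete, the sketch below minimizes the squared deviation between a predicted dynamic surface tension profile and a desired one over the mole fractions of four components, subject to the fractions summing to one. The dst_model function is a crude placeholder with made-up relaxation times and plateau drops, not the MSB adsorption kinetics model, and SciPy's SLSQP solver stands in for the SNOPT package used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

t = np.logspace(-2, 2, 30)                 # time points (s)
target = 72.0 - 30.0 * (1 - np.exp(-t / 5.0))  # hypothetical desired DST profile (mN/m)

def dst_model(x, t):
    """Placeholder for a predictive adsorption-kinetics model (e.g. MSB); each
    component gets a made-up relaxation time and surface-tension drop."""
    tau = np.array([0.5, 2.0, 8.0, 30.0])
    drop = np.array([18.0, 25.0, 30.0, 35.0])
    terms = x[:, None] * drop[:, None] * (1 - np.exp(-t[None, :] / tau[:, None]))
    return 72.0 - terms.sum(axis=0)

def objective(x):
    return np.sum((dst_model(x, t) - target) ** 2)

cons = ({"type": "eq", "fun": lambda x: x.sum() - 1.0},)   # mole fractions sum to 1
res = minimize(objective, x0=np.full(4, 0.25), bounds=[(0, 1)] * 4, constraints=cons)
print(res.x)   # optimal composition for this toy model
```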
An Integrated Finite Element-based Simulation Framework: From Hole Piercing to Hole Expansion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Xiaohua; Sun, Xin; Golovashchenko, Sergey F.
An integrated finite element-based modeling framework is developed to predict the hole expansion ratio (HER) of AA6111-T4 sheet by considering the piercing-induced damage around the hole edge. Using damage models and parameters calibrated from previously reported tensile stretchability studies, the predicted HER correlates well with experimentally measured HER values for different hole piercing clearances. The hole piercing model shows burrs are not generated on the sheared surface for clearances less than 20%, which corresponds well with the experimental data on pierced hole cross-sections. The finite-element-calculated HER also is not especially sensitive to piercing clearances below this value. However, as clearances increase to 30% and further to 40%, the HER values are predicted to be considerably smaller, also consistent with experimental measurements. Upon validation, the integrated modeling framework is used to examine the effects of different hole piercing and hole expansion conditions on the critical HERs for AA6111-T4.
Crystal plasticity modeling of irradiation growth in Zircaloy-2
NASA Astrophysics Data System (ADS)
Patra, Anirban; Tomé, Carlos N.; Golubov, Stanislav I.
2017-08-01
A physically based reaction-diffusion model is implemented in the visco-plastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. The reaction-diffusion model accounts for the defects produced by the cascade of displaced atoms, their diffusion to lattice sinks and the contribution to crystallographic strain at the level of single crystals. The VPSC framework accounts for intergranular interactions and irradiation creep, and calculates the strain in the polycrystalline ensemble. A novel scheme is proposed to model the simultaneous evolution of both the number density and the radius of irradiation-induced dislocation loops directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behaviour of cold-worked Zircaloy-2, and trends are compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture and external stress on the coupled irradiation growth and creep behaviour are also studied and compared with available experimental data.
An epidemiological modeling and data integration framework.
Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C
2010-01-01
In this work, a cellular automaton software package for simulating different infectious diseases, storing the simulation results in a data warehouse system and analyzing the obtained results to generate prediction models as well as contingency plans, is proposed. The Brisbane H3N2 flu virus, which spread during the 2009 winter season, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton. The cellular automaton model is parameterized by known disease parameters, and geographical as well as demographic conditions are included for simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using the tool termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used for generating prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy to handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
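A toy cellular-automaton epidemic sketch is shown below to illustrate the general mechanism. The lattice size, the four-neighbour rule, and the probabilities are assumptions for illustration only and are not calibrated to the Brisbane H3N2 parameters or to Tyrol's geography and demographics.

```python
import numpy as np

rng = np.random.default_rng(1)
S, I, R = 0, 1, 2
grid = np.zeros((60, 60), dtype=int)       # lattice of susceptible cells
grid[30, 30] = I                           # seed a single infection

def step(grid, p_infect=0.25, p_recover=0.1):
    """One CA update: susceptible cells are infected by infectious 4-neighbours,
    and infectious cells recover with a fixed per-step probability."""
    infectious = (grid == I)
    counts = sum(np.roll(infectious.astype(int), shift, axis)
                 for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)])
    p_exposed = 1.0 - (1.0 - p_infect) ** counts
    new = grid.copy()
    new[(grid == S) & (rng.random(grid.shape) < p_exposed)] = I
    new[infectious & (rng.random(grid.shape) < p_recover)] = R
    return new

for day in range(50):
    grid = step(grid)
print("infected:", (grid == I).sum(), "recovered:", (grid == R).sum())
```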
Starobinsky-like inflation and neutrino masses in a no-scale SO(10) model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, John; Theoretical Physics Department, CERN,CH-1211 Geneva 23; Garcia, Marcos A.G.
2016-11-08
Using a no-scale supergravity framework, we construct an SO(10) model that makes predictions for cosmic microwave background observables similar to those of the Starobinsky model of inflation, and incorporates a double-seesaw model for neutrino masses consistent with oscillation experiments and late-time cosmology. We pay particular attention to the behaviour of the scalar fields during inflation and the subsequent reheating.
Geologic Framework Model Analysis Model Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Clayton
2000-12-19
The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential radioactive waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for the repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 2.
eClims: An Extensible and Dynamic Integration Framework for Biomedical Information Systems.
Savonnet, Marinette; Leclercq, Eric; Naubourg, Pierre
2016-11-01
Biomedical information systems (BIS) require consideration of three types of variability: data variability induced by new high throughput technologies, schema or model variability induced by large scale studies or new fields of research, and knowledge variability resulting from new discoveries. Beyond data heterogeneity, managing variabilities in the context of BIS requires an extensible and dynamic integration process. In this paper, we focus on data and schema variabilities and we propose an integration framework based on ontologies, master data, and semantic annotations. The framework addresses issues related to: 1) collaborative work through a dynamic integration process; 2) variability among studies using an annotation mechanism; and 3) quality control over data and semantic annotations. Our approach relies on two levels of knowledge: BIS-related knowledge is modeled using an application ontology coupled with UML models that allow controlling data completeness and consistency, and domain knowledge is described by a domain ontology, which ensures data coherence. A system built with the eClims framework has been implemented and evaluated in the context of a proteomic platform.
Schindlbeck, Christopher; Pape, Christian; Reithmeier, Eduard
2018-04-16
Alignment of optical components is crucial for the assembly of optical systems to ensure their full functionality. In this paper we present a novel predictor-corrector framework for the sequential assembly of serial optical systems. Therein, we use a hybrid optical simulation model that comprises virtual and identified component positions. The hybrid model is constantly adapted throughout the assembly process with the help of nonlinear identification techniques and wavefront measurements. This enables prediction of the future wavefront at the detector plane and therefore allows for taking corrective measures accordingly during the assembly process if a user-defined tolerance on the wavefront error is violated. We present a novel notation for the so-called hybrid model and outline the work flow of the presented predictor-corrector framework. A beam expander is assembled as demonstrator for experimental verification of the framework. The optical setup consists of a laser, two bi-convex spherical lenses each mounted to a five degree-of-freedom stage to misalign and correct components, and a Shack-Hartmann sensor for wavefront measurements.
A continuum mechanics constitutive framework for transverse isotropic soft tissues
NASA Astrophysics Data System (ADS)
Garcia-Gonzalez, D.; Jérusalem, A.; Garzon-Hernandez, S.; Zaera, R.; Arias, A.
2018-03-01
In this work, a continuum constitutive framework for the mechanical modelling of soft tissues that incorporates strain rate and temperature dependencies as well as the transverse isotropy arising from fibres embedded into a soft matrix is developed. The constitutive formulation is based on a Helmholtz free energy function decoupled into the contribution of a viscous-hyperelastic matrix and the contribution of fibres introducing dispersion dependent transverse isotropy. The proposed framework considers finite deformation kinematics, is thermodynamically consistent and allows for the particularisation of the energy potentials and flow equations of each constitutive branch. In this regard, the approach developed herein provides the basis on which specific constitutive models can be potentially formulated for a wide variety of soft tissues. To illustrate this versatility, the constitutive framework is particularised here for animal and human white matter and skin, for which constitutive models are provided. In both cases, different energy functions are considered: Neo-Hookean, Gent and Ogden. Finally, the ability of the approach at capturing the experimental behaviour of the two soft tissues is confirmed.
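A compact statement of the decoupling described above, under the usual assumption that the matrix and fibre contributions split additively, might be written as follows. The argument list is schematic, and the paper's own internal variables (for example, those of the viscous branches) are omitted.

```latex
% Generic additive split of the Helmholtz free energy into an isotropic
% visco-hyperelastic matrix part and a dispersed-fibre part; the specific
% potentials (Neo-Hookean, Gent, Ogden, ...) are chosen per tissue.
\[
  \Psi(\mathbf{C}, \mathbf{a}_0, \dots)
  \;=\;
  \Psi_{\mathrm{matrix}}(\mathbf{C})
  \;+\;
  \Psi_{\mathrm{fibre}}\bigl(\mathbf{C}, \mathbf{a}_0\bigr),
\]
% where C is the right Cauchy-Green tensor and a_0 the referential fibre direction.
```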
A Spatiotemporal Prediction Framework for Air Pollution Based on Deep RNN
NASA Astrophysics Data System (ADS)
Fan, J.; Li, Q.; Hou, J.; Feng, X.; Karimian, H.; Lin, S.
2017-10-01
Time series data in practical applications always contain missing values due to sensor malfunction, network failure, outliers, etc. In order to handle missing values in time series, and to address the lack of temporal structure in conventional machine learning models, we propose a spatiotemporal prediction framework based on missing value processing algorithms and a deep recurrent neural network (DRNN). By using a missing tag and missing interval to represent time series patterns, we implement three different missing value fixing algorithms, which are further incorporated into a deep neural network that consists of LSTM (Long Short-term Memory) layers and fully connected layers. Real-world air quality and meteorological datasets (Jingjinji area, China) are used for model training and testing. Deep feed forward neural networks (DFNN) and gradient boosting decision trees (GBDT) are trained as baseline models against the proposed DRNN. Performances of the three missing value fixing algorithms, as well as the different machine learning models, are evaluated and analysed. Experiments show that the proposed DRNN framework outperforms both DFNN and GBDT, therefore validating the capacity of the proposed framework. Our results also provide useful insights for better understanding of different strategies that handle missing values.
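The sketch below shows the general shape of such a network (stacked LSTM layers followed by fully connected layers) in Keras. The layer sizes, the five-channel input layout (value, missing tag, missing interval, and two meteorological covariates), and the random tensors are illustrative assumptions, not the configuration or data used in the study.

```python
import numpy as np
import tensorflow as tf

# Toy tensors only: 128 sequences, 24 hourly steps, 5 input channels.
X = np.random.rand(128, 24, 5).astype("float32")
y = np.random.rand(128, 1).astype("float32")     # next-hour pollutant concentration

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, return_sequences=True, input_shape=(24, 5)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:3], verbose=0))
```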
ERIC Educational Resources Information Center
Stahl, Robert J.; Murphy, Gary T.
Weaknesses in the structure, levels, and sequence of Bloom's taxonomy of cognitive domains emphasize the need for both a new model of how individual learners process information and a new taxonomy of the different levels of memory, thinking, and learning. Both the model and the taxonomy should be consistent with current research findings. The…
One-month validation of the Space Weather Modeling Framework geospace model
NASA Astrophysics Data System (ADS)
Haiducek, J. D.; Welling, D. T.; Ganushkina, N. Y.; Morley, S.; Ozturk, D. S.
2017-12-01
The Space Weather Modeling Framework (SWMF) geospace model consists of a magnetohydrodynamic (MHD) simulation coupled to an inner magnetosphere model and an ionosphere model. This provides a predictive capability for magnetospheric dynamics, including ground-based and space-based magnetic fields, geomagnetic indices, currents and densities throughout the magnetosphere, cross-polar cap potential, and magnetopause and bow shock locations. The only inputs are solar wind parameters and F10.7 radio flux. We have conducted a rigorous validation effort consisting of a continuous simulation covering the month of January 2005 using three different model configurations. This provides a relatively large dataset for assessment of the model's predictive capabilities. We find that the model does an excellent job of predicting the Sym-H index, and performs well at predicting Kp and CPCP during active times. Dayside magnetopause and bow shock positions are also well predicted. The model tends to over-predict Kp and CPCP during quiet times and under-predicts the magnitude of AL during disturbances. The model under-predicts the magnitude of night-side geosynchronous Bz, and over-predicts the radial distance to the flank magnetopause and bow shock. This suggests that the model over-predicts stretching of the magnetotail and the overall size of the magnetotail. With the exception of the AL index and the nightside geosynchronous magnetic field, we find the results to be insensitive to grid resolution.
ERIC Educational Resources Information Center
Charalampous, Kyriakos; Kokkinos, Constantinos M.
2013-01-01
The Model of Interpersonal Teacher Behaviour (MITB) provides a widely acclaimed framework for studying the student-teacher interaction. However, the consistently weak psychometric properties of the instruments designed to measure the MITB in educational contexts other than the ones for which the MITB was originally developed, indicate the need for…
Perceived Stress and Wellness in Early Adolescents Using the Neuman Systems Model
ERIC Educational Resources Information Center
Yarcheski, Thomas J.; Mahon, Noreen E.; Yarcheski, Adela; Hanks, Michele M.
2010-01-01
The purpose of this study was to examine the relationship between perceived stress and wellness in early adolescents and to test primary appraisal as a mediator of this relationship using the Neuman Systems Model as the primary framework. The sample consisted of 144 adolescents, ages 12-14, who responded to instruments measuring perceived stress,…
Two-Year Community: Implementing Vision and Change in a Community College Classroom
ERIC Educational Resources Information Center
Lysne, Steven; Miller, Brant
2015-01-01
The purpose of this article is to describe a model for teaching introductory biology coursework within the Vision and Change framework (American Association for the Advancement of Science, 2011). The intent of the new model is to transform instruction by adopting an active, student-centered, and inquiry-based pedagogy consistent with Vision and…
Action Research in Professional Work: Developing New Practices through Design, Dialogue or Learning?
ERIC Educational Resources Information Center
Lahn, Leif Chr.
This paper examines action research that has been carried out in organizations consisting of predominantly highly educated personnel. The paper revolves around discussion of the Scandinavian model of action research, asking to what degree this model, which has been developed within the framework of industrial democracy, might also serve as a…
Arar, Nedal; Knight, Sara J; Modell, Stephen M; Issa, Amalia M
2011-03-01
The main mission of the Genomic Applications in Practice and Prevention Network™ is to advance collaborative efforts involving partners from across the public health sector to realize the promise of genomics in healthcare and disease prevention. We introduce a new framework that supports the Genomic Applications in Practice and Prevention Network mission and leverages the characteristics of the complex adaptive systems approach. We call this framework the Genome-based Knowledge Management in Cycles model (G-KNOMIC). G-KNOMIC proposes that the collaborative work of multidisciplinary teams utilizing genome-based applications will enhance translating evidence-based genomic findings by creating ongoing knowledge management cycles. Each cycle consists of knowledge synthesis, knowledge evaluation, knowledge implementation and knowledge utilization. Our framework acknowledges that all the elements in the knowledge translation process are interconnected and continuously changing. It also recognizes the importance of feedback loops, and the ability of teams to self-organize within a dynamic system. We demonstrate how this framework can be used to improve the adoption of genomic technologies into practice using two case studies of genomic uptake.
Climate Change Impacts on Freshwater Recreational Fishing in the United States
Using a geographic information system, a spatially explicit modeling framework was developed consisting of grid cells organized into 2,099 eight-digit hydrologic unit code (HUC-8) polygons for the coterminous United States. Projected temperature and precipitation changes associated...
A New Biogeochemical Computational Framework Integrated within the Community Land Model
NASA Astrophysics Data System (ADS)
Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.
2012-12-01
Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land model (CLM), however, faces three major challenges: 1) extensive efforts in modifying modeling structures and rewriting computer programs to incorporate biogeochemical processes with increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness inherited from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test out different mechanistic process representations and datasets and gain new insight on the behavior of the terrestrial ecosystems in response to climate change in a systematic way.
A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-01-01
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
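Since the squat motion synthesis is driven by Gaussian process regression on a set of independent variables, a minimal single-output sketch is given below. The choice of inputs (body height and load), the synthetic training function, and the kernel are assumptions for illustration; the actual system predicts full pose trajectories rather than a single angle.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical training data: independent variables (body height, barbell load) vs.
# one pose parameter (a knee-flexion angle at the bottom of the squat, in degrees).
rng = np.random.default_rng(0)
X = rng.uniform([1.55, 20.0], [1.95, 100.0], size=(40, 2))   # height (m), load (kg)
y = 95.0 + 20.0 * (X[:, 0] - 1.75) - 0.05 * X[:, 1] + rng.normal(0.0, 1.0, 40)

gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=[0.2, 30.0]) + WhiteKernel(1.0),
    normalize_y=True,
).fit(X, y)

mean, std = gpr.predict(np.array([[1.80, 60.0]]), return_std=True)
print(mean, std)   # predicted angle and its uncertainty for a new subject/load
```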
Hybrid modeling in biochemical systems theory by means of functional petri nets.
Wu, Jialiang; Voit, Eberhard
2009-02-01
Many biological systems are genuinely hybrids consisting of interacting discrete and continuous components and processes that often operate at different time scales. It is therefore desirable to create modeling frameworks capable of combining differently structured processes and permitting their analysis over multiple time horizons. During the past 40 years, Biochemical Systems Theory (BST) has been a very successful approach to elucidating metabolic, gene regulatory, and signaling systems. However, its foundation in ordinary differential equations has precluded BST from directly addressing problems containing switches, delays, and stochastic effects. In this study, we extend BST to hybrid modeling within the framework of Hybrid Functional Petri Nets (HFPN). First, we show how the canonical GMA and S-system models in BST can be directly implemented in a standard Petri Net framework. In a second step we demonstrate how to account for different types of time delays as well as for discrete, stochastic, and switching effects. Using representative test cases, we validate the hybrid modeling approach through comparative analyses and simulations with other approaches and highlight the feasibility, quality, and efficiency of the hybrid method.
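For readers unfamiliar with the canonical BST forms, the sketch below integrates a two-variable S-system, in which each rate is a difference of two power-law terms, using SciPy. The rate constants and kinetic orders are made up, and the discrete, stochastic, and switching Petri-net extensions discussed in the paper are not shown.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-variable S-system with made-up rate constants and kinetic orders
# (the canonical continuous part that would feed the hybrid Petri-net model).
alpha = np.array([2.0, 1.5])
beta = np.array([1.0, 1.2])
g = np.array([[0.0, -0.5],     # production kinetic orders g_ij
              [0.8,  0.0]])
h = np.array([[0.5,  0.0],     # degradation kinetic orders h_ij
              [0.0,  0.6]])

def s_system(t, x):
    prod = alpha * np.prod(x ** g, axis=1)   # alpha_i * prod_j x_j^{g_ij}
    deg = beta * np.prod(x ** h, axis=1)     # beta_i  * prod_j x_j^{h_ij}
    return prod - deg

sol = solve_ivp(s_system, (0.0, 20.0), [1.0, 0.5], dense_output=True)
print(sol.y[:, -1])   # concentrations at the end of the integration window
```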
Two classes of ODE models with switch-like behavior
Just, Winfried; Korb, Mason; Elbert, Ben; Young, Todd
2013-01-01
In cases where the same real-world system can be modeled both by an ODE system ⅅ and a Boolean system 𝔹, it is of interest to identify conditions under which the two systems will be consistent, that is, will make qualitatively equivalent predictions. In this note we introduce two broad classes of relatively simple models that provide a convenient framework for studying such questions. In contrast to the widely known class of Glass networks, the right-hand sides of our ODEs are Lipschitz-continuous. We prove that if 𝔹 has certain structures, consistency between ⅅ and 𝔹 is implied by sufficient separation of time scales in one class of our models. Namely, if the trajectories of 𝔹 are “one-stepping” then we prove a strong form of consistency and if 𝔹 has a certain monotonicity property then there is a weaker consistency between ⅅ and 𝔹. These results appear to point to more general structure properties that favor consistency between ODE and Boolean models. PMID:24244061
Tong, Xiayu; Wang, Zhou-Jing
2016-09-19
This article develops a group decision framework with intuitionistic preference relations. An approach is first devised to rectify an inconsistent intuitionistic preference relation to derive an additive consistent one. A new aggregation operator, the so-called induced intuitionistic ordered weighted averaging (IIOWA) operator, is proposed to aggregate individual intuitionistic fuzzy judgments. By using the mean absolute deviation between the original and rectified intuitionistic preference relations as an order inducing variable, the rectified consistent intuitionistic preference relations are aggregated into a collective preference relation. This treatment is presumably able to assign different weights to different decision-makers' judgments based on the quality of their inputs (in terms of consistency of their original judgments). A solution procedure is then developed for tackling group decision problems with intuitionistic preference relations. A low carbon supplier selection case study is developed to illustrate how to apply the proposed decision model in practice.
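A minimal sketch of the induced ordering idea is given below: judgments are ranked by an inducing variable (here the mean absolute deviation of each expert's original relation from its rectified form) and positional OWA weights are applied. The numbers are hypothetical, and only a single membership entry is aggregated, whereas the paper applies the operator element-wise to full intuitionistic preference relations.

```python
import numpy as np

def iiowa(values, inducing, weights):
    """Induced OWA: order the arguments by the inducing variable (most consistent
    judgment first, i.e. smallest deviation) and apply the OWA weights positionally."""
    order = np.argsort(inducing)             # ascending mean absolute deviation
    return float(np.dot(weights, np.asarray(values)[order]))

# Hypothetical entry of three experts' rectified judgments: membership degrees mu,
# and each expert's MAD between original and rectified relations.
mu = np.array([0.55, 0.60, 0.40])
mad = np.array([0.02, 0.10, 0.05])
w = np.array([0.5, 0.3, 0.2])                # more weight to more consistent experts

print(iiowa(mu, mad, w))                     # 0.515 for these toy numbers
```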
Embedded sparse representation of fMRI data via group-wise dictionary optimization
NASA Astrophysics Data System (ADS)
Zhu, Dajiang; Lin, Binbin; Faskowitz, Joshua; Ye, Jieping; Thompson, Paul M.
2016-03-01
Sparse learning enables dimension reduction and efficient modeling of high-dimensional signals and images, but it may need to be tailored to best suit specific applications and datasets. Here we used sparse learning to efficiently represent functional magnetic resonance imaging (fMRI) data from the human brain. We propose a novel embedded sparse representation (ESR) to identify the most consistent dictionary atoms across different brain datasets via an iterative group-wise dictionary optimization procedure. In this framework, we introduce additional criteria to make the learned dictionary atoms more consistent across different subjects. We successfully identified four common dictionary atoms that follow the external task stimuli with very high accuracy. After projecting the corresponding coefficient vectors back into the 3-D brain volume space, the spatial patterns are also consistent with traditional fMRI analysis results. Our framework reveals common features of brain activation in a population and offers a new, efficient fMRI analysis method.
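As a rough illustration of the kind of sparse coding step involved (not the ESR group-wise optimization itself; the matrix sizes and parameters below are placeholders), scikit-learn's dictionary learning can factor an fMRI-like signal matrix into a small dictionary of temporal atoms and sparse coefficients.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Placeholder "fMRI" matrix: 500 voxels x 200 time points (random stand-in data)
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 200))

# Learn a small dictionary of temporal atoms with an L1 sparsity penalty
dico = MiniBatchDictionaryLearning(n_components=4, alpha=1.0, random_state=0)
codes = dico.fit_transform(X)        # sparse coefficients, shape (500, 4)
atoms = dico.components_             # dictionary atoms, shape (4, 200)
print(codes.shape, atoms.shape)
```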
Improving care for patients on antiretroviral therapy through a gap analysis framework.
Massoud, M Rashad; Shakir, Fazila; Livesley, Nigel; Muhire, Martin; Nabwire, Juliana; Ottosson, Amanda; Jean-Baptiste, Rachel; Megere, Humphrey; Karamagi-Nkolo, Esther; Gaudreault, Suzanne; Marks, Pamela; Jennings, Larissa
2015-07-01
To improve quality of care through decreasing existing gaps in the areas of coverage, retention, and wellness of patients receiving HIV care and treatment. The antiretroviral therapy (ART) Framework utilizes improvement methods and the Chronic Care Model to address the coverage, retention, and wellness gaps in HIV care and treatment. This is a time-series study. The ART Framework was applied in five health centers in Buikwe District, Uganda. Quality improvement teams, consisting of healthcare workers and expert patients, were established in each of the five healthcare facilities. The intervention period was October 2010 to September 2012. It consisted of quality improvement teams analyzing their facility and systems of care from the perspective of the Chronic Care Model to identify areas of improvement. They implemented the ART Framework, collected data and assessed outcomes, and focused on self-management support for patients to reduce the coverage, retention, and wellness gaps in HIV care and treatment. Coverage was defined as every patient who needs ART in the catchment area receiving it; retention as every patient who receives ART staying on ART; and wellness as having a positive clinical, immunological, and/or virological response to treatment without intolerable or unmanageable side-effects. Results from Buikwe show that the gaps in coverage, retention, and wellness greatly decreased over the 2-year intervention period: the coverage gap fell from 44% to 19%, the retention gap from 49% to 24%, and the wellness gap from 53% to 14%. The ART Framework is an innovative and practical tool for HIV program managers to improve HIV care and treatment.
A viable logarithmic f(R) model for inflation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amin, M.; Khalil, S.; Salah, M.
2016-08-18
Inflation in the framework of f(R) modified gravity is revisited. We study the conditions that f(R) should satisfy in order to lead to a viable inflationary model in the original form and in the Einstein frame. Based on these criteria we propose a new logarithmic model as a potential candidate for f(R) theories aiming to describe inflation consistent with observations from Planck satellite (2015). The model predicts scalar spectral index 0.9615
Computational model of lightness perception in high dynamic range imaging
NASA Astrophysics Data System (ADS)
Krawczyk, Grzegorz; Myszkowski, Karol; Seidel, Hans-Peter
2006-02-01
An anchoring theory of lightness perception by Gilchrist et al. [1999] explains many characteristics of the human visual system, such as lightness constancy and its spectacular failures, which are important in the perception of images. The principal concept of this theory is the perception of complex scenes in terms of groups of consistent areas (frameworks). Such areas, following the Gestalt theorists, are defined by the regions of common illumination. The key aspect of image perception is the estimation of lightness within each framework through anchoring to the luminance perceived as white, followed by the computation of the global lightness. In this paper we provide a computational model for the automatic decomposition of HDR images into frameworks. We derive a tone mapping operator which predicts lightness perception of real-world scenes and aims at its accurate reproduction on low dynamic range displays. Furthermore, such a decomposition into frameworks opens new grounds for local image analysis in view of human perception.
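A toy numerical sketch of the anchoring step is given below (illustrative only; the paper's framework decomposition and tone-mapping operator are considerably more involved). Within each framework, lightness is anchored to the luminance treated as white, here approximated by a high percentile of the region's luminance.

```python
import numpy as np

def anchor_lightness(luminance, framework_labels, white_percentile=95):
    """Estimate per-pixel lightness by anchoring each framework (region of
    common illumination) to the luminance treated as white within it."""
    lightness = np.zeros_like(luminance, dtype=float)
    for label in np.unique(framework_labels):
        mask = framework_labels == label
        anchor = np.percentile(luminance[mask], white_percentile)  # local "white"
        lightness[mask] = np.clip(luminance[mask] / anchor, 0.0, 1.0)
    return lightness

# Illustrative HDR-like luminance map split into two frameworks
lum = np.array([[0.10, 0.20, 5.0, 8.0],
                [0.05, 0.15, 6.0, 9.0]])
labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1]])
print(anchor_lightness(lum, labels))
```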
A Structural Model Decomposition Framework for Hybrid Systems Diagnosis
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil
2015-01-01
Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.
ERIC Educational Resources Information Center
Griffith, William S.; And Others
This document, consisting of seven chapters and 12 appendixes, is a full final report of the Doolittle Family Education Center Experimental In-Service Training Project. Chapter II consists of the history and plan of the project including an explanation of the framework of the model that was used to conceptualize the project. Chapter III is a…
Cascading gravity is ghost free
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rham, Claudia de; Khoury, Justin; Tolley, Andrew J.
2010-06-15
We perform a full perturbative stability analysis of the 6D cascading gravity model in the presence of 3-brane tension. We demonstrate that for sufficiently large tension on the (flat) 3-brane, there are no ghosts at the perturbative level, consistent with results that had previously only been obtained in a specific 5D decoupling limit. These results establish the cascading gravity framework as a consistent infrared modification of gravity.
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.
Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel
2018-02-20
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
Modeling sports highlights using a time-series clustering framework and model interpretation
NASA Astrophysics Data System (ADS)
Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay
2005-01-01
In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time-series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events against a background "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time-series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
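The sketch below shows the flavor of fitting a Gaussian mixture to audio features and selecting the number of components; it uses BIC as a stand-in for the minimum-description-length criterion in the paper, and the feature matrix is a random placeholder rather than real audio features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
features = rng.standard_normal((1000, 12))   # placeholder low-level audio features

# Select the number of mixture components with BIC (used here as an MDL proxy)
best_k, best_bic, best_gmm = None, np.inf, None
for k in range(1, 8):
    gmm = GaussianMixture(n_components=k, covariance_type="diag",
                          random_state=0).fit(features)
    bic = gmm.bic(features)
    if bic < best_bic:
        best_k, best_bic, best_gmm = k, bic, gmm

print("selected components:", best_k)
```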
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints
Navet, Nicolas; Havet, Lionel
2018-01-01
Model-Driven Engineering (MDE) is widely applied in the industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects or considers less importantly the other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on timing tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation, software design verified by a novel schedulability analysis, and the run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489
Crystal plasticity modeling of irradiation growth in Zircaloy-2
Patra, Anirban; Tome, Carlos; Golubov, Stanislav I.
2017-05-10
A reaction-diffusion based mean field rate theory model is implemented in the viscoplastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. A novel scheme is proposed to model the evolution (both number density and radius) of irradiation-induced dislocation loops that can be informed directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behavior of cold-worked Zircaloy-2 and trends compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture, and external stress on the coupled irradiation growth and creep behavior are also studied.
Crystal plasticity modeling of irradiation growth in Zircaloy-2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patra, Anirban; Tome, Carlos; Golubov, Stanislav I.
A reaction-diffusion based mean field rate theory model is implemented in the viscoplastic self-consistent (VPSC) crystal plasticity framework to simulate irradiation growth in hcp Zr and its alloys. A novel scheme is proposed to model the evolution (both number density and radius) of irradiation-induced dislocation loops that can be informed directly from experimental data of dislocation density evolution during irradiation. This framework is used to predict the irradiation growth behavior of cold-worked Zircaloy-2 and trends compared to available experimental data. The role of internal stresses in inducing irradiation creep is discussed. Effects of grain size, texture, and external stress on the coupled irradiation growth and creep behavior are also studied.
Inkpen, S Andrew
2016-06-01
Experimental ecologists often invoke trade-offs to describe the constraints they encounter when choosing between alternative experimental designs, such as between laboratory, field, and natural experiments. In making these claims, they tend to rely on Richard Levins' analysis of trade-offs in theoretical model-building. But does Levins' framework apply to experiments? In this paper, I focus this question on one desideratum widely invoked in the modelling literature: generality. Using the case of generality, I assess whether Levins-style treatments of modelling provide workable resources for assessing trade-offs in experimental design. I argue that, of four strategies modellers employ to increase generality, only one may be unproblematically applied to experimental design. Furthermore, modelling desiderata do not have obvious correlates in experimental design, and when we define these desiderata in a way that seems consistent with ecologists' usage, the trade-off framework falls apart. I conclude that a Levins-inspired framework for modelling does not provide the content for a similar approach to experimental practice; this does not, however, mean that it cannot provide the form. Copyright © 2016 Elsevier Ltd. All rights reserved.
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; ...
2016-02-18
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; Fenelon, Joseph M.
2016-01-01
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. Testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi³ (167 km³) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
Contributions of cultural services to the ecosystem services agenda
Daniel, Terry C.; Muhar, Andreas; Arnberger, Arne; Aznar, Olivier; Boyd, James W.; Chan, Kai M. A.; Costanza, Robert; Elmqvist, Thomas; Flint, Courtney G.; Gobster, Paul H.; Grêt-Regamey, Adrienne; Lave, Rebecca; Muhar, Susanne; Penker, Marianne; Ribe, Robert G.; Schauppenlehner, Thomas; Sikor, Thomas; Soloviy, Ihor; Spierenburg, Marja; Taczanowska, Karolina; Tam, Jordan; von der Dunk, Andreas
2012-01-01
Cultural ecosystem services (ES) are consistently recognized but not yet adequately defined or integrated within the ES framework. A substantial body of models, methods, and data relevant to cultural services has been developed within the social and behavioral sciences before and outside of the ES approach. A selective review of work in landscape aesthetics, cultural heritage, outdoor recreation, and spiritual significance demonstrates opportunities for operationally defining cultural services in terms of socioecological models, consistent with the larger set of ES. Such models explicitly link ecological structures and functions with cultural values and benefits, facilitating communication between scientists and stakeholders and enabling economic, multicriterion, deliberative evaluation and other methods that can clarify tradeoffs and synergies involving cultural ES. Based on this approach, a common representation is offered that frames cultural services, along with all ES, by the relative contribution of relevant ecological structures and functions and by applicable social evaluation approaches. This perspective provides a foundation for merging ecological and social science epistemologies to define and integrate cultural services better within the broader ES framework. PMID:22615401
Modeling ECCD/MHD coupling using NIMROD, GENRAY, and the Integrated Plasma Simulator
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Schnack, D. D.; Sovinec, C. R.; Hegna, C. C.; Callen, J. D.; Ebrahimi, F.; Kruger, S. E.; Carlsson, J.; Held, E. D.; Ji, J.-Y.; Harvey, R. W.; Smirnov, A. P.; Elwasif, W. R.
2009-11-01
We summarize ongoing theoretical/numerical work relevant to the development of a self-consistent framework for the inclusion of RF effects in fluid simulations; specifically, we consider the stabilization of resistive tearing modes in tokamak geometry by electron cyclotron current drive. In the fluid equations, ad hoc models for the RF-induced currents have previously been shown to shrink or altogether suppress the nonlinearly saturated magnetic islands generated by tearing modes; progress toward a self-consistent model is reported. The interfacing of the NIMROD [1] code with the GENRAY/CQL3D [2] codes (which calculate RF propagation and energy/momentum deposition) via the Integrated Plasma Simulator (IPS) framework [3] is explained, RF-induced rational surface motion and the equilibration of RF-induced currents over plasma flux surfaces are investigated, and the efficient reduction of saturated island widths through time modulation and spatial localization of the ECCD is explored. [1] Sovinec et al., JCP 195, 355 (2004) [2] www.compxco.com [3] Both the IPS development and the research presented here are part of the SWIM project. Funded by U.S. DoE.
Mathematical Modeling of Cellular Metabolism.
Berndt, Nikolaus; Holzhütter, Hermann-Georg
Cellular metabolism basically consists of the conversion of chemical compounds taken up from the extracellular environment into energy (conserved in energy-rich bonds of organic phosphates) and a wide array of organic molecules serving as catalysts (enzymes), information carriers (nucleic acids), and building blocks for cellular structures such as membranes or ribosomes. Metabolic modeling aims at the construction of mathematical representations of the cellular metabolism that can be used to calculate the concentration of cellular molecules and the rates of their mutual chemical interconversion in response to varying external conditions as, for example, hormonal stimuli or supply of essential nutrients. Based on such calculations, it is possible to quantify complex cellular functions as cellular growth, detoxification of drugs and xenobiotic compounds or synthesis of exported molecules. Depending on the specific questions to metabolism addressed, the methodological expertise of the researcher, and available experimental information, different conceptual frameworks have been established, allowing the usage of computational methods to condense experimental information from various layers of organization into (self-) consistent models. Here, we briefly outline the main conceptual frameworks that are currently exploited in metabolism research.
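As a minimal example of the kind of kinetic modeling described here (an invented two-step pathway with Michaelis-Menten rate laws, not a model from the chapter), the concentrations of two metabolites can be simulated by integrating mass-balance ODEs:

```python
from scipy.integrate import solve_ivp

# Toy pathway:  S --(v1)--> M --(v2)--> P   (parameters are illustrative)
VMAX1, KM1 = 1.0, 0.5
VMAX2, KM2 = 0.8, 0.3
S_EXT = 2.0   # fixed external substrate supply

def rates(t, y):
    m, p = y
    v1 = VMAX1 * S_EXT / (KM1 + S_EXT)   # Michaelis-Menten uptake
    v2 = VMAX2 * m / (KM2 + m)           # Michaelis-Menten conversion
    return [v1 - v2, v2]                 # d[M]/dt, d[P]/dt

sol = solve_ivp(rates, (0.0, 20.0), [0.0, 0.0])
print("M(t_end) ~", round(sol.y[0, -1], 3), " P(t_end) ~", round(sol.y[1, -1], 3))
```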
A Human Sensor Network Framework in Support of Near Real Time Situational Geophysical Modeling
NASA Astrophysics Data System (ADS)
Aulov, O.; Price, A.; Smith, J. A.; Halem, M.
2013-12-01
The area of Disaster Management is well established among Federal Agencies such as FEMA, EPA, NOAA and NASA. These agencies have well formulated frameworks for response and mitigation based on near real time satellite and conventional observing networks for assimilation into geophysical models. Forecasts from these models are used to communicate with emergency responders and the general public. More recently, agencies have started using social media to broadcast warnings and alerts to potentially affected communities. In this presentation, we demonstrate the added benefits of mining and assimilating the vast amounts of social media data available from heterogeneous hand-held devices and social networks into established operational geophysical modeling frameworks as they apply to the five cornerstones of disaster management - Prevention, Mitigation, Preparedness, Response and Recovery. Often, in situations of extreme events, social media provide the earliest notification of adverse extreme events. However, various forms of social media data also can provide useful geolocated and time-stamped in situ observations, complementary to directly sensed conventional observations. We use the concept of a Human Sensor Network, where one views social media users as carrying field-deployed 'sensors' whose posts are the remotely 'sensed' instrument measurements. These measurements can act as 'station data' providing the resolution and coverage needed for extreme event specific modeling and validation. Here, we explore the use of social media through a Human Sensor Network (HSN) approach as another data input source for assimilation into geophysical models. Employing the HSN paradigm can provide useful feedback in near real time, but presents software challenges for rapid access, quality filtering and transforming massive social media data into formats consistent with the operational models. As a use case scenario, we demonstrate the value of HSN for disaster management and mitigation in the wake of Hurricane Sandy. Hurricane Sandy devastated multiple regions along the Atlantic coast, causing damage estimated at $68 billion to the Eastern United States. We developed a framework consisting of a set of APIs that harvested over 8 million tweets and 370 thousand Instagram photos mentioning Hurricane Sandy over 4 days from Oct. 29 to Nov. 1, 2012. The flexibility of the framework allows for easy integration with geophysical models such as GNOME, SLOSH, HySplit, and WRF. We use ElasticSearch, a RESTful, distributed search engine based on Apache Lucene, as the underlying platform for indexing, filtering and extracting feature content from the Twitter and Instagram metadata. We identify microscale events and visually present time-varying correlation results of forecasts from the NOAA operational surge model SLOSH with those obtained from our social media database on a Google Earth based map. We are exploring the benefits of our framework to illuminate gaps in our understanding of the use of such data in geophysical models.
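A simplified sketch of the kind of quality filtering and spatial subsetting involved is shown below (the field names, bounding box, and keyword are invented; the actual framework used harvesting APIs, ElasticSearch, and far larger data volumes):

```python
# Filter geolocated, keyword-matching posts into a model-ready "station data" list.
SANDY_BBOX = {"lat": (38.0, 42.0), "lon": (-76.0, -72.0)}   # illustrative box

def to_station_data(posts, keyword="sandy", bbox=SANDY_BBOX):
    stations = []
    for post in posts:
        if keyword not in post.get("text", "").lower():
            continue
        lat, lon = post.get("lat"), post.get("lon")
        if lat is None or lon is None:
            continue                       # drop posts without a geotag
        if not (bbox["lat"][0] <= lat <= bbox["lat"][1]
                and bbox["lon"][0] <= lon <= bbox["lon"][1]):
            continue
        stations.append({"time": post["time"], "lat": lat, "lon": lon,
                         "obs": post["text"]})
    return stations

posts = [{"time": "2012-10-29T21:00Z", "lat": 40.7, "lon": -74.0,
          "text": "Hurricane Sandy: water on the street"},
         {"time": "2012-10-29T21:05Z", "lat": None, "lon": None,
          "text": "stay safe everyone"}]
print(to_station_data(posts))
```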
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
1998-01-01
Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
Effects of Cognitive Load on Driving Performance: The Cognitive Control Hypothesis.
Engström, Johan; Markkula, Gustav; Victor, Trent; Merat, Natasha
2017-08-01
The objective of this paper was to outline an explanatory framework for understanding effects of cognitive load on driving performance and to review the existing experimental literature in the light of this framework. Although there is general consensus that taking the eyes off the forward roadway significantly impairs most aspects of driving, the effects of primarily cognitively loading tasks on driving performance are not well understood. Based on existing models of driver attention, an explanatory framework was outlined. This framework can be summarized in terms of the cognitive control hypothesis: Cognitive load selectively impairs driving subtasks that rely on cognitive control but leaves automatic performance unaffected. An extensive literature review was conducted wherein existing results were reinterpreted based on the proposed framework. It was demonstrated that the general pattern of experimental results reported in the literature aligns well with the cognitive control hypothesis and that several apparent discrepancies between studies can be reconciled based on the proposed framework. More specifically, performance on nonpracticed or inherently variable tasks, relying on cognitive control, is consistently impaired by cognitive load, whereas the performance on automatized (well-practiced and consistently mapped) tasks is unaffected and sometimes even improved. Effects of cognitive load on driving are strongly selective and task dependent. The present results have important implications for the generalization of results obtained from experimental studies to real-world driving. The proposed framework can also serve to guide future research on the potential causal role of cognitive load in real-world crashes.
Boden, Lisa A; McKendrick, Iain J
2017-01-01
Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical "good practice" and are thus "fit for purpose" as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science-policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy.
Pedrini, Paolo; Bragalanti, Natalia; Groff, Claudio
2017-01-01
Recently-developed methods that integrate multiple data sources arising from the same ecological processes have typically utilized structured data from well-defined sampling protocols (e.g., capture-recapture and telemetry). Despite this new methodological focus, the value of opportunistic data for improving inference about spatial ecological processes is unclear and, perhaps more importantly, no procedures are available to formally test whether parameter estimates are consistent across data sources and whether they are suitable for integration. Using data collected on the reintroduced brown bear population in the Italian Alps, a population of conservation importance, we combined data from three sources: traditional spatial capture-recapture data, telemetry data, and opportunistic data. We developed a fully integrated spatial capture-recapture (SCR) model that included a model-based test for data consistency to first compare model estimates using different combinations of data, and then, by acknowledging data-type differences, evaluate parameter consistency. We demonstrate that opportunistic data lend themselves naturally to integration within the SCR framework and highlight the value of opportunistic data for improving inference about space use and population size. This is particularly relevant in studies of rare or elusive species, where the number of spatial encounters is usually small and where additional observations are of high value. In addition, our results highlight the importance of testing and accounting for inconsistencies in spatial information from structured and unstructured data so as to avoid the risk of spurious or averaged estimates of space use and, consequently, of population size. Our work supports the use of a single modeling framework to combine spatially-referenced data while also accounting for parameter consistency. PMID:28973034
U.S.A.B.I.L.I.T.Y. Framework for Older Adults.
Caboral-Stevens, Meriam; Whetsell, Martha V; Evangelista, Lorraine S; Cypress, Brigitte; Nickitas, Donna
2015-01-01
The purpose of the current study was to present a framework to determine potential usability of health websites by older adults. Review of the literature showed paucity of nursing theory related to the use of technology and usability, particularly in older adults. The Roy Adaptation Model, a widely used nursing theory, was chosen to provide framework for the new model. Technology constructs from the Technology Acceptance Model and United Theory of Acceptance and Use of Technology and behavioral control construct from the Theory of Planned Behavior were integrated into the construction of the derived model. The Use of Technology for Adaptation by Older Adults and/or Those With Limited Literacy (U.S.A.B.I.L.I.T.Y.) Model was constructed from the integration of diverse theoretical/conceptual perspectives. The four determinants of usability in the conceptual model include (a) efficiency, (b) learnability, (c) perceived user experience, and (d) perceived control. Because of the lack of well-validated survey questionnaires to measure these determinants, a U.S.A.B.I.L.I.T.Y. Survey was developed. A panel of experts evaluated face and content validity of the new instrument. Internal consistency of the new instrument was 0.96. Usability is key to accepting technology. The derived U.S.A.B.I.L.I.T.Y. framework could serve as a guide for nurses in formative evaluation of technology. Copyright 2015, SLACK Incorporated.
A framework using cluster-based hybrid network architecture for collaborative virtual surgery.
Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann
2009-12-01
Research on collaborative virtual environments (CVEs) opens the opportunity for simulating the cooperative work in surgical operations. It is however a challenging task to implement a high performance collaborative surgical simulation system because of the difficulty in maintaining state consistency with minimum network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and sliding window technique. The robustness of the framework is guaranteed by the failure detection chain which enables smooth transition when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments have been conducted to evaluate the system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.
ERIC Educational Resources Information Center
Vanlaar, Gudrun; Kyriakides, Leonidas; Panayiotou, Anastasia; Vandecandelaere, Machteld; McMahon, Léan; De Fraine, Bieke; Van Damme, Jan
2016-01-01
Background: The dynamic model of educational effectiveness (DMEE) is a comprehensive theoretical framework including factors that are important for school learning, based on consistent findings within educational effectiveness research. Purpose: This study investigates the impact of teacher and school factors of DMEE on mathematics and science…
ERIC Educational Resources Information Center
Pellas, Nikolaos; Boumpa, Anna
2017-01-01
This study seeks to investigate the effect of pre-service foreign language teachers' interactions on their continuing professional development (CPD), using a theoretical instructional design framework consisted of the three presence indicators of a Community of Inquiry (CoI) model and the Jigsaw teaching technique. The investigation was performed…
Fully Bayesian Estimation of Data from Single Case Designs
ERIC Educational Resources Information Center
Rindskopf, David
2013-01-01
Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…
Consistent data-driven computational mechanics
NASA Astrophysics Data System (ADS)
González, D.; Chinesta, F.; Cueto, E.
2018-05-01
We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.
Modeling and Detecting Feature Interactions among Integrated Services of Home Network Systems
NASA Astrophysics Data System (ADS)
Igaki, Hiroshi; Nakamura, Masahide
This paper presents a framework for formalizing and detecting feature interactions (FIs) in the emerging smart home domain. We first establish a model of home network system (HNS), where every networked appliance (or the HNS environment) is characterized as an object consisting of properties and methods. Then, every HNS service is defined as a sequence of method invocations of the appliances. Within the model, we next formalize two kinds of FIs: (a) appliance interactions and (b) environment interactions. An appliance interaction occurs when two method invocations conflict on the same appliance, whereas an environment interaction arises when two method invocations conflict indirectly via the environment. Finally, we propose offline and online methods that detect FIs before service deployment and during execution, respectively. Through a case study with seven practical services, it is shown that the proposed framework is generic enough to capture feature interactions in HNS integrated services. We also discuss several FI resolution schemes within the proposed framework.
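A toy illustration of the appliance-interaction check is sketched below (the appliance names, properties, and services are invented; the paper's formal model is richer and also covers indirect environment interactions): two services conflict on an appliance if they invoke methods that set the same property to different values.

```python
# Each method invocation: (appliance, property, value). A service is a sequence.
service_air_cond = [("aircon", "power", "on"), ("window", "state", "closed")]
service_ventilate = [("window", "state", "open"), ("fan", "power", "on")]

def appliance_interactions(svc_a, svc_b):
    """Detect pairs of invocations that write conflicting values to the same
    property of the same appliance (an 'appliance interaction')."""
    conflicts = []
    for (app_a, prop_a, val_a) in svc_a:
        for (app_b, prop_b, val_b) in svc_b:
            if (app_a, prop_a) == (app_b, prop_b) and val_a != val_b:
                conflicts.append((app_a, prop_a, val_a, val_b))
    return conflicts

print(appliance_interactions(service_air_cond, service_ventilate))
# -> [('window', 'state', 'closed', 'open')]
```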
A thermo-chemo-mechanically coupled constitutive model for curing of glassy polymers
NASA Astrophysics Data System (ADS)
Sain, Trisha; Loeffel, Kaspar; Chester, Shawn
2018-07-01
Curing of a polymer is the process through which a liquid polymer transitions into a solid capable of bearing mechanical loads. The curing process is a coupled thermo-chemo-mechanical conversion process, which requires a thorough understanding of the system behavior in order to predict the cure-dependent mechanical behavior of the solid polymer. In this paper, a thermodynamically consistent, frame-indifferent, thermo-chemo-mechanically coupled continuum-level constitutive framework is proposed for thermally cured glassy polymers. The constitutive framework considers the thermodynamics of chemical reactions, as well as the material behavior of a glassy polymer. A stress-free intermediate configuration is introduced within a finite deformation setting to capture the formation of the network in a stress-free configuration. This work considers a definition of the degree of cure based on the chemistry of the curing reactions. A simplified version of the proposed model has been numerically implemented, and simulations are used to understand the capabilities of the model and framework.
Solar Advisor Model User Guide for Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilman, P.; Blair, N.; Mehos, M.
2008-08-01
The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.
SAINT: A combined simulation language for modeling man-machine systems
NASA Technical Reports Server (NTRS)
Seifert, D. J.
1979-01-01
SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described and applications of SAINT are discussed.
Brunton, Ginny; Thomas, James; O'Mara-Eves, Alison; Jamal, Farah; Oliver, Sandy; Kavanagh, Josephine
2017-12-11
Government policy increasingly supports engaging communities to promote health. It is critical to consider whether such strategies are effective, for whom, and under what circumstances. However, 'community engagement' is defined in diverse ways and employed for different reasons. Considering the theory and context we developed a conceptual framework which informs understanding about what makes an effective (or ineffective) community engagement intervention. We conducted a systematic review of community engagement in public health interventions using: stakeholder involvement; searching, screening, appraisal and coding of research literature; and iterative thematic syntheses and meta-analysis. A conceptual framework of community engagement was refined, following interactions between the framework and each review stage. From 335 included reports, three products emerged: (1) two strong theoretical 'meta-narratives': one, concerning the theory and practice of empowerment/engagement as an independent objective; and a more utilitarian perspective optimally configuring health services to achieve defined outcomes. These informed (2) models that were operationalized in subsequent meta-analysis. Both refined (3) the final conceptual framework. This identified multiple dimensions by which community engagement interventions may differ. Diverse combinations of intervention purpose, theory and implementation were noted, including: ways of defining communities and health needs; initial motivations for community engagement; types of participation; conditions and actions necessary for engagement; and potential issues influencing impact. Some dimensions consistently co-occurred, leading to three overarching models of effective engagement which either: utilised peer-led delivery; employed varying degrees of collaboration between communities and health services; or built on empowerment philosophies. Our conceptual framework and models are useful tools for considering appropriate and effective approaches to community engagement. These should be tested and adapted to facilitate intervention design and evaluation. Using this framework may disentangle the relative effectiveness of different models of community engagement, promoting effective, sustainable and appropriate initiatives.
A consistent framework for Horton regression statistics that leads to a modified Hack's law
Furey, P.R.; Troutman, B.M.
2008-01-01
A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for the Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order ω. Data show that ω plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.
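For context, classical Hack's law and the kind of modification implied by the abstract can be written as follows; the exact functional form involving Strahler order is paraphrased here as an illustrative regression, not quoted from the paper.

```latex
% Classical Hack's law relating mainstream length L to drainage area A
L = c\,A^{h}
% Log-log regression form, and a modified form that also includes Strahler order \omega
\ln L = \beta_0 + \beta_1 \ln A, \qquad
\ln L = \beta_0 + \beta_1 \ln A + \beta_2\,\omega
```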
NEIMiner: nanomaterial environmental impact data miner.
Tang, Kaizhi; Liu, Xiong; Harper, Stacey L; Steevens, Jeffery A; Xu, Roger
2013-01-01
As more engineered nanomaterials (eNM) are developed for a wide range of applications, it is crucial to minimize any unintended environmental impacts resulting from the application of eNM. To realize this vision, industry and policymakers must base risk management decisions on sound scientific information about the environmental fate of eNM, their availability to receptor organisms (eg, uptake), and any resultant biological effects (eg, toxicity). To address this critical need, we developed a model-driven, data mining system called NEIMiner, to study nanomaterial environmental impact (NEI). NEIMiner consists of four components: NEI modeling framework, data integration, data management and access, and model building. The NEI modeling framework defines the scope of NEI modeling and the strategy of integrating NEI models to form a layered, comprehensive predictability. The data integration layer brings together heterogeneous data sources related to NEI via automatic web services and web scraping technologies. The data management and access layer reuses and extends a popular content management system (CMS), Drupal, and consists of modules that model the complex data structure for NEI-related bibliography and characterization data. The model building layer provides an advanced analysis capability for NEI data. Together, these components provide significant value to the process of aggregating and analyzing large-scale distributed NEI data. A prototype of the NEIMiner system is available at http://neiminer.i-a-i.com/.
NEIMiner: nanomaterial environmental impact data miner
Tang, Kaizhi; Liu, Xiong; Harper, Stacey L; Steevens, Jeffery A; Xu, Roger
2013-01-01
As more engineered nanomaterials (eNM) are developed for a wide range of applications, it is crucial to minimize any unintended environmental impacts resulting from the application of eNM. To realize this vision, industry and policymakers must base risk management decisions on sound scientific information about the environmental fate of eNM, their availability to receptor organisms (eg, uptake), and any resultant biological effects (eg, toxicity). To address this critical need, we developed a model-driven, data mining system called NEIMiner, to study nanomaterial environmental impact (NEI). NEIMiner consists of four components: NEI modeling framework, data integration, data management and access, and model building. The NEI modeling framework defines the scope of NEI modeling and the strategy of integrating NEI models to form a layered, comprehensive predictability. The data integration layer brings together heterogeneous data sources related to NEI via automatic web services and web scraping technologies. The data management and access layer reuses and extends a popular content management system (CMS), Drupal, and consists of modules that model the complex data structure for NEI-related bibliography and characterization data. The model building layer provides an advanced analysis capability for NEI data. Together, these components provide significant value to the process of aggregating and analyzing large-scale distributed NEI data. A prototype of the NEIMiner system is available at http://neiminer.i-a-i.com/. PMID:24098076
Evidence-Centered Design: Recommendations for Implementation and Practice
ERIC Educational Resources Information Center
Hendrickson, Amy; Ewing, Maureen; Kaliski, Pamela; Huff, Kristen
2013-01-01
Evidence-centered design (ECD) is an orientation towards assessment development. It differs from conventional practice in several ways and consists of multiple activities. Each of these activities results in a set of useful documentation: domain analysis, domain modeling, construction of the assessment framework, and assessment…
A symbiotic approach to fluid equations and non-linear flux-driven simulations of plasma dynamics
NASA Astrophysics Data System (ADS)
Halpern, Federico
2017-10-01
The fluid framework is ubiquitous in studies of plasma transport and stability. Typical forms of the fluid equations are motivated by analytical work dating several decades ago, before computer simulations were indispensable, and can be, therefore, not optimal for numerical computation. We demonstrate a new first-principles approach to obtaining manifestly consistent, skew-symmetric fluid models, ensuring internal consistency and conservation properties even in discrete form. Mass, kinetic, and internal energy become quadratic (and always positive) invariants of the system. The model lends itself to a robust, straightforward discretization scheme with inherent non-linear stability. A simpler, drift-ordered form of the equations is obtained, and first results of their numerical implementation as a binary framework for bulk-fluid global plasma simulations are demonstrated. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, Theory Program, under Award No. DE-FG02-95ER54309.
A new fit-for-purpose model testing framework: Decision Crash Tests
NASA Astrophysics Data System (ADS)
Tolson, Bryan; Craig, James
2016-04-01
Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing called the Klemes Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) rename as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCT and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building decisions. In one case, we show the set of model building decisions has a low probability to correctly support the upgrade decision. In the other case, we show evidence suggesting another set of model building decisions has a high probability to correctly support the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy to interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.
Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif
2017-01-01
Objective: In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods: We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices associated with uncertainty. Results: We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses and remained robust when measurements were omitted from model calibration. Conclusion: Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
Using concept mapping to design an indicator framework for addiction treatment centres.
Nabitz, Udo; van Den Brink, Wim; Jansen, Paul
2005-06-01
The objective of this study is to determine an indicator framework for addiction treatment centres based on the demands of stakeholders and in alignment with the European Foundation for Quality Management (EFQM) Excellence Model. The setting is the Jellinek Centre based in Amsterdam, the Netherlands, which serves as a prototype for an addiction treatment centre. Concept mapping was used in the construction of the indicator framework. During a 1-day workshop, 16 stakeholders generated, prioritized and sorted 73 items concerning quality and performance. Multidimensional scaling and cluster analysis were applied in constructing a framework consisting of two dimensions and eight clusters. The horizontal axis of the indicator framework is named 'Organization' and has two poles, namely 'Processes' and 'Results'. The vertical axis is named 'Task' and the poles are named 'Efficient treatment' and 'Prevention programs'. The eight clusters in the two-dimensional framework are arranged in the following prioritized sequence: 'Efficient treatment network', 'Effective service', 'Target group', 'Quality of life', 'Efficient service', 'Knowledge transfer', 'Reducing addiction related problems', and 'Prevention programs'. The most important items in the framework are: 'patients are satisfied with their treatment', 'early interventions', and 'efficient treatment chain'. The indicator framework aligns with three clusters of the results criteria of the EFQM Excellence Model. It is based on the stakeholders' perspectives and is believed to be specific to addiction treatment centres. The study demonstrates that concept mapping is a suitable strategy for generating indicator frameworks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Po-Lun; Rasch, Philip J.; Fast, Jerome D.
A suite of physical parameterizations (deep and shallow convection, turbulent boundary layer, aerosols, cloud microphysics, and cloud fraction) from the global climate model Community Atmosphere Model version 5.1 (CAM5) has been implemented in the regional model Weather Research and Forecasting with chemistry (WRF-Chem). A downscaling modeling framework with consistent physics has also been established in which both global and regional simulations use the same emissions and surface fluxes. The WRF-Chem model with the CAM5 physics suite is run at multiple horizontal resolutions over a domain encompassing the northern Pacific Ocean, northeast Asia, and northwest North America for April 2008, when the ARCTAS, ARCPAC, and ISDAC field campaigns took place. These simulations are evaluated against field campaign measurements, satellite retrievals, and ground-based observations, and are compared with simulations that use a set of common WRF-Chem parameterizations. This manuscript describes the implementation of the CAM5 physics suite in WRF-Chem, provides an overview of the modeling framework and an initial evaluation of the simulated meteorology, clouds, and aerosols, and quantifies the resolution dependence of the cloud and aerosol parameterizations. We demonstrate that some of the CAM5 biases, such as high estimates of cloud susceptibility to aerosols and the underestimation of aerosol concentrations in the Arctic, can be reduced simply by increasing horizontal resolution. We also show that the CAM5 physics suite performs similarly to a set of parameterizations commonly used in WRF-Chem, but produces higher ice and liquid water condensate amounts and near-surface black carbon concentrations. Further evaluations that use other mesoscale model parameterizations and perform other case studies are needed to determine whether one parameterization consistently produces results more consistent with observations.
Leveraging constraints and biotelemetry data to pinpoint repetitively used spatial features
Brost, Brian M.; Hooten, Mevin B.; Small, Robert J.
2016-01-01
Satellite telemetry devices collect valuable information concerning the sites visited by animals, including the location of central places like dens, nests, rookeries, or haul‐outs. Existing methods for estimating the location of central places from telemetry data require user‐specified thresholds and ignore common nuances like measurement error. We present a fully model‐based approach for locating central places from telemetry data that accounts for multiple sources of uncertainty and uses all of the available locational data. Our general framework consists of an observation model to account for large telemetry measurement error and animal movement, and a highly flexible mixture model specified using a Dirichlet process to identify the location of central places. We also quantify temporal patterns in central place use by incorporating ancillary behavioral data into the model; however, our framework is also suitable when no such behavioral data exist. We apply the model to a simulated data set as proof of concept. We then illustrate our framework by analyzing an Argos satellite telemetry data set on harbor seals (Phoca vitulina) in the Gulf of Alaska, a species that exhibits fidelity to terrestrial haul‐out sites.
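The Dirichlet process mixture mentioned above can be illustrated with its Chinese-restaurant-process prior, which is what lets the number of central-place clusters be inferred rather than fixed in advance; the sketch below is a generic prior draw, not the authors' full hierarchical model (which also includes an observation model for Argos error and animal movement):

```python
import numpy as np

# Generic Chinese-restaurant-process draw (concentration alpha), illustrating how a
# Dirichlet process mixture lets the number of clusters be learned from the data
# instead of specified in advance. Not the authors' full model.
def crp_assignments(n_obs, alpha, rng):
    assignments = [0]                      # first observation starts cluster 0
    counts = [1]
    for _ in range(1, n_obs):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):               # open a new cluster
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts

rng = np.random.default_rng(2)
labels, sizes = crp_assignments(n_obs=500, alpha=1.5, rng=rng)
print(f"number of clusters drawn from the prior: {len(sizes)}")
```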
Expanding on Successful Concepts, Models, and Organization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.
In her letter to the editor1 regarding our recent Feature Article "Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework"2, Dr. von Göetz expressed several concerns about terminology, and the perception that we propose the replacement of successful approaches and models for exposure assessment with a concept. We are glad to have the opportunity to address these issues here. If the goal of the AEP framework were to replace existing exposure models or databases for organizing exposure data with a concept, we would share Dr. von Göetz's concerns. Instead, the outcome we promote is broader use of an organizational framework for exposure science. The framework would support improved generation, organization, and interpretation of data as well as modeling and prediction, not replacement of models. The field of toxicology has seen the benefits of wide use of one or more organizational frameworks (e.g., mode and mechanism of action, adverse outcome pathway). These frameworks influence how experiments are designed, how data are collected, curated, stored, and interpreted, and ultimately how data are used in risk assessment. Exposure science is poised to similarly benefit from broader use of a parallel organizational framework, which, as Dr. von Göetz correctly points out, is currently used in the exposure modeling community. In our view, the concepts used so effectively in the exposure modeling community, expanded upon in the AEP framework, could see wider adoption by the field as a whole. The value of such a framework was recognized by the National Academy of Sciences.3 Replacement of models, databases, or any application with the AEP framework was not proposed in our article. The positive role that broader, more consistent use of such a framework might have in enabling and advancing "general activities such as data acquisition, organization…," and exposure modeling was discussed in some detail. Like Dr. von Göetz, we recognized the challenges associated with acceptance of the terminology, definitions, and structure proposed in the paper. To address these challenges, an expert workshop was held in May 2016 to consider and revise the "basic elements" outlined in the paper. The attendees produced revisions to the terminology (e.g., key events) that align with terminology currently in use in the field. We were also careful in our paper to acknowledge a point raised by Dr. von Göetz, that the term AEP implies aggregation, providing these clarifications: "The simplest form of an AEP represents a single source and a single pathway and may more commonly be referred to as an exposure pathway"; and "An aggregate exposure pathway may represent multiple sources and transfer through single pathways to the TSE, single sources and transfer through multiple pathways to the target site exposure (TSE), or any combination of these." These clarifications address the concern that the AEP term is not accurate or logical, and further expand upon the word "aggregate" in a broader context. Our use of AEP is consistent with the definition of "aggregate exposure", which refers to the combined exposures to a single chemical across multiple routes and pathways.3 The AEP framework embraces existing methods for collection, prediction, organization, and interpretation of human and ecological exposure data cited by Dr. von Göetz.
We remain hopeful that wider recognition and use of an organizing concept for exposure information across the exposure science, toxicology, and epidemiology communities will advance the development of the kind of infrastructure and models Dr. von Göetz discusses. This outcome would be a step forward, rather than a step backward.
Dubé, Monique G; Duinker, Peter; Greig, Lorne; Carver, Martin; Servos, Mark; McMaster, Mark; Noble, Bram; Schreier, Hans; Jackson, Lee; Munkittrick, Kelly R
2013-07-01
From 2008 to 2013, a series of studies supported by the Canadian Water Network were conducted in Canadian watersheds in an effort to improve methods to assess cumulative effects. These studies fit under a common framework for watershed cumulative effects assessment (CEA). This article presents an introduction to the Special Series on Watershed CEA in IEAM including the framework and its impetus, a brief introduction to each of the articles in the series, challenges, and a path forward. The framework includes a regional water monitoring program that produces 3 core outputs: an accumulated state assessment, stressor-response relationships, and development of predictive cumulative effects scenario models. The framework considers core values, indicators, thresholds, and use of consistent terminology. It emphasizes that CEA requires 2 components, accumulated state quantification and predictive scenario forecasting. It recognizes both of these components must be supported by a regional, multiscale monitoring program. Copyright © 2013 SETAC.
AIDS susceptibility in a migrant population: perception and behavior.
McBride, D C; Weatherby, N L; Inciardi, J A; Gillespie, S A
1999-01-01
Within the framework of the Health Belief Model, this paper examines correlates of perception of AIDS susceptibility among 846 drug-using migrant farm workers and their sex partners. Significant but relatively small differences by ethnicity and gender were found. The data showed a consistent significant statistical relationship between frequency of drug use, high-risk sexual behavior, and perception of AIDS susceptibility. Perception of AIDS susceptibility was significantly related to a subsequent reduction in sexual risk behaviors. Consistent with the Health Belief Model, the data suggest that increasing perception of AIDS susceptibility may be an important motivator in reducing high-risk behaviors.
Kaack, Lorraine; Bender, Miriam; Finch, Michael; Borns, Linda; Grasham, Katherine; Avolio, Alice; Clausen, Shawna; Terese, Nadine A; Johnstone, Diane; Williams, Marjory
The Veterans Health Administration (VHA) Office of Nursing Services (ONS) was an early adopter of Clinical Nurse Leader (CNL) practice, generating some of the earliest pilot data of CNL practice effectiveness. In 2011 the VHA ONS CNL Implementation & Evaluation Service (CNL I&E) piloted a curriculum to facilitate CNL transition to effective practice at local VHA settings. In 2015, the CNL I&E and local VHA setting stakeholders collaborated to refine the program, based on lessons learned at the national and local level. The workgroup reviewed the literature to identify theoretical frameworks for CNL practice and practice development. The workgroup selected Benner et al.'s Novice-to-Expert model as the defining framework for CNL practice development, and Bender et al.'s CNL Practice Model as the defining framework for CNL practice integration. The selected frameworks were cross-walked against existing curriculum elements to identify and clarify additional practice development needs. The work generated key insights into: core stages of transition to effective practice; CNL progress and expectations for each stage; and organizational support structures necessary for CNL success at each stage. The refined CNL development model is a robust tool that can be applied to support consistent and effective integration of CNL practice into care delivery. Published by Elsevier Inc.
Plasmonic Circuit Theory for Multiresonant Light Funneling to a Single Spatial Hot Spot.
Hughes, Tyler W; Fan, Shanhui
2016-09-14
We present a theoretical framework, based on plasmonic circuit models, for generating a multiresonant field intensity enhancement spectrum at a single "hot spot" in a plasmonic device. We introduce a circuit model, consisting of an array of coupled LC resonators, that directs current asymmetrically in the array, and we show that this circuit can funnel energy efficiently from each resonance to a single element. We implement the circuit model in a plasmonic nanostructure consisting of a series of metal bars of differing length, with nearest neighbor metal bars strongly coupled electromagnetically through air gaps. The resulting nanostructure resonantly traps different wavelengths of incident light in separate gap regions, yet it funnels the energy of different resonances to a common location, which is consistent with our circuit model. Our work is important for a number of applications of plasmonic nanoantennas in spectroscopy, such as in single-molecule fluorescence spectroscopy or Raman spectroscopy.
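A minimal sketch of the kind of coupled-LC eigenvalue problem such a circuit model implies is given below; the element values are hypothetical and the capacitive nearest-neighbour coupling is a simplification of the paper's circuit, but it shows how detuned resonators coupled through gaps produce a set of collective resonances:

```python
import numpy as np

# Illustrative coupled-LC chain (values hypothetical, not taken from the paper):
# N resonators with differing natural frequencies, coupled to nearest neighbours
# through gap capacitances. Resonances follow from K x = w^2 M x.
N = 5
L = np.full(N, 1.0e-9)                     # inductances (H)
C = np.linspace(1.0, 2.0, N) * 1e-15       # self-capacitances (F), deliberately detuned
Cc = 0.3e-15                               # nearest-neighbour coupling capacitance (F)

K = np.diag(1.0 / C)                       # "stiffness" from inverse capacitances
for i in range(N - 1):                     # symmetric nearest-neighbour coupling
    K[i, i] += 1.0 / Cc
    K[i + 1, i + 1] += 1.0 / Cc
    K[i, i + 1] = K[i + 1, i] = -1.0 / Cc
M = np.diag(L)

omega = np.sqrt(np.linalg.eigvals(np.linalg.inv(M) @ K).real)
print("resonant frequencies (Hz):", np.sort(omega) / (2 * np.pi))
```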
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision-making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts: management alternative development and alternative analysis. The first part, alternative development, consists of six steps: 1) understand the watershed components and processes using the HSPF model; 2) identify the spatial vulnerability ranking using two indices, potential streamflow depletion (PSD) and potential water quality deterioration (PWQD); 3) quantify the residents' preferences on water management demands and calculate the watershed evaluation index, which is the weighted combination of PSD and PWQD; 4) set the quantitative targets for water quantity and quality; 5) develop a list of feasible alternatives; and 6) eliminate the unacceptable alternatives. The second part, alternative analysis, has three steps: 7) analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios; 8) quantify the alternative evaluation index, including social and hydrologic criteria, using multi-criteria decision analysis methods; and 9) prioritize all options based on a minimax regret strategy for robust decision-making. This framework addresses the uncertainty inherent in climate models and climate change scenarios by utilizing the minimax regret strategy, a decision-making strategy under deep uncertainty, and thus the procedure derives a robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to a Korean urban watershed that has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
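Step 9, the minimax regret prioritization, can be sketched as follows; the performance matrix is hypothetical and stands in for the alternative evaluation index computed across climate change scenarios:

```python
import numpy as np

# Minimax-regret ranking with a hypothetical performance matrix:
# rows = management alternatives, columns = climate-change scenarios,
# entries = alternative evaluation index (higher is better).
performance = np.array([
    [0.72, 0.61, 0.55],
    [0.68, 0.66, 0.60],
    [0.80, 0.50, 0.45],
])
regret = performance.max(axis=0) - performance    # shortfall from the scenario-best option
max_regret = regret.max(axis=1)                   # worst-case regret per alternative
ranking = np.argsort(max_regret)                  # robust ranking: smallest max regret first
print("max regret per alternative:", np.round(max_regret, 2))
print("robust priority order (best first):", ranking)
```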
NASA Astrophysics Data System (ADS)
Neal, Lucy S.; Dalvi, Mohit; Folberth, Gerd; McInnes, Rachel N.; Agnew, Paul; O'Connor, Fiona M.; Savage, Nicholas H.; Tilbee, Marie
2017-11-01
There is a clear need for the development of modelling frameworks for both climate change and air quality to help inform policies for addressing these issues simultaneously. This paper presents an initial attempt to develop a single modelling framework, by introducing a greater degree of consistency in the meteorological modelling framework by using a two-step, one-way nested configuration of models, from a global composition-climate model (GCCM) (140 km resolution) to a regional composition-climate model covering Europe (RCCM) (50 km resolution) and finally to a high (12 km) resolution model over the UK (AQUM). The latter model is used to produce routine air quality forecasts for the UK. All three models are based on the Met Office's Unified Model (MetUM). In order to better understand the impact of resolution on the downscaling of projections of future climate and air quality, we have used this nest of models to simulate a 5-year period using present-day emissions and under present-day climate conditions. We also consider the impact of running the higher-resolution model with higher spatial resolution emissions, rather than simply regridding emissions from the RCCM. We present an evaluation of the models compared to in situ air quality observations over the UK, plus a comparison against an independent 1 km resolution gridded dataset, derived from a combination of modelling and observations, effectively producing an analysis of annual mean surface pollutant concentrations. We show that using a high-resolution model over the UK has some benefits in improving air quality modelling, but that the use of higher spatial resolution emissions is important to capture local variations in concentrations, particularly for primary pollutants such as nitrogen dioxide and sulfur dioxide. For secondary pollutants such as ozone and the secondary component of PM10, the benefits of a higher-resolution nested model are more limited and reasons for this are discussed. This study highlights the point that the resolution of models is not the only factor in determining model performance - consistency between nested models is also important.
A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina
Bales, Jerad D.; Robbins, Jeanne C.
1999-01-01
As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina Institute of Marine Sciences, and the U.S. Geological Survey. Limitations in the modeling framework were clearly identified. These limitations formed the basis for a set of suggestions to refine the Neuse River estuary water-quality model.
Evaluating MJO Event Initiation and Decay in the Skeleton Model using an RMM-like Index
2015-11-25
The report documents the occurrence of primary, continuing, and terminating MJO events in the skeleton model and evaluates model solutions in a framework consistent with observations, including the observed MJO event climatology and the precursor conditions associated with event initiation and decay.
A spatial operator algebra for manipulator modeling and control
NASA Technical Reports Server (NTRS)
Rodriguez, G.; Jain, A.; Kreutz-Delgado, K.
1991-01-01
A recently developed spatial operator algebra for manipulator modeling, control, and trajectory design is discussed. The elements of this algebra are linear operators whose domain and range spaces consist of forces, moments, velocities, and accelerations. The effect of these operators is equivalent to a spatial recursion along the span of a manipulator. Inversion of operators can be efficiently obtained via techniques of recursive filtering and smoothing. The operator algebra provides a high-level framework for describing the dynamic and kinematic behavior of a manipulator and for control and trajectory design algorithms. The interpretation of expressions within the algebraic framework leads to enhanced conceptual and physical understanding of manipulator dynamics and kinematics.
Analysis and Modeling of DIII-D Experiments With OMFIT and Neural Networks
NASA Astrophysics Data System (ADS)
Meneghini, O.; Luna, C.; Smith, S. P.; Lao, L. L.; GA Theory Team
2013-10-01
The OMFIT integrated modeling framework is designed to facilitate experimental data analysis and enable integrated simulations. This talk introduces this framework and presents a selection of its applications to the DIII-D experiment. Examples include kinetic equilibrium reconstruction analysis; evaluation of MHD stability in the core and in the edge; and self-consistent predictive steady-state transport modeling. The OMFIT framework also provides the platform for an innovative approach based on neural networks to predict electron and ion energy fluxes. In our study a multi-layer feed-forward back-propagation neural network is built and trained over a database of DIII-D data. It is found that, given the same input parameters that the highest-fidelity models use, the neural network model is able to predict to a large degree the heat transport profiles observed in the DIII-D experiments. Once the network is built, the numerical cost of evaluating the transport coefficients is virtually nonexistent, thus making the neural network model particularly well suited for plasma control and quick exploration of operational scenarios. The implementation of the neural network model and its benchmarking against experimental results and gyro-kinetic models will be discussed. Work supported in part by the US DOE under DE-FG02-95ER54309.
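A minimal sketch of the kind of feed-forward, back-propagation regression network the abstract describes is shown below; the training data are synthetic stand-ins, since the DIII-D profile database and the actual network architecture are not given here:

```python
import numpy as np

# Minimal one-hidden-layer regression network trained by back-propagation, in the
# spirit of the abstract's transport surrogate. The "profile database" is synthetic;
# real inputs/outputs (DIII-D profiles, heat fluxes) are placeholders.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(2000, 4))              # stand-in for local plasma parameters
y = (np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]).reshape(-1, 1)  # stand-in flux

n_in, n_hid, n_out, lr = 4, 32, 1, 1e-2
W1 = rng.standard_normal((n_in, n_hid)) * 0.5
b1 = np.zeros(n_hid)
W2 = rng.standard_normal((n_hid, n_out)) * 0.5
b2 = np.zeros(n_out)

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)                         # hidden layer
    pred = h @ W2 + b2                               # linear output
    err = pred - y
    # gradients of the mean-squared error, averaged over the batch
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)                   # tanh derivative
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final training MSE:", float(np.mean(err**2)))
```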
Validation of an Information-Motivation-Behavioral Skills model of diabetes self-care (IMB-DSC).
Osborn, Chandra Y; Egede, Leonard E
2010-04-01
Comprehensive behavior change frameworks are needed to provide guidance for the design, implementation, and evaluation of diabetes self-care programs in diverse populations. We applied the Information-Motivation-Behavioral Skills (IMB) model, a well-validated, comprehensive health behavior change framework, to diabetes self-care. Patients with diabetes were recruited from an outpatient clinic. Information gathered pertained to demographics, diabetes knowledge (information); diabetes fatalism (personal motivation); social support (social motivation); and diabetes self-care (behavior). Hemoglobin A1C values were extracted from the patient medical record. Structural equation models tested the IMB framework. More diabetes knowledge (r=0.22 p<0.05), less fatalistic attitudes (r=-0.20, p<0.05), and more social support (r=0.27, p<0.01) were independent, direct predictors of diabetes self-care behavior; and through behavior, were related to glycemic control (r=-0.20, p<0.05). Consistent with the IMB model, having more information (more diabetes knowledge), personal motivation (less fatalistic attitudes), and social motivation (more social support) was associated with behavior; and behavior was the sole predictor of glycemic control. The IMB model is an appropriate, comprehensive health behavior change framework for diabetes self-care. The findings indicate that in addition to knowledge, diabetes education programs should target personal and social motivation to effect behavior change. 2009 Elsevier Ireland Ltd. All rights reserved.
A thermodynamic framework for the study of crystallization in polymers
NASA Astrophysics Data System (ADS)
Rao, I. J.; Rajagopal, K. R.
In this paper, we present a new thermodynamic framework within the context of continuum mechanics to predict the behavior of crystallizing polymers. The constitutive models that are developed within this thermodynamic setting are able to describe the main features of the crystallization process. The model is capable of capturing the transition from fluid-like behavior to solid-like behavior in a rational manner without appealing to any ad hoc transition criterion. The anisotropy of the crystalline phase is built into the model, and the specific anisotropy of the crystalline phase depends on the deformation in the melt. These features are incorporated into a recent framework that associates different natural configurations and material symmetries with distinct microstructural features within the body that arise during the process under consideration. Specific models are generated by choosing particular forms for the internal energy, the entropy and the rate of dissipation. Equations governing the evolution of the natural configurations and the rate of crystallization are obtained by maximizing the rate of dissipation, subject to appropriate constraints. The initiation criterion, marking the onset of crystallization, arises naturally in this setting in terms of the thermodynamic functions. The model generated within such a framework is used to simulate bi-axial extension of a polymer film that is undergoing crystallization. The predictions of the theory that has been proposed are consistent with the experimental results (see [28] and [7]).
Modeling of prepregs during automated draping sequences
NASA Astrophysics Data System (ADS)
Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny
2017-10-01
The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims at setting up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, thereby assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold, which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool, and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis with the material's constitutive behavior currently being approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.
Directed Consultation, the SEALS Model, and Teachers' Classroom Management
ERIC Educational Resources Information Center
Motoca, Luci M.; Farmer, Thomas W.; Hamm, Jill V.; Byun, Soo-yong; Lee, David L.; Brooks, Debbie S.; Rucker, Nkecha; Moohr, Michele M.
2014-01-01
Directed consultation is presented as a professional development framework to guide and support teachers in the implementation of evidence-based interventions that involve contextual and process-oriented approaches designed to be incorporated into daily classroom management. This approach consists of four components: pre-intervention observations…
ERIC Educational Resources Information Center
Davis, Stephen H.; Leon, Ronald J.
2009-01-01
The complexities of public education today require new, distributed models of school leadership in which teachers play a central role. The most effective teachers assume leadership roles as instructors and professional colleagues. In this article, we propose a framework for developing teacher leadership that consists of four intersecting domains:…
Briggs, Andrew M; Jordan, Joanne E; Jennings, Matthew; Speerin, Robyn; Bragge, Peter; Chua, Jason; Woolf, Anthony D; Slater, Helen
2017-04-01
To develop a globally informed framework to evaluate readiness for implementation and success after implementation of musculoskeletal models of care (MOCs). Three phases were undertaken: 1) a qualitative study with 27 Australian subject matter experts (SMEs) to develop a draft framework; 2) an eDelphi study with an international panel of 93 SMEs across 30 nations to evaluate face validity, and refine and establish consensus on the framework components; and 3) translation of the framework into a user-focused resource and evaluation of its acceptability with the eDelphi panel. A comprehensive evaluation framework was developed for judging the readiness and success of musculoskeletal MOCs. The framework consists of 9 domains, with each domain containing a number of themes underpinned by detailed elements. In the first Delphi round, scores of "partly agree" or "completely agree" with the draft framework ranged 96.7%-100%. In the second round, "essential" scores ranged 58.6%-98.9%, resulting in 14 of 34 themes being classified as essential. SMEs strongly agreed or agreed that the final framework was useful (98.8%), usable (95.1%), credible (100%) and appealing (93.9%). Overall, 96.3% strongly supported or supported the final structure of the framework as it was presented, while 100%, 96.3%, and 100% strongly supported or supported the content within the readiness, initiating implementation, and success streams, respectively. An empirically derived framework to evaluate the readiness and success of musculoskeletal MOCs was strongly supported by an international panel of SMEs. The framework provides an important internationally applicable benchmark for the development, implementation, and evaluation of musculoskeletal MOCs. © 2016, American College of Rheumatology.
Probabilistic economic frameworks for disaster risk management
NASA Astrophysics Data System (ADS)
Dulac, Guillaume; Forni, Marc
2013-04-01
Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne", and to allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform the analysis over long-term horizons, as space and time scales are somewhat linked; (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with scale, which results in a higher sensitivity of the results to the analytic framework; (iv) as the perimeter of analysis (time-, theme- and space-wise) is widened, intrinsic parameters of the project tend to weigh less heavily. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as they relate to scientific knowledge about the hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at the global scale. The proposed framework suggests opting for flexible, ad hoc modelling of specific issues consistent with one's objective and risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can range from simple elicitation of data from a subject matter expert to calibrate a probability distribution to more advanced stochastic modelling. This approach can be described more as a proficiency in the language of uncertainty than as modelling per se, in the sense that it allows for greater flexibility to adapt to a given context. In a real decision-making context, one seldom has the time or budget resources to investigate all of these variables thoroughly, hence the importance of being able to prioritize the level of effort among them. Under the proposed framework, this can be done in an optimised fashion. The point here is to apply probabilistic sensitivity analysis together with the fundamentals of the economic value of information; the framework as built is well suited to such considerations, and variables can be ranked according to their contribution to risk understanding. Efforts to deal with second-order uncertainties on variables prove to be valuable when dealing with the economic value of sample information.
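A compact sketch of the kind of Monte Carlo appraisal the framework builds on is given below; every distribution, the investment cost and the discount rate are hypothetical placeholders for quantities that would in practice be elicited or modelled case by case:

```python
import numpy as np

# Hedged sketch of a probabilistic DRM investment appraisal: uncertain annual avoided
# losses versus an up-front investment, over a long horizon. All distributions, costs
# and the discount rate are hypothetical placeholders.
rng = np.random.default_rng(4)
n_sim, horizon, rate = 20_000, 30, 0.04
invest_cost = 12.0                                                     # M$ up-front
annual_events = rng.poisson(lam=0.3, size=(n_sim, horizon))            # disaster counts
avoided_loss_per_event = rng.lognormal(mean=1.0, sigma=0.8,
                                       size=(n_sim, horizon))          # M$ per event
discount = 1.0 / (1.0 + rate) ** np.arange(1, horizon + 1)

npv = (annual_events * avoided_loss_per_event * discount).sum(axis=1) - invest_cost
print(f"expected NPV (performance metric): {npv.mean():.1f} M$")
print(f"probability of negative NPV (risk metric): {(npv < 0).mean():.2f}")
```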
A surety engineering framework to reduce cognitive systems risks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caudell, Thomas P.; Peercy, David Eugene; Caldera, Eva O.
Cognitive science research investigates the advancement of human cognition and neuroscience capabilities. Addressing risks associated with these advancements can counter potential program failures, legal and ethical issues, constraints to scientific research, and product vulnerabilities. Survey results, focus group discussions, cognitive science experts, and surety researchers concur that technical risks exist that could impact cognitive science research in areas such as medicine, privacy, human enhancement, law and policy, military applications, and national security (SAND2006-6895). This SAND report documents a surety engineering framework and a process for identifying cognitive system technical, ethical, legal, and societal risks and applying appropriate surety methods to reduce such risks. The framework consists of several models: Specification, Design, Evaluation, Risk, and Maturity. Two detailed case studies are included to illustrate the use of the process and framework. Several appendices provide detailed information on existing cognitive system architectures; ethical, legal, and societal risk research; surety methods and technologies; and educing information research with a case study vignette. The process and framework provide a model for how cognitive systems research and full-scale product development can apply surety engineering to reduce perceived and actual risks.
A national framework for monitoring and reporting on environmental sustainability in Canada.
Marshall, I B; Scott Smith, C A; Selby, C J
1996-01-01
In 1991, a collaborative project to revise the terrestrial component of a national ecological framework was undertaken with a wide range of stakeholders. This spatial framework consists of multiple, nested levels of ecological generalization with linkages to existing federal and provincial scientific databases. The broadest level of generalization is the ecozone. Macroclimate, major vegetation types and subcontinental scale physiographic formations constitute the definitive components of these major ecosystems. Ecozones are subdivided into approximately 200 ecoregions which are based on properties like regional physiography, surficial geology, climate, vegetation, soil, water and fauna. The ecozone and ecoregion levels of the framework have been depicted on a national map coverage at 1:7 500 000 scale. Ecoregions have been subdivided into ecodistricts based primarily on landform, parent material, topography, soils, waterbodies and vegetation at a scale (1:2 000 000) useful for environmental resource management, monitoring and modelling activities. Nested within the ecodistricts are the polygons that make up the Soil Landscapes of Canada series of 1:1 000 000 scale soil maps. The framework is supported by an ARC-INFO GIS at Agriculture Canada. The data model allows linkage to associated databases on climate, land use and socio-economic attributes.
De Geest, Sabina; Sullivan Marx, Eileen M; Rich, Victoria; Spichiger, Elisabeth; Schwendimann, Rene; Spirig, Rebecca; Van Malderen, Greet
2010-09-01
Academic service partnerships (ASPs) are structured linkages between academe and service which have demonstrated higher levels of innovation. In the absence of descriptions in the literature of financial frameworks to support ASPs, the purpose of this paper is to present the supporting financial frameworks of a Swiss and a U.S. ASP. This paper used a case study approach. Two frameworks are presented. The U.S. model presented consists of a variety of ASPs, all linked to the School of Nursing of the University of Pennsylvania. The structural integration and governance system is elucidated. Each ASP has its own source of revenue or grant support, with the goal of being fiscally in the black. Joint appointments are used as an instrument to realize these ASPs. The Swiss ASP entails a detailed description of the financial framework of one ASP between the Institute of Nursing Science at the University of Basel and the Inselspital Bern University Hospital. Balance in the partnership, in terms of both benefit and cost to both partners, was a main principle that guided the development of the financial framework and the translation of the ASP into budgetary terms. The model builds on a number of assumptions and provides partnership management with a simple framework for monitoring and evaluating the progress of the partnership. In operationalizing an ASP, careful budgetary planning should be an integral part of the preparation and evaluation of the collaboration. The proposed Swiss and U.S. financial frameworks make it possible to do so. Outcomes of care can be improved with strong nursing service and academic partnerships. Sustaining such partnerships requires attention to financial and contractual arrangements.
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.
2016-12-01
The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor
2018-02-01
Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
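The surrogate-accelerated calibration loop can be sketched as follows; the "expensive model" is a toy function standing in for MODFLOW, and a simple polynomial fit replaces the BMARS surrogate, but the structure (train a cheap emulator on a few runs, then run MCMC against it) is the same:

```python
import numpy as np

# Sketch of surrogate-based Bayesian calibration (toy stand-ins throughout): an
# "expensive" model is replaced by a cheap surrogate fitted to a few training runs,
# and Metropolis-Hastings then samples the posterior using only the surrogate.
rng = np.random.default_rng(5)

def expensive_model(k):                              # toy stand-in for a head prediction
    return 3.0 * np.log(k) + 0.5 * k

# 1) a handful of training runs of the expensive model
k_train = np.linspace(0.5, 5.0, 15)
h_train = expensive_model(k_train)
coeffs = np.polyfit(k_train, h_train, deg=3)         # simple polynomial surrogate (not BMARS)
surrogate = lambda k: np.polyval(coeffs, k)

# 2) noisy observed head and a Gaussian log-posterior on the surrogate (flat prior in bounds)
k_true, sigma = 2.3, 0.1
h_obs = expensive_model(k_true) + rng.normal(0, sigma)
log_post = lambda k: (-0.5 * ((surrogate(k) - h_obs) / sigma) ** 2
                      if 0.5 < k < 5.0 else -np.inf)

# 3) Metropolis-Hastings driven entirely by the cheap surrogate
k, chain = 1.0, []
for _ in range(20_000):
    k_prop = k + rng.normal(0, 0.2)
    if np.log(rng.uniform()) < log_post(k_prop) - log_post(k):
        k = k_prop
    chain.append(k)
post = np.array(chain[5_000:])
print(f"posterior mean {post.mean():.2f} (true {k_true}), 95% interval "
      f"[{np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f}]")
```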
NASA Astrophysics Data System (ADS)
Vervatis, Vassilios; De Mey, Pierre; Ayoub, Nadia; Kailas, Marios; Sofianos, Sarantis
2017-04-01
The project entitled Stochastic Coastal/Regional Uncertainty Modelling (SCRUM) aims at strengthening CMEMS in the areas of ocean uncertainty quantification, ensemble consistency verification and ensemble data assimilation. The project has been initiated by the University of Athens and LEGOS/CNRS research teams, in the framework of CMEMS Service Evolution. The work is based on stochastic modelling of ocean physics and biogeochemistry in the Bay of Biscay, on an identical sub-grid configuration of the IBI-MFC system in its latest CMEMS operational version V2. In a first step, we use a perturbed tendencies scheme to generate ensembles describing uncertainties in open ocean and on the shelf, focusing on upper ocean processes. In a second step, we introduce two methodologies (i.e. rank histograms and array modes) aimed at checking the consistency of the above ensembles with respect to TAC data and arrays. Preliminary results highlight that wind uncertainties dominate all other atmosphere-ocean sources of model errors. The ensemble spread in medium-range ensembles is approximately 0.01 m for SSH and 0.15 °C for SST, though these values vary depending on season and cross shelf regions. Ecosystem model uncertainties emerging from perturbations in physics appear to be moderately larger than those perturbing the concentration of the biogeochemical compartments, resulting in total chlorophyll spread at about 0.01 mg.m-3. First consistency results show that the model ensemble and the pseudo-ensemble of OSTIA (L4) observation SSTs appear to exhibit nonzero joint probabilities with each other since error vicinities overlap. Rank histograms show that the model ensemble is initially under-dispersive, though results improve in the context of seasonal-range ensembles.
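The rank-histogram consistency check mentioned above can be sketched in a few lines; the synthetic ensemble below is deliberately under-dispersive to show how the diagnostic flags it:

```python
import numpy as np

# Minimal rank-histogram (Talagrand diagram) check of ensemble consistency: a flat
# histogram suggests the observations are statistically indistinguishable from
# ensemble members; a U-shaped one indicates an under-dispersive ensemble.
rng = np.random.default_rng(6)
n_cases, n_members = 2000, 20
truth = rng.normal(0.0, 1.0, n_cases)
# deliberately under-dispersive ensemble (spread 0.6 instead of 1.0)
ensemble = truth[:, None] + rng.normal(0.0, 0.6, (n_cases, n_members))

ranks = (ensemble < truth[:, None]).sum(axis=1)      # rank of each obs among the members
hist = np.bincount(ranks, minlength=n_members + 1)
print("rank histogram counts:", hist)                # high outer bins -> under-dispersion
```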
Mefford, Linda C; Alligood, Martha R
2011-11-01
To explore the influences of intensity of nursing care and consistency of nursing caregivers on health and economic outcomes using Levine's Conservation Model of Nursing as the guiding theoretical framework. Professional nursing practice models are increasingly being used although limited research is available regarding their efficacy. A structural equation modelling approach tested the influence of intensity of nursing care (direct care by professional nurses and patient-nurse ratio) and consistency of nursing caregivers on morbidity and resource utilization in a neonatal intensive care unit (NICU) setting using primary nursing. Consistency of nursing caregivers served as a powerful mediator of length of stay and the duration of mechanical ventilation, supplemental oxygen therapy and parenteral nutrition. Analysis of nursing intensity indicators revealed that a mix of professional nurses and assistive personnel was effective. Providing consistency of nursing caregivers may significantly improve both health and economic outcomes. New evidence was found to support the efficacy of the primary nursing model in the NICU. Designing nursing care delivery systems in acute inpatient settings with an emphasis on consistency of nursing caregivers could improve health outcomes, increase organizational effectiveness, and enhance satisfaction of nursing staff, patients, and families. © 2011 Blackwell Publishing Ltd.
Consistency of the Planck CMB data and ΛCDM cosmology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shafieloo, Arman; Hazra, Dhiraj Kumar, E-mail: shafieloo@kasi.re.kr, E-mail: dhiraj.kumar.hazra@apc.univ-paris7.fr
We test the consistency between Planck temperature and polarization power spectra and the concordance model of Λ Cold Dark Matter cosmology (ΛCDM) within the framework of Crossing statistics. We find that the Planck TT best fit ΛCDM power spectrum is completely consistent with the EE power spectrum data, while the EE best fit ΛCDM power spectrum is not consistent with the TT data. However, this does not point to any systematic or model-data discrepancy, since in the Planck EE data the uncertainties are much larger compared to the TT data. We also investigate the possibility of any deviation from the ΛCDM model by analyzing the Planck 2015 data. Results from the TT, TE and EE data analysis indicate that no deviation is required beyond the flexibility of the concordance ΛCDM model. Our analysis thus rules out any strong evidence for physics beyond the concordance model in the Planck spectra data. We also report a mild amplitude difference when comparing temperature and polarization data, where the temperature data seem to have slightly lower amplitude than expected (consistently at all multipoles), as we assume both temperature and polarization data are realizations of the same underlying cosmology.
Boden, Lisa A.; McKendrick, Iain J.
2017-01-01
Mathematical models are increasingly relied upon as decision support tools, which estimate risks and generate recommendations to underpin public health policies. However, there are no formal agreements about what constitutes professional competencies or duties in mathematical modeling for public health. In this article, we propose a framework to evaluate whether mathematical models that assess human and animal disease risks and control strategies meet standards consistent with ethical “good practice” and are thus “fit for purpose” as evidence in support of policy. This framework is derived from principles of biomedical ethics: independence, transparency (autonomy), beneficence/non-maleficence, and justice. We identify ethical risks associated with model development and implementation and consider the extent to which scientists are accountable for the translation and communication of model results to policymakers so that the strengths and weaknesses of the scientific evidence base and any socioeconomic and ethical impacts of biased or uncertain predictions are clearly understood. We propose principles to operationalize a framework for ethically sound model development and risk communication between scientists and policymakers. These include the creation of science–policy partnerships to mutually define policy questions and communicate results; development of harmonized international standards for model development; and data stewardship and improvement of the traceability and transparency of models via a searchable archive of policy-relevant models. Finally, we suggest that bespoke ethical advisory groups, with relevant expertise and access to these resources, would be beneficial as a bridge between science and policy, advising modelers of potential ethical risks and providing overview of the translation of modeling advice into policy. PMID:28424768
Fermion hierarchy from sfermion anarchy
Altmannshofer, Wolfgang; Frugiuele, Claudia; Harnik, Roni
2014-12-31
We present a framework to generate the hierarchical flavor structure of Standard Model quarks and leptons from loops of superpartners. The simplest model consists of the minimal supersymmetric standard model with tree level Yukawa couplings for the third generation only and anarchic squark and slepton mass matrices. Agreement with constraints from low energy flavor observables, in particular kaon mixing, is obtained for supersymmetric particles with masses at the PeV scale or above. In our framework both the second and the first generation fermion masses are generated at 1-loop. Despite this, a novel mechanism generates a hierarchy between the first and second generations without imposing a symmetry or small parameters. A second-to-first generation mass ratio of order 100 is typical. The minimal supersymmetric standard model thus includes all the necessary ingredients to realize a fermion spectrum that is qualitatively similar to observation, with hierarchical masses and mixing. The minimal framework produces only a few quantitative discrepancies with observation, most notably the muon mass is too low. Furthermore, we discuss simple modifications which resolve this and also investigate the compatibility of our model with gauge and Yukawa coupling unification.
Multiscale modelling for tokamak pedestals
NASA Astrophysics Data System (ADS)
Abel, I. G.
2018-04-01
Pedestal modelling is crucial to predict the performance of future fusion devices. Current modelling efforts suffer either from a lack of kinetic physics or from an excess of computational complexity. To ameliorate these problems, we take a first-principles multiscale approach to the pedestal. We will present three separate sets of equations, covering the dynamics of edge localised modes (ELMs), the inter-ELM pedestal and pedestal turbulence, respectively. Precisely how these equations should be coupled to each other is covered in detail. This framework is completely self-consistent; it is derived from first principles by means of an asymptotic expansion of the fundamental Vlasov-Landau-Maxwell system in appropriate small parameters. The derivation exploits the narrowness of the pedestal region, the smallness of the thermal gyroradius and the low plasma β (the ratio of thermal to magnetic pressures) typical of current pedestal operation to achieve its simplifications. The relationship between this framework and gyrokinetics is analysed, and possibilities to directly match our systems of equations onto multiscale gyrokinetics are explored. A detailed comparison between our model and other models in the literature is performed. Finally, the potential for matching this framework onto an open-field-line region is briefly discussed.
Salvini, G; Ligtenberg, A; van Paassen, A; Bregt, A K; Avitabile, V; Herold, M
2016-05-01
Finding land use strategies that merge land-based climate change mitigation measures and adaptation strategies is still an open issue in climate discourse. This article explores synergies and trade-offs between REDD+, a scheme that focuses mainly on mitigation through forest conservation, with "Climate Smart Agriculture", an approach that emphasizes adaptive agriculture. We introduce a framework for ex-ante assessment of the impact of land management policies and interventions and for quantifying their impacts on land-based mitigation and adaptation goals. The framework includes a companion modelling (ComMod) process informed by interviews with policymakers, local experts and local farmers. The ComMod process consists of a Role-Playing Game with local farmers and an Agent Based Model. The game provided a participatory means to develop policy and climate change scenarios. These scenarios were then used as inputs to the Agent Based Model, a spatially explicit model to simulate landscape dynamics and the associated carbon emissions over decades. We applied the framework using as case study a community in central Vietnam, characterized by deforestation for subsistence agriculture and cultivation of acacias as a cash crop. The main findings show that the framework is useful in guiding consideration of local stakeholders' goals, needs and constraints. Additionally the framework provided beneficial information to policymakers, pointing to ways that policies might be re-designed to make them better tailored to local circumstances and therefore more effective in addressing synergistically climate change mitigation and adaptation objectives. Copyright © 2015 Elsevier Ltd. All rights reserved.
Reinforcement Learning Using a Continuous Time Actor-Critic Framework with Spiking Neurons
Frémaux, Nicolas; Sprekeler, Henning; Gerstner, Wulfram
2013-01-01
Animals repeat rewarded behaviors, but the physiological basis of reward-based learning has only been partially elucidated. On one hand, experimental evidence shows that the neuromodulator dopamine carries information about rewards and affects synaptic plasticity. On the other hand, the theory of reinforcement learning provides a framework for reward-based learning. Recent models of reward-modulated spike-timing-dependent plasticity have made first steps towards bridging the gap between the two approaches, but faced two problems. First, reinforcement learning is typically formulated in a discrete framework, ill-adapted to the description of natural situations. Second, biologically plausible models of reward-modulated spike-timing-dependent plasticity require precise calculation of the reward prediction error, yet it remains to be shown how this can be computed by neurons. Here we propose a solution to these problems by extending the continuous temporal difference (TD) learning of Doya (2000) to the case of spiking neurons in an actor-critic network operating in continuous time, and with continuous state and action representations. In our model, the critic learns to predict expected future rewards in real time. Its activity, together with actual rewards, conditions the delivery of a neuromodulatory TD signal to itself and to the actor, which is responsible for action choice. In simulations, we show that such an architecture can solve a Morris water-maze-like navigation task, in a number of trials consistent with reported animal performance. We also use our model to solve the acrobot and the cartpole problems, two complex motor control tasks. Our model provides a plausible way of computing reward prediction error in the brain. Moreover, the analytically derived learning rule is consistent with experimental evidence for dopamine-modulated spike-timing-dependent plasticity. PMID:23592970
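A toy discretization of the continuous-time TD error at the core of the model (following the Doya 2000 formulation cited above) is sketched below; only the critic half is shown, with radial-basis features on a 1-D task, whereas the paper implements both actor and critic with spiking neurons and the same TD signal also modulates the actor's plasticity:

```python
import numpy as np

# Hedged sketch of continuous-time TD learning for the critic, discretized with step dt:
# delta = r - V/tau + dV/dt. A linear critic with radial-basis features learns the value
# of a deterministic 1-D drift toward a rewarded goal. Far simpler than the spiking
# actor-critic network of the paper.
rng = np.random.default_rng(7)
dt, tau, lr, n_features = 0.01, 1.0, 0.1, 21
centers = np.linspace(0.0, 1.0, n_features)
phi = lambda x: np.exp(-((x - centers) ** 2) / (2 * 0.05 ** 2))   # radial-basis features
w = np.zeros(n_features)

for episode in range(200):
    x = 0.0
    V_prev, phi_prev = w @ phi(x), phi(x)
    while x < 1.0:
        x = min(x + 0.5 * dt, 1.0)                  # drift toward the goal at x = 1
        r = 1.0 / dt if x >= 1.0 else 0.0           # impulse reward at the goal
        V = w @ phi(x)
        delta = r - V_prev / tau + (V - V_prev) / dt   # continuous-time TD error
        w += lr * delta * phi_prev * dt                # critic weight update
        V_prev, phi_prev = V, phi(x)

print("learned value near start vs. near goal:",
      round(float(w @ phi(0.05)), 2), round(float(w @ phi(0.95)), 2))
```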
Saa, Pedro; Nielsen, Lars K.
2015-01-01
Kinetic models provide the means to understand and predict the dynamic behaviour of enzymes upon different perturbations. Despite their obvious advantages, classical parameterizations require large amounts of data to fit their parameters. Particularly, enzymes displaying complex reaction and regulatory (allosteric) mechanisms require a great number of parameters and are therefore often represented by approximate formulae, thereby facilitating the fitting but ignoring many real kinetic behaviours. Here, we show that full exploration of the plausible kinetic space for any enzyme can be achieved using sampling strategies provided a thermodynamically feasible parameterization is used. To this end, we developed a General Reaction Assembly and Sampling Platform (GRASP) capable of consistently parameterizing and sampling accurate kinetic models using minimal reference data. The platform integrates the generalized MWC model and the elementary reaction formalism. By formulating the appropriate thermodynamic constraints, our framework enables parameterization of any oligomeric enzyme kinetics without sacrificing complexity or using simplifying assumptions. This thermodynamically safe parameterization relies on the definition of a reference state upon which feasible parameter sets can be efficiently sampled. Uniform sampling of the kinetics space enabled dissecting enzyme catalysis and revealing the impact of thermodynamics on reaction kinetics. Our analysis distinguished three reaction elasticity regions for common biochemical reactions: a steep linear region (0 > ΔGr > -2 kJ/mol), a transition region (-2 > ΔGr > -20 kJ/mol) and a constant elasticity region (ΔGr < -20 kJ/mol). We also applied this framework to model more complex kinetic behaviours such as the monomeric cooperativity of the mammalian glucokinase and the ultrasensitive response of the phosphoenolpyruvate carboxylase of Escherichia coli. In both cases, our approach appropriately described the kinetic behaviour of these enzymes and also provided insights into the particular features underpinning the observed kinetics. Overall, this framework will enable systematic parameterization and sampling of enzymatic reactions. PMID:25874556
ERIC Educational Resources Information Center
Hoe, Siu Loon; McShane, Steven
2010-01-01
Purpose: The topic of organizational learning is populated with many theories and models; many relate to the enduring organizational learning framework consisting of knowledge acquisition, knowledge dissemination, and knowledge use. However, most of the research either emphasizes structural knowledge acquisition and dissemination as a composite…
Solid Waste Management Planning--A Methodology
ERIC Educational Resources Information Center
Theisen, Hilary M.; And Others
1975-01-01
This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)
A Framework for the Specification of the Semantics and the Dynamics of Instructional Applications
ERIC Educational Resources Information Center
Buendia-Garcia, Felix; Diaz, Paloma
2003-01-01
An instructional application consists of a set of resources and activities to implement interacting, interrelated, and structured experiences oriented towards achieving specific educational objectives. The development of computer-based instructional applications has to follow a well defined process, so models for computer-based instructional…
Semiotic Mediation within an AT Frame
ERIC Educational Resources Information Center
Maracci, Mirko; Mariotti, Maria Alessandra
2013-01-01
This article is meant to present a specific elaboration of the notion of mediation in relation to the use of artefacts to enhance mathematics teaching and learning: the elaboration offered by the Theory of Semiotic Mediation. In particular, it provides an explicit model--consistent with the activity-actions-operations framework--of the actions…
Tuomisto, Hanna L.; Scheelbeek, Pauline F.D.; Chalabi, Zaid; Green, Rosemary; Smith, Richard D.; Haines, Andy; Dangour, Alan D.
2017-01-01
Environmental changes are likely to affect agricultural production over the next decades. The interactions between environmental change, agricultural yields and crop quality, and the critical pathways to future diets and health outcomes are largely undefined. There are currently no quantitative models to test the impact of multiple environmental changes on nutrition and health outcomes. Using an interdisciplinary approach, we developed a framework to link the multiple interactions between environmental change, agricultural productivity and crop quality, population-level food availability, dietary intake and health outcomes, with a specific focus on fruits and vegetables. The main components of the framework consist of: i) socio-economic and societal factors, ii) environmental change stressors, iii) interventions and policies, iv) food system activities, v) food and nutrition security, and vi) health and well-being outcomes. The framework, based on currently available evidence, provides an overview of the multidimensional and complex interactions with feedback between environmental change, production of fruits and vegetables, diets and health, and forms the analytical basis for future modelling and scenario testing. PMID:29511740
Jackson, Tracie R.; Fenelon, Joseph M.
2018-05-31
This report identifies water-level trends in wells and provides a conceptual framework that explains the hydrologic stresses and factors causing the trends in the Pahute Mesa–Oasis Valley (PMOV) groundwater basin, southern Nevada. Water levels in 79 wells were analyzed for trends between 1966 and 2016. The magnitude and duration of water-level responses to hydrologic stresses were analyzed graphically, statistically, and with water-level models. The conceptual framework consists of multiple stress-specific conceptual models to explain water-level responses to the following hydrologic stresses: recharge, evapotranspiration, pumping, nuclear testing, and wellbore equilibration. Dominant hydrologic stresses affecting water-level trends in each well were used to categorize trends as nonstatic, transient, or steady state. The conceptual framework of water-level responses to hydrologic stresses and trend analyses provide a comprehensive understanding of the PMOV basin and vicinity. The trend analysis links water-level fluctuations in wells to hydrologic stresses and potential factors causing the trends. Transient and steady-state trend categorizations can be used to determine the appropriate water-level data for groundwater studies.
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation constitutes a generalization also for the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which to measure the length of a trajectory in state space. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to the Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Goebel, Kai
2014-01-01
This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics that can rigorously account for the various sources of uncertainty is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
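The two prognostic steps described above (Bayesian state tracking followed by prediction to failure) can be sketched as follows. This is a hedged toy example, not the rover power-system model of the paper: a scalar degradation state is tracked with a simple particle filter and then propagated by Monte Carlo until a failure threshold, giving a remaining-useful-life distribution. The degradation model, noise levels and threshold are illustrative assumptions.

```python
import numpy as np

# Step 1: Bayesian tracking of a scalar health state with a toy particle filter.
# Step 2: Monte Carlo propagation of each particle to a failure threshold,
# yielding a remaining-useful-life (RUL) distribution. All numbers are placeholders.

np.random.seed(0)
n_particles, threshold = 1000, 0.2
particles = np.random.normal(1.0, 0.01, n_particles)   # initial health state
weights = np.ones(n_particles) / n_particles

def propagate(x):
    return x - 0.01 + 0.002 * np.random.randn(*x.shape)  # toy degradation model

# Step 1: assimilate a few noisy health observations
for z in [0.98, 0.97, 0.95, 0.94]:
    particles = propagate(particles)
    weights *= np.exp(-0.5 * ((z - particles) / 0.01) ** 2)
    weights /= weights.sum()
    idx = np.random.choice(n_particles, n_particles, p=weights)  # resample
    particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

# Step 2: propagate each particle until it crosses the failure threshold
rul = np.zeros(n_particles)
state = particles.copy()
alive = np.ones(n_particles, dtype=bool)
for k in range(1, 500):
    state[alive] = propagate(state[alive])
    just_failed = alive & (state <= threshold)
    rul[just_failed] = k
    alive &= ~just_failed
print("median RUL (cycles):", np.median(rul[rul > 0]))
```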
NASA Astrophysics Data System (ADS)
Wong, John-Michael; Stojadinovic, Bozidar
2005-05-01
A framework has been defined for storing and retrieving civil infrastructure monitoring data over a network. The framework consists of two primary components: metadata and network communications. The metadata component provides the descriptions and data definitions necessary for cataloging and searching monitoring data. The communications component provides Java classes for remotely accessing the data. Packages of Enterprise JavaBeans and data handling utility classes are written to use the underlying metadata information to build real-time monitoring applications. The utility of the framework was evaluated using wireless accelerometers on a shaking table earthquake simulation test of a reinforced concrete bridge column. The NEESgrid data and metadata repository services were used as a backend storage implementation. A web interface was created to demonstrate the utility of the data model and provides an example health monitoring application.
NASA Astrophysics Data System (ADS)
Corvo, Arthur Francis
Given the reality that active and competitive participation in the 21st century requires American students to deepen their scientific and mathematical knowledge base, the National Research Council (NRC) proposed a new conceptual framework for K-12 science education. The framework consists of an integration of what the NRC report refers to as the three dimensions: scientific and engineering practices, crosscutting concepts, and core ideas in four disciplinary areas (physical, life and earth/space sciences, and engineering/technology). The Next Generation Science Standards (NGSS), which are derived from this new framework, were released in April 2013 and have implications for teacher learning and development in Science, Technology, Engineering, and Mathematics (STEM). Given the NGSS's recent introduction, there is little research on how teachers can prepare for its release. To meet this research need, I implemented a self-study aimed at examining my teaching practices and classroom outcomes through the lens of the NRC's conceptual framework and the NGSS. The self-study employed design-based research (DBR) methods to investigate what happened in my secondary classroom when I designed, enacted, and reflected on units of study for my science, engineering, and mathematics classes. I utilized various best practices including Learning for Use (LfU) and Understanding by Design (UbD) models for instructional design, talk moves as a tool for promoting discourse, and modeling instruction for these designed units of study. The DBR strategy was chosen to promote reflective cycles, which are consistent with and in support of the self-study framework. A multiple case, mixed-methods approach was used for data collection and analysis. The findings in the study are reported by study phase in terms of unit planning, unit enactment, and unit reflection. The findings have implications for science teaching, teacher professional development, and teacher education.
Evaluating data worth for ground-water management under uncertainty
Wagner, B.J.
1999-01-01
A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
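The four-step data-worth loop summarized above can be written schematically. In the sketch below the chance-constrained management model and the integer-programming network design model are replaced by placeholder functions, and the costs and uncertainty-reduction curve are invented for illustration; only the structure of steps 1-4 over a series of budgets follows the abstract.

```python
import numpy as np

# Schematic of the four-step data-worth loop; the two optimization models
# are placeholder functions standing in for the chance-constrained
# management model (steps 1 and 3) and the integer-programming network
# design model (step 2). All numbers are illustrative.

budgets = [0, 10_000, 20_000, 50_000]          # candidate sampling budgets ($)
sigma0 = 0.5                                   # present level of model uncertainty

def management_cost(uncertainty):              # placeholder management model
    return 1.0e6 * (1.0 + uncertainty)

def design_network(budget, uncertainty):       # placeholder network design model
    reduction = 0.4 * (1.0 - np.exp(-budget / 25_000.0))
    return uncertainty * (1.0 - reduction)

cost0 = management_cost(sigma0)                # step 1: strategy at current uncertainty
alternatives = []
for b in budgets:
    sigma_b = design_network(b, sigma0)        # step 2: projected post-sampling uncertainty
    cost_b = management_cost(sigma_b)          # step 3: recomputed management strategy
    worth = (cost0 - cost_b) - b               # step 4: value of information minus its cost
    alternatives.append((b, worth))
print(max(alternatives, key=lambda t: t[1]))   # budget with greatest net benefit
```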
Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas
2008-01-01
The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of many years of experience of ICW in the development and use of in-house health care solutions and represents the foundation of ICW product developments based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been leveraged to allow development by external partners - enabling adopters a straightforward integration into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, architectural guidelines and conventions defining a full software-development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.
A unifying retinex model based on non-local differential operators
NASA Astrophysics Data System (ADS)
Zosso, Dominique; Tran, Giang; Osher, Stanley
2013-02-01
In this paper, we present a unifying framework for retinex that is able to reproduce many of the existing retinex implementations within a single model. The fundamental assumption, as shared with many retinex models, is that the observed image is a multiplication between the illumination and the true underlying reflectance of the object. Starting from Morel's 2010 PDE model for retinex, where illumination is supposed to vary smoothly and where the reflectance is thus recovered from a hard-thresholded Laplacian of the observed image in a Poisson equation, we define our retinex model in two similar but more general steps. First, we look for a filtered gradient that is the solution of an optimization problem consisting of two terms: the first term is a sparsity prior of the reflectance, such as the TV or H1 norm, while the second term is a quadratic fidelity prior of the reflectance gradient with respect to the observed image gradients. In a second step, since this filtered gradient is almost certainly not a consistent image gradient, we look for a reflectance whose actual gradient comes close. Beyond unifying existing models, we are able to derive entirely novel retinex formulations by using more interesting non-local versions of the sparsity and fidelity priors. Hence we define within a single framework new retinex instances particularly suited for texture-preserving shadow removal, cartoon-texture decomposition, and color and hyperspectral image enhancement.
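The Morel-style first step referenced above, recovering reflectance from a hard-thresholded Laplacian of the observed image via a Poisson equation, can be sketched in a few lines. The snippet below uses an FFT Poisson solver with periodic boundaries and an illustrative threshold; the paper's more general non-local sparsity and fidelity priors are not shown.

```python
import numpy as np

# Minimal sketch of the hard-thresholded-Laplacian + Poisson step:
# keep only large Laplacian values (edges) of the log image, then
# invert the Laplacian spectrally. Threshold and usage are illustrative.

def retinex_poisson(log_img, t=0.05):
    lap = (np.roll(log_img, 1, 0) + np.roll(log_img, -1, 0) +
           np.roll(log_img, 1, 1) + np.roll(log_img, -1, 1) - 4 * log_img)
    lap[np.abs(lap) < t] = 0.0                     # hard threshold: keep edges only
    H, W = log_img.shape
    ky, kx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    denom = 2 * np.cos(2 * np.pi * ky / H) + 2 * np.cos(2 * np.pi * kx / W) - 4
    denom[0, 0] = 1.0                              # avoid division by zero at DC
    refl_hat = np.fft.fft2(lap) / denom
    refl_hat[0, 0] = 0.0                           # fix the free additive constant
    return np.real(np.fft.ifft2(refl_hat))

# usage (placeholder image): reflectance = retinex_poisson(np.log1p(image.astype(float)))
```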
A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms
NASA Astrophysics Data System (ADS)
Hassan, Ahmed A.; Bahgat, Waleed M.
2010-01-01
Security policies have different components; firewall, active directory, and IDS are some examples of these components. Enforcing network security policies through low level security mechanisms faces some essential difficulties. Consistency, verification, and maintenance are chief among these difficulties. One approach to overcoming these difficulties is to automate the process of translating a high level security policy into low level security mechanisms. This paper introduces a framework for such an automated translation process. The framework is described in terms of three phases: in the first phase, all network assets are categorized according to their roles in network security and the relations between them are identified to constitute the network security model. This proposed model is based on organization based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies such as auditing policy. Besides, the proposed model enables matching of each rule of the high level security policy with the corresponding rules of the low level security policy. In the second phase of the proposed framework, the high level security policy is mapped onto the network security model; this phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
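A toy illustration of the three-phase idea described above: a high-level rule expressed over asset roles (phase 1), mapped onto concrete assets (phase 2), and emitted as low-level firewall-style entries (phase 3). The rule syntax, asset names and output format below are hypothetical and far simpler than the OrBAC-based model of the paper.

```python
# Phase 1: categorize assets by their role in network security (placeholder data).
assets = {
    "web_servers": ["10.0.1.10", "10.0.1.11"],
    "db_servers": ["10.0.2.20"],
}
# High-level policy written over roles (hypothetical syntax).
policy = [
    {"allow": ("web_servers", "db_servers"), "port": 5432},
    {"deny": ("any", "db_servers"), "port": "any"},
]

def translate(policy, assets):
    """Phases 2-3: map role-based rules onto assets and emit low-level entries."""
    rules = []
    for p in policy:
        action = "allow" if "allow" in p else "deny"
        src_role, dst_role = p.get("allow", p.get("deny"))
        srcs = assets.get(src_role, ["0.0.0.0/0"])   # unknown role -> any host
        dsts = assets.get(dst_role, ["0.0.0.0/0"])
        for s in srcs:
            for d in dsts:
                rules.append(f"{action} from {s} to {d} port {p['port']}")
    return rules

for line in translate(policy, assets):
    print(line)
```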
NASA Astrophysics Data System (ADS)
Dodov, B.
2017-12-01
Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (in terms of downscaled reanalysis 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs is then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models chosen according to the TC characteristics at a given moment in time is concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with a non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with the observed regional climate and visually indistinguishable from high resolution NWP output. The framework is used to simulate a catalog of 10000 typhoon seasons implemented in a flood risk model for Japan.
Polarizable atomic multipole-based force field for DOPC and POPE membrane lipids
NASA Astrophysics Data System (ADS)
Chu, Huiying; Peng, Xiangda; Li, Yan; Zhang, Yuebin; Min, Hanyi; Li, Guohui
2018-04-01
A polarizable atomic multipole-based force field for the membrane bilayer models 1,2-dioleoyl-phosphocholine (DOPC) and 1-palmitoyl-2-oleoyl-phosphatidylethanolamine (POPE) has been developed. The force field adopts the same framework as the Atomic Multipole Optimized Energetics for Biomolecular Applications (AMOEBA) model, in which the charge distribution of each atom is represented by the permanent atomic monopole, dipole and quadrupole moments. Many-body polarization including the inter- and intra-molecular polarization is modelled in a consistent manner with distributed atomic polarizabilities. The van der Waals parameters were first transferred from existing AMOEBA parameters for small organic molecules and then optimised by fitting to ab initio intermolecular interaction energies between models and a water molecule. Molecular dynamics simulations of the two aqueous DOPC and POPE membrane bilayer systems, consisting of 72 model molecules, were then carried out to validate the force field parameters. Membrane width, area per lipid, volume per lipid, deuterium order parameters, electron density profile, etc. were consistent with experimental values.
Entropy corrected holographic dark energy models in modified gravity
NASA Astrophysics Data System (ADS)
Jawad, Abdul; Azhar, Nadeem; Rani, Shamaila
We consider the power law and the entropy corrected holographic dark energy (HDE) models with Hubble horizon in the dynamical Chern-Simons modified gravity. We explore various cosmological parameters and planes in this framework. The Hubble parameter lies within the consistent range at the present and later epochs for both entropy corrected models. The deceleration parameter explains the accelerated expansion of the universe. The equation of state (EoS) parameter corresponds to the quintessence and cold dark matter (ΛCDM) limits. The ωΛ-ωΛ′ plane approaches the ΛCDM limit and the freezing region in both entropy corrected models. The statefinder parameters are consistent with the ΛCDM limit and dark energy (DE) models. The generalized second law of thermodynamics remains valid in all cases of the interacting parameter. It is interesting to mention here that our results for the Hubble and EoS parameters and the ωΛ-ωΛ′ plane show consistency with present observations such as Planck, WP, BAO, H0, SNLS and nine-year WMAP.
Martin, Jordan S; Suarez, Scott A
2017-08-01
Interest in quantifying consistent among-individual variation in primate behavior, also known as personality, has grown rapidly in recent decades. Although behavioral coding is the most frequently utilized method for assessing primate personality, limitations in current statistical practice prevent researchers from utilizing the full potential of their coding datasets. These limitations include the use of extensive data aggregation, not modeling biologically relevant sources of individual variance during repeatability estimation, not partitioning between-individual (co)variance prior to modeling personality structure, the misuse of principal component analysis, and an over-reliance upon exploratory statistical techniques to compare personality models across populations, species, and data collection methods. In this paper, we propose a statistical framework for primate personality research designed to address these limitations. Our framework synthesizes recently developed mixed-effects modeling approaches for quantifying behavioral variation with an information-theoretic model selection paradigm for confirmatory personality research. After detailing a multi-step analytic procedure for personality assessment and model comparison, we employ this framework to evaluate seven models of personality structure in zoo-housed bonobos (Pan paniscus). We find that differences between sexes, ages, zoos, time of observation, and social group composition contributed to significant behavioral variance. Independently of these factors, however, personality nonetheless accounted for a moderate to high proportion of variance in average behavior across observational periods. A personality structure derived from past rating research receives the strongest support relative to our model set. This model suggests that personality variation across the measured behavioral traits is best described by two correlated but distinct dimensions reflecting individual differences in affiliation and sociability (Agreeableness) as well as activity level, social play, and neophilia toward non-threatening stimuli (Openness). These results underscore the utility of our framework for quantifying personality in primates and facilitating greater integration between the behavioral ecological and comparative psychological approaches to personality research. © 2017 Wiley Periodicals, Inc.
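One step of the procedure described above, estimating adjusted repeatability from a mixed-effects model with biologically relevant fixed effects and an individual-level random intercept, can be sketched with standard tools. The example below uses simulated placeholder data and statsmodels; the column names, covariates and simulated effect sizes are illustrative assumptions, not the bonobo dataset or full model set of the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate placeholder coding data: 30 individuals, 10 observations each,
# with a sex effect and a true individual-level (personality) effect.
rng = np.random.default_rng(1)
n_id, n_obs = 30, 10
ids = np.repeat(np.arange(n_id), n_obs)
sex = np.repeat(rng.integers(0, 2, n_id), n_obs)
indiv_effect = np.repeat(rng.normal(0, 1.0, n_id), n_obs)
behaviour = 0.5 * sex + indiv_effect + rng.normal(0, 1.0, n_id * n_obs)
df = pd.DataFrame({"behaviour": behaviour, "sex": sex, "id": ids})

# Mixed model: fixed effect for sex, random intercept per individual.
model = smf.mixedlm("behaviour ~ sex", df, groups=df["id"]).fit()
var_id = model.cov_re.iloc[0, 0]        # between-individual variance
var_res = model.scale                   # residual (within-individual) variance
print("adjusted repeatability:", var_id / (var_id + var_res))
```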
A grain boundary damage model for delamination
NASA Astrophysics Data System (ADS)
Messner, M. C.; Beaudoin, A. J.; Dodds, R. H.
2015-07-01
Intergranular failure in metallic materials represents a multiscale damage mechanism: some feature of the material microstructure triggers the separation of grain boundaries on the microscale, but the intergranular fractures develop into long cracks on the macroscale. This work develops a multiscale model of grain boundary damage for modeling intergranular delamination—a failure of one particular family of grain boundaries sharing a common normal direction. The key feature of the model is a physically-consistent and mesh independent, multiscale scheme that homogenizes damage at many grain boundaries on the microscale into a single damage parameter on the macroscale to characterize material failure across a plane. The specific application of the damage framework developed here considers delamination failure in modern Al-Li alloys. However, the framework may be readily applied to other metals or composites and to other non-delamination interface geometries—for example, multiple populations of material interfaces with different geometric characteristics.
A Tale of Two Trails: Exploring Different Paths to Success
Walker, Jennifer G.; Evenson, Kelly R.; Davis, William J.; Bors, Philip; Rodríguez, Daniel A.
2016-01-01
Background: This comparative case study investigates 2 successful community trail initiatives, using the Active Living By Design (ALBD) Community Action Model as an analytical framework. The model includes 5 strategies: preparation, promotion, programs, policy, and physical projects. Methods: Key stakeholders at 2 sites participated in in-depth interviews (N = 14). Data were analyzed for content using Atlas Ti and grouped according to the 5 strategies. Results: Preparation: Securing trail resources was challenging, but shared responsibilities facilitated trail development. Promotions: The initiatives demonstrated minimal physical activity encouragement strategies. Programs: Community stakeholders did not coordinate programmatic opportunities for routine physical activity. Policy: Trails’ inclusion in regional greenway master plans contributed to trail funding and development. Policies that were formally institutionalized and enforced led to more consistent trail construction and safer conditions for users. Physical Projects: Consistent standards for way finding signage and design safety features enhanced trail usability and safety. Conclusions: Communities with different levels of government support contributed unique lessons to inform best practices of trail initiatives. This study revealed a disparity between trail development and use-encouragement strategies, which may limit trails’ impact on physical activity. The ALBD Community Action Model provided a viable framework to structure cross-disciplinary community trail initiatives. PMID:21597125
NASA Astrophysics Data System (ADS)
Boyer, Elisebeth C.
This research investigates how three preservice elementary teachers were prepared to teach science using a Discursive Model of Meaning Making. The research is divided into two parts. The first consists of the nature of the participants’ learning experiences in a science methods course within a school-university Professional Development School partnership. This part of the investigation used Constant Comparative Analysis of field notes gathered through participant observation of the methods course. The analysis investigated how the methods instructors employed productive questioning, talk moves, and a coherent research based Teaching Science as Argument Framework. The second part of the study consisted of an investigation into how the participants applied what they experienced during the methods course in their initial science teaching experiences, as well as how the participants made sense of their initial science teaching. Data consisted of teaching videos of the participants during their initial science teaching experiences and self-analysis videos created by the participants. This part of the research used Discourse Analysis of the teaching and self-analysis videos. These inquiries provide insight into what aspects of the methods course were taken up by the participants and how they made sense of their practices. Findings are: 1) Throughout the methods course, instructors modeled how the Teaching Science as Argument Framework can be used to negotiate scientific understanding by employing a Discursive Model of Meaning Making. 2) During lesson plan conferences the Discursive Model was emphasized as participants planned classroom discussion and explored possible student responses enabling them to anticipate how they could attempt to increase student understanding. 3) Participants displayed three distinct patterns of adoption of the Teaching Science as Argument Framework (TSAF), involving different discursive practices. They were, • Detached Discursive Approach: Use of some discursive strategies without an apparent connection to the TSAF. • Connected Approach with a Focus on Student Thinking: Intentional use of the Discursive Model informed by aspects of the TSAF. • TSAF Approach: Priority is given to the TSAF supported by substantial application of the Discursive Model. 4) The evidence participants chose to highlight in their self-analysis videos is reflective of their patterns of adoption of the Teaching Science as Argument Framework and their differing discursive practices. Analysis led to the formation of the middle theory that when learning to teach science in the elementary school, teacher commitment to the discourse and practices of science is constructed through participation in a learning community where a discursive model of meaning making is the norm. Curricular and methodological implications, as well as implications for future research are presented.
Wave fluctuations in the system with some Yang-Mills condensates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokhorov, G., E-mail: zhoraprox@yandex.ru; Pasechnik, R., E-mail: Roman.Pasechnik@thep.lu.se; Vereshkov, G., E-mail: gveresh@gmail.com
2016-12-15
The self-consistent dynamics of non-homogeneous fluctuations and a homogeneous and isotropic condensate of Yang–Mills fields were investigated in the zero, linear and quasilinear approximations over the wave modes, in the framework of the N = 4 supersymmetric model in the Hamilton gauge within quasiclassical theory. The models with SU(2), SU(3) and SU(4) gauge groups were considered. The particle production effect and the generation of longitudinal oscillations were obtained.
High-End Climate Science: Development of Modeling and Related Computing Capabilities
2000-12-01
toward strengthening research on key scientific issues. The Program has supported research that has led to substantial increases in knowledge, improved...provides overall direction and executive oversight of the USGCRP. Within this framework, agencies manage and coordinate Federally supported scientific...critical for the U.S. Global Change Research Program. Such models can be used to look backward to test the consistency of our knowledge of Earth system
Promoting compliance: the patient-provider partnership.
Wilson, B M
1995-07-01
Compliance has been defined traditionally in terms of how well a patient follows through with the recommendations of a health care provider. Patient education has often consisted of a one-way communication of provider to patient. This article advocates a multifaceted approach to compliance issues in which patients and health care providers set mutually agreed upon treatment goals. These goals must be consistent with patients' priorities and lifestyles. Patient compliance issues are examined in the context of three theoretical frameworks: (1) the Health-Belief Model, (2) Locus of Control Theory, and (3) Piaget's Theory of Cognitive Development. The insights gained from these models are then used to provide practical suggestions for enhancing compliance.
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2017-05-01
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues on usage of simple heuristics and psychological processes are discussed. Copyright © 2016 Cognitive Science Society, Inc.
Water quality, health, and human occupations.
Blakeney, Anne B; Marshall, Amy
2009-01-01
To introduce evidence of the critical link between water quality and human occupations. A participatory action research design was used to complete a three-phase project. Phase 1 included mapping the watershed of Letcher County, Kentucky. Phase 2 consisted of surveying 122 Letcher County health professionals. Phase 3, the primary focus of this article, consisted of interviews with Letcher County adults regarding their lived experiences with water. The Occupational Therapy Practice Framework: Domain and Process (American Occupational Therapy Association, 2002) was used to structure questions. The Model of Occupational Justice provided the theoretical framework for presentation of the results. The watershed in Letcher County, Kentucky, is polluted as a result of specific coal mining practices and a lack of adequate infrastructure. As a result, citizens experience occupational injustice in the forms of occupational imbalance, occupational deprivation, and occupational alienation.
An architecture for consolidating multidimensional time-series data onto a common coordinate grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shippert, Tim; Gaustad, Krista
2016-12-16
Consolidating measurement data for use by data models or in inter-comparison studies frequently requires transforming the data onto a common grid. Standard methods for interpolating multidimensional data are often not appropriate for data with non-homogenous dimensionality, and are hard to implement in a consistent manner for different datastreams. In addition, these challenges are increased when dealing with the automated procedures necessary for use with continuous, operational datastreams. In this paper we introduce a method of applying a series of one-dimensional transformations to merge data onto a common grid, examine the challenges of ensuring consistent application of data consolidation methods, present a framework for addressing those challenges, and describe the implementation of such a framework for the Atmospheric Radiation Measurement (ARM) program.
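The "series of one-dimensional transformations" idea described above can be illustrated by regridding a two-dimensional (time x height) field one axis at a time. The sketch below only shows the dimension-by-dimension interpolation; the averaging options, QC-flag handling and datastream metadata of the ARM implementation are not represented, and the grids and field are placeholders.

```python
import numpy as np

# Regrid a 2-D field onto a common grid by applying a 1-D transformation
# (here simple linear interpolation) along one axis at a time.

def regrid_1d(values, old_coord, new_coord, axis):
    values = np.moveaxis(values, axis, -1)
    out = np.apply_along_axis(lambda v: np.interp(new_coord, old_coord, v), -1, values)
    return np.moveaxis(out, -1, axis)

t_old, z_old = np.linspace(0, 24, 49), np.linspace(0, 2000, 20)
field = np.random.rand(t_old.size, z_old.size)          # placeholder datastream
t_new, z_new = np.linspace(0, 24, 25), np.linspace(0, 2000, 50)

step1 = regrid_1d(field, t_old, t_new, axis=0)          # transform the time axis first
common = regrid_1d(step1, z_old, z_new, axis=1)         # then the height axis
print(common.shape)                                      # (25, 50)
```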
Drag Reduction of an Airfoil Using Deep Learning
NASA Astrophysics Data System (ADS)
Jiang, Chiyu; Sun, Anzhu; Marcus, Philip
2017-11-01
We reduced the drag of a 2D airfoil, starting from a NACA-0012 airfoil, using deep learning methods. We created a database consisting of simulations of 2D external flow over randomly generated shapes. We then developed a machine learning framework for inferring the external flow field given an input shape. Past work that utilized machine learning in Computational Fluid Dynamics focused on estimating specific flow parameters; this work is novel in inferring entire flow fields. We further showed that learned flow patterns are transferable to cases that share certain similarities. This study illustrates the prospects of deeper integration of data-based modeling into current CFD simulation frameworks for faster flow inference and more accurate flow modeling.
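A hedged sketch of the kind of model the abstract describes: a small convolutional encoder-decoder that maps a binary shape mask to a two-component 2-D flow field. The architecture, sizes and single training step below are illustrative assumptions, not the authors' network or training setup.

```python
import torch
import torch.nn as nn

# Toy shape-to-flow-field network: encode a 1-channel shape mask,
# decode two channels (u, v velocity components) at the same resolution.

class ShapeToFlow(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 2, 4, stride=2, padding=1),   # u, v fields
        )

    def forward(self, mask):
        return self.decoder(self.encoder(mask))

model = ShapeToFlow()
mask = torch.zeros(8, 1, 64, 64)          # batch of shape masks (placeholder)
target = torch.zeros(8, 2, 64, 64)        # simulated flow fields (placeholder)
loss = nn.functional.mse_loss(model(mask), target)
loss.backward()                            # one illustrative training step
```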
Palatini actions and quantum gravity phenomenology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olmo, Gonzalo J., E-mail: gonzalo.olmo@csic.es
2011-10-01
We show that an invariant and universal length scale can be consistently introduced in a generally covariant theory through the gravitational sector using the Palatini approach. The resulting theory is able to capture different aspects of quantum gravity phenomenology in a single framework. In particular, it is found that in this theory field excitations propagating with different energy-densities perceive different background metrics, which is a fundamental characteristic of the DSR and Rainbow Gravity approaches. We illustrate these properties with a particular gravitational model and explicitly show how the soccer ball problem is avoided in this framework. The isotropic and anisotropic cosmologies of this model also avoid the big bang singularity by means of a big bounce.
Computing decay rates for new physics theories with FEYNRULES and MADGRAPH 5_AMC@NLO
NASA Astrophysics Data System (ADS)
Alwall, Johan; Duhr, Claude; Fuks, Benjamin; Mattelaer, Olivier; Öztürk, Deniz Gizem; Shen, Chia-Hsien
2015-12-01
We present new features of the FEYNRULES and MADGRAPH 5_AMC@NLO programs for the automatic computation of decay widths that consistently include channels of arbitrary final-state multiplicity. The implementations are generic enough so that they can be used in the framework of any quantum field theory, possibly including higher-dimensional operators. We extend at the same time the conventions of the Universal FEYNRULES Output (or UFO) format to include decay tables and information on the total widths. We finally provide a set of representative examples of the usage of the new functions of the different codes in the framework of the Standard Model, the Higgs Effective Field Theory, the Strongly Interacting Light Higgs model and the Minimal Supersymmetric Standard Model and compare the results to available literature and programs for validation purposes.
Mineralogic Model (MM3.0) Analysis Model Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. Lum
2002-02-12
The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed stratigraphy and structural features of the site into a 3-D model that will be useful in primary downstream models and repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential nuclear waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for a repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 1. The lateral boundaries of the ISM and its three component models are shown in Figure 2.
Program and Project Management Framework
NASA Technical Reports Server (NTRS)
Butler, Cassandra D.
2002-01-01
The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices, KSC standards, policies, and techniques, Systems Management Office (SMO) personnel, and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.
Theory and Practice of Pediatric Bioethics.
Ross, Lainie Friedman
2016-01-01
This article examines two typical bioethics frameworks: the "Four Principles" by Beauchamp and Childress, and the "Four Boxes" by Jonsen, Siegler, and Winslade. I show how they are inadequate to address the ethical issues raised by pediatrics, in part because they do not pay adequate attention to families. I then consider an alternate framework proposed by Buchanan and Brock that focuses on four questions that must be addressed for the patient who lacks decisional capacity. This model also does not give adequate respect for the family, particularly the intimate family. I then describe my own framework, which provides answers to Buchanan and Brock's four questions in a way that is consistent with the intimate family and its need for protection from state intervention.
NASA Astrophysics Data System (ADS)
Mousavi Nezhad, Mohaddeseh; Fisher, Quentin J.; Gironacci, Elia; Rezania, Mohammad
2018-06-01
Reliable prediction of the fracture process in shale-gas rocks remains one of the most significant challenges for establishing sustained economic oil and gas production. This paper presents a modeling framework for simulation of crack propagation in heterogeneous shale rocks. The framework is based on a variational approach consistent with Griffith's theory. The modeling framework is used to reproduce the fracture propagation process in shale rock samples under standard Brazilian disk test conditions. Data collected from the experiments are employed to determine the testing specimens' tensile strength and fracture toughness. To incorporate the effects of shale formation heterogeneity in the simulation of crack paths, fracture properties of the specimens are defined as spatially random fields. A computational strategy based on stochastic finite element theory is developed that allows the effects of heterogeneity of shale rocks on the fracture evolution to be incorporated. A parametric study has been carried out to better understand how anisotropy and heterogeneity of the mechanical properties affect both the direction of cracks and the rock strength.
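One ingredient of the framework described above, assigning spatially correlated random fracture properties over a specimen, can be sketched as follows. The snippet generates a correlated Gaussian random field by filtering white noise in Fourier space and scales it to a hypothetical fracture-energy mean and standard deviation; the correlation length and statistics are illustrative, and the variational fracture solver and stochastic finite-element coupling themselves are not shown.

```python
import numpy as np

# Generate a spatially correlated random field on a regular grid by
# smoothing white noise with a Gaussian kernel in Fourier space, then
# rescale it to a target mean and standard deviation.

def correlated_field(shape, corr_len, mean, std, seed=0):
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(shape)
    ky = np.fft.fftfreq(shape[0])[:, None]
    kx = np.fft.fftfreq(shape[1])[None, :]
    kernel = np.exp(-0.5 * (corr_len ** 2) * ((2 * np.pi) ** 2) * (kx ** 2 + ky ** 2))
    field = np.real(np.fft.ifft2(np.fft.fft2(noise) * kernel))
    field = (field - field.mean()) / field.std()          # re-standardize
    return mean + std * field

# Hypothetical fracture energy field (values in J/m^2 are placeholders).
Gc = correlated_field((128, 128), corr_len=8.0, mean=50.0, std=10.0)
print(Gc.mean(), Gc.std())
```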
NASA Astrophysics Data System (ADS)
Bassiouni, Maoya; Higgins, Chad W.; Still, Christopher J.; Good, Stephen P.
2018-06-01
Vegetation controls on soil moisture dynamics are challenging to measure and translate into scale- and site-specific ecohydrological parameters for simple soil water balance models. We hypothesize that empirical probability density functions (pdfs) of relative soil moisture or soil saturation encode sufficient information to determine these ecohydrological parameters. Further, these parameters can be estimated through inverse modeling of the analytical equation for soil saturation pdfs, derived from the commonly used stochastic soil water balance framework. We developed a generalizable Bayesian inference framework to estimate ecohydrological parameters consistent with empirical soil saturation pdfs derived from observations at point, footprint, and satellite scales. We applied the inference method to four sites with different land cover and climate assuming (i) an annual rainfall pattern and (ii) a wet season rainfall pattern with a dry season of negligible rainfall. The Nash-Sutcliffe efficiencies of the analytical model's fit to soil observations ranged from 0.89 to 0.99. The coefficient of variation of posterior parameter distributions ranged from < 1 to 15 %. The parameter identifiability was not significantly improved in the more complex seasonal model; however, small differences in parameter values indicate that the annual model may have absorbed dry season dynamics. Parameter estimates were most constrained for scales and locations at which soil water dynamics are more sensitive to the fitted ecohydrological parameters of interest. In these cases, model inversion converged more slowly but ultimately provided better goodness of fit and lower uncertainty. Results were robust using as few as 100 daily observations randomly sampled from the full records, demonstrating the advantage of analyzing soil saturation pdfs instead of time series to estimate ecohydrological parameters from sparse records. Our work combines modeling and empirical approaches in ecohydrology and provides a simple framework to obtain scale- and site-specific analytical descriptions of soil moisture dynamics consistent with soil moisture observations.
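The inference idea described above, fitting pdf parameters to an empirical soil-saturation sample, can be sketched with a simple Metropolis sampler. In the sketch below a Beta distribution stands in for the analytical stochastic-soil-water-balance pdf used in the paper, the prior is flat, and the "observations" are simulated placeholders; only the inverse-modeling structure is illustrated.

```python
import numpy as np
from scipy import stats

# Fit the two parameters of an assumed saturation pdf (here a Beta
# distribution as a stand-in) to an empirical sample with a basic
# Metropolis random-walk sampler.

obs = stats.beta(2.0, 5.0).rvs(200, random_state=0)      # placeholder daily saturation sample

def log_post(theta):
    a, b = theta
    if a <= 0 or b <= 0:
        return -np.inf
    return stats.beta(a, b).logpdf(obs).sum()             # flat prior assumed

rng = np.random.default_rng(1)
theta = np.array([1.0, 1.0])
chain = []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal(2)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta.copy())
chain = np.array(chain[1000:])                            # drop burn-in
print("posterior means:", chain.mean(axis=0))
print("coefficient of variation:", chain.std(axis=0) / chain.mean(axis=0))
```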
Haji Ali Afzali, Hossein; Gray, Jodi; Karnon, Jonathan
2013-04-01
Decision analytic models play an increasingly important role in the economic evaluation of health technologies. Given uncertainties around the assumptions used to develop such models, several guidelines have been published to identify and assess 'best practice' in the model development process, including general modelling approach (e.g., time horizon), model structure, input data and model performance evaluation. This paper focuses on model performance evaluation. In the absence of a sufficient level of detail around model performance evaluation, concerns regarding the accuracy of model outputs, and hence the credibility of such models, are frequently raised. Following presentation of its components, a review of the application and reporting of model performance evaluation is presented. Taking cardiovascular disease as an illustrative example, the review investigates the use of face validity, internal validity, external validity, and cross model validity. As a part of the performance evaluation process, model calibration is also discussed and its use in applied studies investigated. The review found that the application and reporting of model performance evaluation across 81 studies of treatment for cardiovascular disease was variable. Cross-model validation was reported in 55 % of the reviewed studies, though the level of detail provided varied considerably. We found that very few studies documented other types of validity, and only 6 % of the reviewed articles reported a calibration process. Considering the above findings, we propose a comprehensive model performance evaluation framework (checklist), informed by a review of best-practice guidelines. This framework provides a basis for more accurate and consistent documentation of model performance evaluation. This will improve the peer review process and the comparability of modelling studies. Recognising the fundamental role of decision analytic models in informing public funding decisions, the proposed framework should usefully inform guidelines for preparing submissions to reimbursement bodies.
Computer-aided pulmonary image analysis in small animal models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.
Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
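The first stage described above, flagging severe pathology when the segmented lung volume falls far below an expected volume regressed from rib-cage volume, can be illustrated with a toy example. The training values, new scan and the 25% deficit threshold below are invented for illustration only.

```python
import numpy as np

# Fit a linear regression of total lung capacity (TLC) on approximated
# rib-cage volume from healthy training scans, then flag a new scan whose
# segmented lung volume falls well below the expected value.

rib_cage = np.array([40.0, 45.0, 50.0, 55.0, 60.0])     # mL, training animals (placeholder)
tlc = np.array([8.0, 9.1, 10.2, 11.0, 12.1])            # mL, training animals (placeholder)
slope, intercept = np.polyfit(rib_cage, tlc, 1)          # expected-volume model

new_rib_cage, segmented_volume = 52.0, 6.5               # new scan (placeholder values)
expected = slope * new_rib_cage + intercept
if segmented_volume < 0.75 * expected:                    # large deficit -> likely pathology
    print("severe pathology suspected: invoke abnormal-pattern detection stage")
```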
NASA Astrophysics Data System (ADS)
Rahman, Hameedur; Arshad, Haslina; Mahmud, Rozi; Mahayuddin, Zainal Rasyid
2017-10-01
The number of breast cancer patients who require breast biopsy has increased over the past years. Augmented Reality guided core biopsy of the breast has become the method of choice for researchers. However, such cancer visualization has been limited to superimposing the 3D imaging data only. In this paper, we introduce an Augmented Reality visualization framework that enables breast cancer biopsy image guidance by using an X-ray vision technique on a mobile display. This framework consists of four phases: it initially acquires the images from CT/MRI and processes the medical images into 3D slices; secondly, it purifies these 3D grayscale slices into a 3D breast tumor model using a 3D modeling reconstruction technique. Further, in visualization processing this virtual 3D breast tumor model is enhanced using the X-ray vision technique to see through the skin of the phantom, and the final composition is displayed on a handheld device to optimize the accuracy of the visualization in six degrees of freedom. The framework is perceived as an improved visualization experience because the Augmented Reality X-ray vision allows direct understanding of the breast tumor beyond the visible surface and direct guidance towards accurate biopsy targets.
Taylor, Zeike A; Kirk, Thomas B; Miller, Karol
2007-10-01
The theoretical framework developed in a companion paper (Part I) is used to derive estimates of mechanical response of two meniscal cartilage specimens. The previously developed framework consisted of a constitutive model capable of incorporating confocal image-derived tissue microstructural data. In the present paper (Part II) fibre and matrix constitutive parameters are first estimated from mechanical testing of a batch of specimens similar to, but independent from, those under consideration. Image analysis techniques which allow estimation of tissue microstructural parameters from confocal images are presented. The constitutive model and image-derived structural parameters are then used to predict the reaction force history of the two meniscal specimens subjected to partially confined compression. The predictions are made on the basis of the specimens' individual structural condition as assessed by confocal microscopy and involve no tuning of material parameters. Although the model does not reproduce all features of the experimental curves, as an unfitted estimate of mechanical response the prediction is quite accurate. In light of the obtained results it is judged that more general non-invasive estimation of tissue mechanical properties is possible using the developed framework.
An automated and integrated framework for dust storm detection based on ogc web processing services
NASA Astrophysics Data System (ADS)
Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.
2014-11-01
Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detection and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detection and tracking component combines three Earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distributions of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm that occurred over East Asia from 26 to 28 April 2012 is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data and scientific model integration problem by using a framework and scientific workflow approach together. The experimental result shows that this newly automated and integrated framework can be used to give advance, near real-time warning of dust storms to both environmental authorities and the public. The methods presented in this paper might also be generalized to other types of Earth system models, leading to improved ease of use and flexibility.
Three Sets of Case Studies Suggest Logic and Consistency Challenges with Value Frameworks.
Cohen, Joshua T; Anderson, Jordan E; Neumann, Peter J
2017-02-01
To assess the logic and consistency of three prominent value frameworks. We reviewed the value frameworks from three organizations: the Memorial Sloan Kettering Cancer Center (DrugAbacus), the American Society of Clinical Oncologists, and the Institute for Clinical and Economic Review. For each framework, we developed case studies to explore the degree to which the frameworks have face validity in the sense that they are consistent with four important principles: value should be proportional to a therapy's benefit; components of value should matter to framework users (patients and payers); attribute weights should reflect user preferences; and value estimates used to inform therapy prices should reflect per-person benefit. All three frameworks can aid decision making by elucidating factors not explicitly addressed by conventional evaluation techniques (in particular, cost-effectiveness analyses). Our case studies identified four challenges: 1) value is not always proportional to benefit; 2) value reflects factors that may not be relevant to framework users (patients or payers); 3) attribute weights do not necessarily reflect user preferences or relate to value in ways that are transparent; and 4) value does not reflect per-person benefit. Although the value frameworks we reviewed capture value in a way that is important to various audiences, they are not always logical or consistent. Because these frameworks may have a growing influence on therapy access, it is imperative that analytic challenges be further explored. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Assessing the formability of metallic sheets by means of localized and diffuse necking models
NASA Astrophysics Data System (ADS)
Comşa, Dan-Sorin; Lǎzǎrescu, Lucian; Banabic, Dorel
2016-10-01
The main objective of the paper is to elaborate a unified framework that allows the theoretical assessment of sheet metal formability. Hill's localized necking model and the Extended Maximum Force Criterion proposed by Mattiasson, Sigvant, and Larsson have been selected for this purpose. Both models are thoroughly described together with their solution procedures. A comparison of the theoretical predictions with experimental data referring to the formability of a DP600 steel sheet is also presented by the authors.
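For orientation, Hill's localized necking model is often quoted in its textbook form for a rigid-plastic sheet with power-law hardening; the expressions below are that standard form and are not necessarily the exact formulation implemented by the authors:

```latex
% Hill's zero-extension localized necking criterion (left-hand side of the forming
% limit diagram), for proportional loading with strain ratio
% \beta = \mathrm{d}\varepsilon_2 / \mathrm{d}\varepsilon_1 \le 0 and power-law
% hardening \sigma = K\,\varepsilon^{n}:
\varepsilon_1^{\ast} = \frac{n}{1+\beta}, \qquad
\varepsilon_2^{\ast} = \frac{n\,\beta}{1+\beta},
% i.e. localized necking is predicted once the major strain reaches n/(1+\beta).
```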
Tayyib, Nahla; Coyer, Fiona
This article reports on the development and implementation process used to integrate a care bundle approach (a pressure ulcer [PU] prevention bundle to improve patients' skin integrity in intensive care) and the Ottawa Model of Research Use (OMRU). The PU prevention care bundle demonstrated significant reduction in PU incidence, with the OMRU model providing a consolidated framework for the implementation of bundled evidence in an effective and consistent manner into daily clinical nursing practice.
Methods of Sparse Modeling and Dimensionality Reduction to Deal with Big Data
2015-04-01
supervised learning (c). Our framework consists of two separate phases: (a) first find an initial space in an unsupervised manner; then (b) utilize label... model that can learn thousands of topics from a large set of documents and infer the topic mixture of each document, 2) a supervised dimension reduction... (i) a method of supervised
Vulnerability Assessment of Water Supply Systems: Status, Gaps and Opportunities
NASA Astrophysics Data System (ADS)
Wheater, H. S.
2015-12-01
Conventional frameworks for assessing the impacts of climate change on water resource systems use cascades of climate and hydrological models to provide 'top-down' projections of future water availability, but these are subject to high uncertainty and are model and scenario-specific. Hence there has been recent interest in 'bottom-up' frameworks, which aim to evaluate system vulnerability to change in the context of possible future climate and/or hydrological conditions. Such vulnerability assessments are generic, and can be combined with updated information from top-down assessments as they become available. While some vulnerability methods use hydrological models to estimate water availability, fully bottom-up schemes have recently been proposed that directly map system vulnerability as a function of feasible changes in water supply characteristics. These use stochastic algorithms, based on reconstruction or reshuffling methods, by which multiple water supply realizations can be generated under feasible ranges of change in water supply conditions. The paper reports recent successes, and points to areas of future improvement. Advances in stochastic modeling and optimization can address some technical limitations in flow reconstruction, while various data mining and system identification techniques can provide possibilities to better condition realizations for consistency with top-down scenarios. Finally, we show that probabilistic and Bayesian frameworks together can provide a potential basis to combine information obtained from fully bottom-up analyses with projections available from climate and/or hydrological models in a fully integrated risk assessment framework for deep uncertainty.
Overview of the Special Issue: A Multi-Model Framework to ...
The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impacts and damages in the United States are avoided or reduced due to global greenhouse gas (GHG) emissions mitigation scenarios. Scenarios are designed to explore key uncertainties around the measurement of these changes. The modeling exercise presented in this Special Issue includes two integrated assessment models and 15 sectoral models encompassing six broad impact sectors - water resources, electric power, infrastructure, human health, ecosystems, and forests. Three consistent emissions scenarios are used to analyze the benefits of global GHG mitigation targets: a reference and two policy scenarios, with total radiative forcing in 2100 of 10.0 W/m2, 4.5 W/m2, and 3.7 W/m2. A range of climate sensitivities, climate models, natural variability measures, and structural uncertainties of sectoral models are examined to explore the implications of key uncertainties. This overview paper describes the motivations, goals, design, and academic contribution of the CIRA modeling exercise and briefly summarizes the subsequent papers in this Special Issue. A summary of results across impact sectors is provided showing that: GHG mitigation provides benefits to the United States that increase over
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khaira, Gurdaman; Doxastakis, Manolis; Bowen, Alec
There is considerable interest in developing multimodal characterization frameworks capable of probing critical properties of complex materials by relying on distinct, complementary methods or tools. Any such framework should maximize the amount of information that is extracted from any given experiment and should be sufficiently powerful and efficient to enable on-the-fly analysis of multiple measurements in a self-consistent manner. Such a framework is demonstrated in this work in the context of self-assembling polymeric materials, where theory and simulations provide the language to seamlessly mesh experimental data from two different scattering measurements. Specifically, the samples considered here consist of diblock copolymers (BCP) that are self-assembled on chemically nanopatterned surfaces. The copolymers microphase separate into ordered lamellae with characteristic dimensions on the scale of tens of nanometers that are perfectly aligned by the substrate over macroscopic areas. These aligned lamellar samples provide ideal standards with which to develop the formalism introduced in this work and, more generally, the concept of high-information-content, multimodal experimentation. The outcomes of the proposed analysis are then compared to images generated by 3D scanning electron microscopy tomography, serving to validate the merit of the framework and ideas proposed here.
Developmental framework to validate future designs of ballistic neck protection.
Breeze, J; Midwinter, M J; Pope, D; Porter, K; Hepper, A E; Clasper, J
2013-01-01
The number of neck injuries has increased during the war in Afghanistan, and they have become an appreciable source of mortality and long-term morbidity for UK servicemen. A three-dimensional numerical model of the neck is necessary to allow simulation of penetrating injury from explosive fragments so that the design of body armour can be optimal, and a framework is required to validate and describe the individual components of this program. An interdisciplinary consensus group consisting of military maxillofacial surgeons, and biomedical, physical, and material scientists was convened to generate the components of the framework, and as a result it incorporates the following components: analysis of deaths and long-term morbidity, assessment of critical cervical structures for incorporation into the model, characterisation of explosive fragments, evaluation of the material of which the body armour is made, and mapping of the entry sites of fragments. The resulting numerical model will simulate the wound tract produced by fragments of differing masses and velocities, and illustrate the effects of temporary cavities on cervical neurovascular structures. Using this framework, a new shirt to be worn under body armour that incorporates ballistic cervical protection has been developed for use in Afghanistan. New designs of the collar validated by human factors and assessment of coverage are currently being incorporated into early versions of the numerical model. The aim of this paper is to describe this developmental framework and provide an update on the current progress of its individual components. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
Reinterpreting Lifelong Learning: Meanings of Adult Education Policy in Portugal, 1999-2010
ERIC Educational Resources Information Center
Guimaraes, Paula
2013-01-01
This article analyses Portugal's adult education policy between 1999 and 2010. Our empirical material consists of Portuguese as well as supranational policy documents. We use a theoretical framework which distinguishes three models of public policy, with different views on the roles of public policy and of education: (1) participative…
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...
Self-Regulated Learning Procedure for University Students: The "Meaningful Text-Reading" Strategy
ERIC Educational Resources Information Center
Roman Sanchez, Jose Maria
2004-01-01
Introduction: Experimental validation of a self-regulated learning procedure for university students, i.e. the "meaningful text-reading" strategy, is reported in this paper. The strategy's theoretical framework is the "ACRA Model" of learning strategies. The strategy consists of a flexible, recurring sequence of five mental operations of written…
ERIC Educational Resources Information Center
Borge, Marcela; White, Barbara
2016-01-01
We proposed and evaluated an instructional framework for increasing students' ability to understand and regulate collaborative interactions called Co-Regulated Collaborative Learning (CRCL). In this instantiation of CRCL, models of collaborative competence were articulated through a set of socio-metacognitive roles. Our population consisted of 28…
Dual Diathesis-Stressor Model of Emotional and Linguistic Contributions to Developmental Stuttering
ERIC Educational Resources Information Center
Walden, Tedra A.; Frankel, Carl B.; Buhr, Anthony P.; Johnson, Kia N.; Conture, Edward G.; Karrass, Jan M.
2012-01-01
This study assessed emotional and speech-language contributions to childhood stuttering. A dual diathesis-stressor framework guided this study, in which both linguistic requirements and skills, and emotion and its regulation, are hypothesized to contribute to stuttering. The language diathesis consists of expressive and receptive language skills.…
Purposes and Bibliographic Objectives of a Pioneer Library Catalog in China
ERIC Educational Resources Information Center
Lee, Hur-Li; Lan, Wen-Chin
2009-01-01
This research aims to ascertain the conceptual basics underlying the design of the "Seven Epitomes", the first library catalog to establish the bibliographic model in imperial China. The analytical framework for the study consists of a reconstructed version of the catalog and its historical contexts. In analyzing the surviving text of…
Renehan, Emma; Goeman, Dianne; Koch, Susan
2017-07-20
In Australia, dementia is a national health priority. With the rising number of people living with dementia and shortage of formal and informal carers predicted in the near future, developing approaches to coordinating services in quality-focused ways is considered an urgent priority. Key worker support models are one approach that have been used to assist people living with dementia and their caring unit coordinate services and navigate service systems; however, there is limited literature outlining comprehensive frameworks for the implementation of community dementia key worker roles in practice. In this paper an optimised key worker framework for people with dementia, their family and caring unit living in the community is developed and presented. A number of processes were undertaken to inform the development of a co-designed optimised key worker framework: an expert working and reference group; a systematic review of the literature; and a qualitative evaluation of 14 dementia key worker models operating in Australia involving 14 interviews with organisation managers, 19 with key workers and 15 with people living with dementia and/or their caring unit. Data from the systematic review and evaluation of dementia key worker models were analysed by the researchers and the expert working and reference group using a constant comparative approach to define the essential components of the optimised framework. The developed framework consisted of four main components: overarching philosophies; organisational context; role definition; and key worker competencies. A number of more clearly defined sub-themes sat under each component. Reflected in the framework is the complexity of the dementia journey and the difficulty in trying to develop a 'one size fits all' approach. This co-designed study led to the development of an evidence based framework which outlines a comprehensive synthesis of components viewed as being essential to the implementation of a dementia key worker model of care in the community. The framework was informed and endorsed by people living with dementia and their caring unit, key workers, managers, Australian industry experts, policy makers and researchers. An evaluation of its effectiveness and relevance for practice within the dementia care space is required.
Surrogate assisted multidisciplinary design optimization for an all-electric GEO satellite
NASA Astrophysics Data System (ADS)
Shi, Renhe; Liu, Li; Long, Teng; Liu, Jian; Yuan, Bin
2017-09-01
State-of-the-art all-electric geostationary earth orbit (GEO) satellites use electric thrusters to execute all propulsive duties, which significantly differ from the traditional all-chemical ones in orbit-raising, station-keeping, radiation damage protection, and power budget, etc. The design optimization task of an all-electric GEO satellite is therefore a complex multidisciplinary design optimization (MDO) problem involving unique design considerations. However, solving the all-electric GEO satellite MDO problem faces major challenges in disciplinary modeling techniques and efficient optimization strategy. To address these challenges, we present a surrogate assisted MDO framework consisting of several modules, i.e., MDO problem definition, multidisciplinary modeling, multidisciplinary analysis (MDA), and a surrogate assisted optimizer. Based on the proposed framework, the all-electric GEO satellite MDO problem is formulated to minimize the total mass of the satellite system under a number of practical constraints. Then considerable effort is spent on multidisciplinary modeling involving the geosynchronous transfer, GEO station-keeping, power, thermal control, attitude control, and structure disciplines. Since the orbit dynamics models and the finite element structural model are computationally expensive, an adaptive response surface surrogate based optimizer is incorporated in the proposed framework to solve the satellite MDO problem with moderate computational cost, where a response surface surrogate is gradually refined to represent the computationally expensive MDA process. After optimization, the total mass of the studied GEO satellite is decreased by 185.3 kg (i.e., 7.3% of the total mass). Finally, the optimal design is further discussed to demonstrate the effectiveness of our proposed framework in coping with all-electric GEO satellite system design optimization problems. The proposed surrogate assisted MDO framework can also provide valuable references for other all-electric spacecraft system designs.
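The adaptive response surface strategy described above follows a standard loop: fit a cheap surrogate to a few expensive multidisciplinary analyses, optimize on the surrogate, evaluate the true model at the candidate optimum, and refit. A minimal one-variable sketch of that loop (the quadratic surrogate, toy objective, and bounds are placeholders, not the satellite MDA):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_mda(x):
    """Stand-in for the costly multidisciplinary analysis (orbit, thermal, FEM...)."""
    return (x - 0.7) ** 2 + 0.1 * np.sin(8 * x)

lo, hi = 0.0, 1.5
X = list(np.linspace(lo, hi, 4))          # small initial design of experiments
Y = [expensive_mda(x) for x in X]

for _ in range(10):
    coeffs = np.polyfit(X, Y, deg=2)      # quadratic response surface surrogate
    surrogate = np.poly1d(coeffs)
    x_new = minimize_scalar(surrogate, bounds=(lo, hi), method="bounded").x
    y_new = expensive_mda(x_new)          # one true (expensive) evaluation per cycle
    X.append(x_new); Y.append(y_new)      # refine the surrogate near the optimum

best_y, best_x = min(zip(Y, X))
print("approximate optimum:", best_x, "objective:", best_y)
```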
NASA Astrophysics Data System (ADS)
Bakker, Alexander; Louchard, Domitille; Keller, Klaus
2016-04-01
Sea-level rise threatens many coastal areas around the world. The integrated assessment of potential adaptation and mitigation strategies requires a sound understanding of the upper tails and the major drivers of the uncertainties. Global warming causes sea-level to rise, primarily due to thermal expansion of the oceans and mass loss of the major ice sheets, smaller ice caps and glaciers. These components show distinctly different responses to temperature changes with respect to response time, threshold behavior, and local fingerprints. Projections of these different components are deeply uncertain. Projected uncertainty ranges strongly depend on (necessary) pragmatic choices and assumptions; e.g. on the applied climate scenarios, which processes to include and how to parameterize them, and on error structure of the observations. Competing assumptions are very hard to objectively weigh. Hence, uncertainties of sea-level response are hard to grasp in a single distribution function. The deep uncertainty can be better understood by making clear the key assumptions. Here we demonstrate this approach using a relatively simple model framework. We present a mechanistically motivated, but simple model framework that is intended to efficiently explore the deeply uncertain sea-level response to anthropogenic climate change. The model consists of 'building blocks' that represent the major components of sea-level response and its uncertainties, including threshold behavior. The framework's simplicity enables the simulation of large ensembles allowing for an efficient exploration of parameter uncertainty and for the simulation of multiple combined adaptation and mitigation strategies. The model framework can skilfully reproduce earlier major sea level assessments, but due to the modular setup it can also be easily utilized to explore high-end scenarios and the effect of competing assumptions and parameterizations.
Formulating accident occurrence as a survival process.
Chang, H L; Jovanis, P P
1990-10-01
A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
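As background on the statistical technique proposed above, a generic survival formulation can be written down; the hazard-based notation below is the standard textbook form, given only to make the terminology concrete, not the authors' fitted model:

```latex
% Survival function and hazard for time (or exposure) t until an accident,
% with covariates x (driver, carrier and trip attributes):
S(t \mid x) = \Pr(T > t \mid x) = \exp\!\left(-\int_0^t h(u \mid x)\,du\right),
\qquad
h(t \mid x) = h_0(t)\,\exp\!\left(\beta^{\top} x\right),
% where h_0(t) is a baseline hazard and \beta are coefficients estimated from
% disaggregate, trip-level data.
```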
Bell, Margaret Carol; Galatioto, Fabio; Giuffrè, Tullio; Tesoriere, Giovanni
2012-05-01
Building on previous research, a conceptual framework based on potential conflicts analysis has provided a quantitative evaluation of 'proneness' to red-light running behaviour at urban signalised intersections of different geometric, flow and driver characteristics. The results provided evidence that commonly used violation rates could cause inappropriate evaluation of the extent of the red-light running phenomenon. Initially, an in-depth investigation of the functional form of the mathematical relationship between the potential and actual red-light runners was carried out. The application of the conceptual framework was tested on a signalised intersection in order to quantify the proneness to red-light running. For the particular junction studied, proneness for daytime was found to be 0.17 north and 0.16 south for the opposing main road approaches and 0.42 east and 0.59 west for the secondary approaches. Further investigations were carried out using a traffic microsimulation model to explore those geometric features and traffic volumes (arrival patterns at the stop-line) that significantly affect red-light running. In this way the prediction capability of the proposed potential conflict model was improved. A degree of consistency in the measured and simulated red-light running was observed, and the conceptual framework was tested through a sensitivity analysis applied to different stop-line positions and traffic volume variations. The microsimulation, although at an early stage of development, has shown promise in its ability to model unintentional red-light running behaviour and, following further work through application to other junctions, potentially provides a tool for evaluating the effect of alternative junction designs on proneness. In brief, this paper proposes and applies a novel approach to model red-light running using microsimulation and demonstrates consistency between the observed and theoretical results. Copyright © 2011 Elsevier Ltd. All rights reserved.
Framework for shape analysis of white matter fiber bundles.
Glozman, Tanya; Bruckert, Lisa; Pestilli, Franco; Yecies, Derek W; Guibas, Leonidas J; Yeom, Kristen W
2018-02-15
Diffusion imaging coupled with tractography algorithms allows researchers to image human white matter fiber bundles in-vivo. These bundles are three-dimensional structures with shapes that change over time during the course of development as well as in pathologic states. While most studies on white matter variability focus on analysis of tissue properties estimated from the diffusion data, e.g. fractional anisotropy, the shape variability of white matter fiber bundles is much less explored. In this paper, we present a set of tools for shape analysis of white matter fiber bundles, namely: (1) a concise geometric model of bundle shapes; (2) a method for bundle registration between subjects; (3) a method for deformation estimation. Our framework is useful for analysis of shape variability in white matter fiber bundles. We demonstrate our framework by applying our methods on two datasets: one consisting of data for 6 normal adults and another consisting of data for 38 normal children of age 11 days to 8.5 years. We suggest a robust and reproducible method to measure changes in the shape of white matter fiber bundles. We demonstrate how this method can be used to create a model to assess age-dependent changes in the shape of specific fiber bundles. We derive such models for an ensemble of white matter fiber bundles on our pediatric dataset and show that our results agree with normative human head and brain growth data. Creating these models for a large pediatric longitudinal dataset may improve understanding of both normal development and pathologic states and propose novel parameters for the examination of the pediatric brain. Copyright © 2017 Elsevier Inc. All rights reserved.
Numerical Coupling and Simulation of Point-Mass System with the Turbulent Fluid Flow
NASA Astrophysics Data System (ADS)
Gao, Zheng
A computational framework that combines the Eulerian description of the turbulence field with a Lagrangian point-mass ensemble is proposed in this dissertation. Depending on the Reynolds number, the turbulence field is simulated using Direct Numerical Simulation (DNS) or an eddy viscosity model. Meanwhile, particle systems, such as spring-mass systems and cloud droplets, are modeled using ordinary differential systems, which are stiff and hence pose a challenge to the stability of the entire system. This computational framework is applied to the numerical study of parachute deceleration and cloud microphysics. These two distinct problems can be uniformly modeled with Partial Differential Equations (PDEs) and Ordinary Differential Equations (ODEs), and numerically solved in the same framework. For the parachute simulation, a novel porosity model is proposed to simulate the porous effects of the parachute canopy. This model is easy to implement with the projection method and is able to reproduce Darcy's law observed in the experiment. Moreover, the impacts of using different versions of the k-epsilon turbulence model in the parachute simulation have been investigated, concluding that the standard and Re-Normalisation Group (RNG) models may overestimate the turbulence effects when the Reynolds number is small, while the Realizable model performs consistently at both large and small Reynolds numbers. For another application, cloud microphysics, the cloud entrainment-mixing problem is studied in the same numerical framework. Three sets of DNS are carried out with both decaying and forced turbulence. The numerical results suggest a new way to parameterize the cloud mixing degree using dynamical measures. The numerical experiments also verify the negative relationship between the droplet number concentration and the vorticity field. The results imply that gravity has less impact on the forced turbulence than on the decaying turbulence. In summary, the proposed framework can be used to solve physics problems that involve a turbulence field and a point-mass system, and therefore has broad applications.
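The stiffness of the Lagrangian point-mass system mentioned above can be illustrated with a droplet whose response time is much shorter than the resolved flow time scale. A minimal sketch using an implicit (stiff) integrator and a prescribed analytic carrier velocity field standing in for the DNS or eddy-viscosity solution (the field, response time, and parameters are illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

TAU = 1e-4   # particle response time (s); tau << flow time scale => stiff ODE
G = 9.81     # gravity (m/s^2)

def fluid_velocity(x, t):
    """Placeholder for the Eulerian turbulence field (DNS or eddy-viscosity model)."""
    return np.array([np.sin(x[1]) * np.cos(t), -np.cos(x[0]) * np.sin(t)])

def rhs(t, y):
    """Point-mass equations: dx/dt = v, dv/dt = (u - v)/tau - g (drag + gravity)."""
    x, v = y[:2], y[2:]
    u = fluid_velocity(x, t)
    return np.concatenate([v, (u - v) / TAU - np.array([0.0, G])])

y0 = np.array([0.0, 1.0, 0.0, 0.0])                   # initial position and velocity
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="Radau",  # implicit method for stiffness
                t_eval=np.linspace(0.0, 1.0, 101))
print(sol.y[:2, -1])                                  # final particle position
```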
Metal–organic and covalent organic frameworks as single-site catalysts
Rogge, S. M. J.; Bavykina, A.; Hajek, J.; Garcia, H.; Olivos-Suarez, A. I.; Sepúlveda-Escribano, A.; Vimont, A.; Clet, G.; Bazin, P.; Kapteijn, F.
2017-01-01
Heterogeneous single-site catalysts consist of isolated, well-defined, active sites that are spatially separated in a given solid and, ideally, structurally identical. In this review, the potential of metal–organic frameworks (MOFs) and covalent organic frameworks (COFs) as platforms for the development of heterogeneous single-site catalysts is reviewed thoroughly. In the first part of this article, synthetic strategies and progress in the implementation of such sites in these two classes of materials are discussed. Because these solids are excellent playgrounds to allow a better understanding of catalytic functions, we highlight the most important recent advances in the modelling and spectroscopic characterization of single-site catalysts based on these materials. Finally, we discuss the potential of MOFs as materials in which several single-site catalytic functions can be combined within one framework along with their potential as powerful enzyme-mimicking materials. The review is wrapped up with our personal vision on future research directions. PMID:28338128
NASA Astrophysics Data System (ADS)
Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.
2017-12-01
In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine them with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.
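The data-assimilation step can be made concrete with the usual two-level hierarchical formulation; the Gaussian form below is a generic sketch of such a model, and the actual package may use different likelihoods and hyperpriors:

```latex
% Measurements y_{sj} of a (log-transformed) hydrogeological parameter at similar
% site s, with site means \theta_s drawn from a population distribution:
y_{sj} \mid \theta_s \sim \mathcal{N}(\theta_s, \sigma_s^2), \qquad
\theta_s \mid \mu, \tau \sim \mathcal{N}(\mu, \tau^2), \qquad s = 1,\dots,S .
% The informative prior transferred to the uninvestigated target site is then the
% predictive distribution \theta_{\mathrm{target}} \sim \mathcal{N}(\mu, \tau^2),
% with \mu and \tau estimated from the selected similar sites.
```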
Lee, Minhyun; Koo, Choongwan; Hong, Taehoon; Park, Hyo Seon
2014-04-15
For an effective photovoltaic (PV) system, it is necessary to accurately determine the monthly average daily solar radiation (MADSR) and to develop an accurate MADSR map, which can simplify the decision-making process for selecting a suitable location for PV system installation. Therefore, this study aimed to develop a framework for the mapping of the MADSR using advanced case-based reasoning (CBR) and a geostatistical technique. The proposed framework consists of the following procedures: (i) the geographic scope for the mapping of the MADSR is set, and the measured MADSR and meteorological data within the geographic scope are collected; (ii) using the collected data, the advanced CBR model is developed; (iii) using the advanced CBR model, the MADSR at unmeasured locations is estimated; and (iv) by applying the measured and estimated MADSR data to a geographic information system, the MADSR map is developed. A practical validation was conducted by applying the proposed framework to South Korea. The validation showed that the MADSR map developed through the proposed framework offers improved accuracy. The developed MADSR map can be used for estimating the MADSR at unmeasured locations and for determining the optimal location for PV system installation.
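Step (iii), estimating the MADSR at unmeasured locations from similar measured cases, can be sketched with a plain nearest-case retrieval; the distance-weighted average below is only a simplified stand-in for the advanced CBR model, and the feature set and values are hypothetical:

```python
import numpy as np

# Case base: one row per measured station with illustrative features
# (latitude, altitude in km, mean daily sunshine hours) and the observed
# MADSR (kWh/m^2/day); all numbers are placeholders.
cases = np.array([[37.5, 0.05, 6.1],
                  [35.1, 0.02, 6.8],
                  [36.3, 0.30, 6.4]])
madsr = np.array([3.9, 4.3, 4.1])

def estimate_madsr(target_features, k=2):
    """Retrieve the k most similar cases and return a distance-weighted average."""
    d = np.linalg.norm(cases - np.asarray(target_features), axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-9)          # closer cases receive larger weights
    return float(np.sum(w * madsr[idx]) / np.sum(w))

print(estimate_madsr([36.0, 0.10, 6.5]))   # MADSR estimate for an unmeasured site
```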
Conceptual Framework To Extend Life Cycle Assessment ...
Life Cycle Assessment (LCA) is a decision-making tool that accounts for multiple impacts across the life cycle of a product or service. This paper presents a conceptual framework to integrate human health impact assessment with risk screening approaches to extend LCA to include near-field chemical sources (e.g., those originating from consumer products and building materials) that have traditionally been excluded from LCA. A new generation of rapid human exposure modeling and high-throughput toxicity testing is transforming chemical risk prioritization and provides an opportunity for integration of screening-level risk assessment (RA) with LCA. The combined LCA and RA approach considers environmental impacts of products alongside risks to human health, which is consistent with regulatory frameworks addressing RA within a sustainability mindset. A case study is presented to juxtapose LCA and risk screening approaches for a chemical used in a consumer product. The case study demonstrates how these new risk screening tools can be used to inform toxicity impact estimates in LCA and highlights needs for future research. The framework provides a basis for developing tools and methods to support decision making on the use of chemicals in products. This paper presents a conceptual framework for including near-field exposures into Life Cycle Assessment using advanced human exposure modeling and high-throughput tools
GillesPy: A Python Package for Stochastic Model Building and Simulation.
Abel, John H; Drawert, Brian; Hellander, Andreas; Petzold, Linda R
2016-09-01
GillesPy is an open-source Python package for model construction and simulation of stochastic biochemical systems. GillesPy consists of a Python framework for model building and an interface to the StochKit2 suite of efficient simulation algorithms based on the Gillespie stochastic simulation algorithms (SSA). To enable intuitive model construction and seamless integration into the scientific Python stack, we present an easy to understand, action-oriented programming interface. Here, we describe the components of this package and provide a detailed example relevant to the computational biology community.
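A small usage example helps show what the action-oriented interface looks like in practice. The sketch below assumes an interface along the lines of the current gillespy2 package; class and argument names may differ slightly from the version described in the paper:

```python
import numpy as np
import gillespy2  # assumed interface; the package described in the paper is GillesPy

class Degradation(gillespy2.Model):
    """Single-species degradation A -> 0, simulated with the Gillespie SSA."""
    def __init__(self):
        gillespy2.Model.__init__(self, name="Degradation")
        k = gillespy2.Parameter(name="k_deg", expression=0.05)
        A = gillespy2.Species(name="A", initial_value=200)
        self.add_parameter(k)
        self.add_species(A)
        self.add_reaction(gillespy2.Reaction(
            name="deg", reactants={"A": 1}, products={}, rate=k))
        self.timespan(np.linspace(0, 100, 101))

# Run several stochastic trajectories through the StochKit-style solvers.
results = Degradation().run(number_of_trajectories=5)
```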
Lopsidedness of Self-consistent Galaxies Caused by the External Field Effect of Clusters
NASA Astrophysics Data System (ADS)
Wu, Xufen; Wang, Yougang; Feix, Martin; Zhao, HongSheng
2017-08-01
Adopting Schwarzschild’s orbit-superposition technique, we construct a series of self-consistent galaxy models, embedded in the external field of galaxy clusters in the framework of Milgrom’s MOdified Newtonian Dynamics (MOND). These models represent relatively massive ellipticals with a Hernquist radial profile at various distances from the cluster center. Using N-body simulations, we perform a first analysis of these models and their evolution. We find that self-gravitating axisymmetric density models, even under a weak external field, lose their symmetry by instability and generally evolve to triaxial configurations. A kinematic analysis suggests that the instability originates from both box and nonclassified orbits with low angular momentum. We also consider a self-consistent isolated system that is then placed in a strong external field and allowed to evolve freely. This model, just like the corresponding equilibrium model in the same external field, eventually settles to a triaxial equilibrium as well, but has a higher velocity radial anisotropy and is rounder. The presence of an external field in the MOND universe generically predicts some lopsidedness of galaxy shapes.
Li, Chen; Nagasaki, Masao; Koh, Chuan Hock; Miyano, Satoru
2011-05-01
Mathematical modeling and simulation studies are playing an increasingly important role in helping researchers elucidate how living organisms function in cells. In systems biology, researchers typically tune many parameters manually to achieve simulation results that are consistent with biological knowledge. This severely limits the size and complexity of the simulation models built. In order to break this limitation, we propose a computational framework to automatically estimate kinetic parameters for a given network structure. We utilized an online (on-the-fly) model checking technique (which saves resources compared to the offline approach), with a quantitative modeling and simulation architecture named hybrid functional Petri net with extension (HFPNe). We demonstrate the applicability of this framework by the analysis of the underlying model for the neuronal cell fate decision model (ASE fate model) in Caenorhabditis elegans. First, we built a quantitative ASE fate model containing 3327 components emulating nine genetic conditions. Then, using our efficient online model checker, MIRACH 1.0, together with parameter estimation, we ran 20 million simulation runs, and were able to locate 57 parameter sets for 23 parameters in the model that are consistent with 45 biological rules extracted from published biological articles without much manual intervention. To evaluate the robustness of these 57 parameter sets, we ran another 20 million simulation runs using different magnitudes of noise. Our simulation results concluded that, among these models, one model is the most reasonable and robust simulation model owing to its high stability against these stochastic noises. Our simulation results provide interesting biological findings which could be used for future wet-lab experiments.
NASA Astrophysics Data System (ADS)
Silvis, Maurits H.; Remmerswaal, Ronald A.; Verstappen, Roel
2017-01-01
We study the construction of subgrid-scale models for large-eddy simulation of incompressible turbulent flows. In particular, we aim to consolidate a systematic approach of constructing subgrid-scale models, based on the idea that it is desirable that subgrid-scale models are consistent with the mathematical and physical properties of the Navier-Stokes equations and the turbulent stresses. To that end, we first discuss in detail the symmetries of the Navier-Stokes equations, and the near-wall scaling behavior, realizability and dissipation properties of the turbulent stresses. We furthermore summarize the requirements that subgrid-scale models have to satisfy in order to preserve these important mathematical and physical properties. In this fashion, a framework of model constraints arises that we apply to analyze the behavior of a number of existing subgrid-scale models that are based on the local velocity gradient. We show that these subgrid-scale models do not satisfy all the desired properties, after which we explain that this is partly due to incompatibilities between model constraints and limitations of velocity-gradient-based subgrid-scale models. However, we also reason that the current framework shows that there is room for improvement in the properties and, hence, the behavior of existing subgrid-scale models. We furthermore show how compatible model constraints can be combined to construct new subgrid-scale models that have desirable properties built into them. We provide a few examples of such new models, of which a new model of eddy viscosity type, that is based on the vortex stretching magnitude, is successfully tested in large-eddy simulations of decaying homogeneous isotropic turbulence and turbulent plane-channel flow.
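To fix notation, the velocity-gradient-based eddy-viscosity closures discussed above share the generic form below; the Smagorinsky expression is shown as the familiar baseline, while the paper's new model differs in the invariant used for the eddy viscosity (vortex stretching magnitude rather than the strain-rate magnitude):

```latex
% Generic eddy-viscosity closure for the anisotropic subgrid-scale stress:
\tau_{ij} - \tfrac{1}{3}\,\tau_{kk}\,\delta_{ij} = -2\,\nu_e\,\bar{S}_{ij},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right).
% Baseline Smagorinsky choice of the eddy viscosity:
\nu_e = (C_S\,\bar{\Delta})^2\,|\bar{S}|, \qquad
|\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} .
```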
Nunes, David; Tran, Thanh-Dien; Raposo, Duarte; Pinto, André; Gomes, André; Silva, Jorge Sá
2012-01-01
As the Internet evolved, social networks (such as Facebook) have bloomed and brought together an astonishing number of users. Mashing up mobile phones and sensors with these social environments enables the creation of people-centric sensing systems which have great potential for expanding our current social networking usage. However, such systems also have many associated technical challenges, such as privacy concerns, activity detection mechanisms or intermittent connectivity, as well as limitations due to the heterogeneity of sensor nodes and networks. Considering the openness of the Web 2.0, good technical solutions for these cases consist of frameworks that expose sensing data and functionalities as common Web-Services. This paper presents our RESTful Web Service-based model for people-centric sensing frameworks, which uses sensors and mobile phones to detect users’ activities and locations, sharing this information amongst the user’s friends within a social networking site. We also present some screenshot results of our experimental prototype. PMID:22438732
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
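For context, the 'classical' model referred to above is the standard constant-time-headway adaptive cruise control law; a common textbook form is shown below, with the understanding that the exact coefficients and notation in the paper may differ:

```latex
% Classical ACC: acceleration drives the vehicle toward a fixed desired time headway
% \tau_d, with g the space gap to the preceding vehicle, v the vehicle speed and
% v_{\ell} the speed of the preceding vehicle:
a(t) = K_1\,\bigl(g(t) - v(t)\,\tau_d\bigr) + K_2\,\bigl(v_{\ell}(t) - v(t)\bigr),
% whereas the three-phase approach imposes no fixed \tau_d, letting the gap vary
% within a range of safe values.
```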
Multiscale modeling and simulation of brain blood flow
NASA Astrophysics Data System (ADS)
Perdikaris, Paris; Grinberg, Leopold; Karniadakis, George Em
2016-02-01
The aim of this work is to present an overview of recent advances in multi-scale modeling of brain blood flow. In particular, we present some approaches that enable the in silico study of multi-scale and multi-physics phenomena in the cerebral vasculature. We discuss the formulation of continuum and atomistic modeling approaches, present a consistent framework for their concurrent coupling, and list some of the challenges that one needs to overcome in achieving a seamless and scalable integration of heterogeneous numerical solvers. The effectiveness of the proposed framework is demonstrated in a realistic case involving modeling the thrombus formation process taking place on the wall of a patient-specific cerebral aneurysm. This highlights the ability of multi-scale algorithms to resolve important biophysical processes that span several spatial and temporal scales, potentially yielding new insight into the key aspects of brain blood flow in health and disease. Finally, we discuss open questions in multi-scale modeling and emerging topics of future research.
Emergent behaviors of the Schrödinger-Lohe model on cooperative-competitive networks
NASA Astrophysics Data System (ADS)
Huh, Hyungjin; Ha, Seung-Yeal; Kim, Dohyun
2017-12-01
We present several sufficient frameworks leading to the emergent behaviors of the coupled Schrödinger-Lohe (S-L) model under the same one-body external potential on cooperative-competitive networks. The S-L model was first introduced as a possible phenomenological model exhibiting quantum synchronization, and its emergent dynamics on all-to-all cooperative networks has been treated via two distinct approaches, a Lyapunov functional approach and a finite-dimensional reduction based on pairwise correlations. In this paper, we further generalize the finite-dimensional dynamical systems approach for pairwise correlation functions on cooperative-competitive networks and provide several sufficient frameworks leading to collective exponential synchronization. For small systems consisting of three and four quantum subsystems, we also show that the system for pairwise correlations can be reduced to the Lotka-Volterra model with cooperative and competitive interactions, in which a variety of interesting dynamical patterns appear, e.g., the existence of closed orbits and limit cycles.
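For reference, the coupled Schrödinger-Lohe system studied here is usually written in the form below, quoted with coupling strength kappa and network weights a_jk; conventions for the inner product and signs vary between papers:

```latex
% Schr\"odinger--Lohe model for N wave functions \psi_j under a common potential V:
i\,\partial_t \psi_j = -\tfrac{1}{2}\Delta \psi_j + V(x)\,\psi_j
  + \frac{i\kappa}{2N} \sum_{k=1}^{N} a_{jk}
    \left( \psi_k - \frac{\langle \psi_j, \psi_k \rangle}{\langle \psi_j, \psi_j \rangle}\,\psi_j \right),
% where a_{jk} > 0 encodes cooperative and a_{jk} < 0 competitive coupling on the network.
```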
Harun, Rashed; Grassi, Christine M; Munoz, Miranda J; Torres, Gonzalo E; Wagner, Amy K
2015-03-02
Fast-scan cyclic voltammetry (FSCV) is an electrochemical method that can assess real-time in vivo dopamine (DA) concentration changes to study the kinetics of DA neurotransmission. Electrical stimulation of dopaminergic (DAergic) pathways can elicit FSCV DA responses that largely reflect a balance of DA release and reuptake. Interpretation of these evoked DA responses requires a framework to discern the contributions of DA release and reuptake. The current, widely implemented interpretive framework for doing so is the Michaelis-Menten (M-M) model, which is grounded on two assumptions: (1) the DA release rate is constant during stimulation, and (2) DA reuptake occurs through dopamine transporters (DAT) in a manner consistent with M-M enzyme kinetics. Though the M-M model can simulate evoked DA responses that rise convexly, response types that predominate in the ventral striatum, the M-M model cannot simulate dorsal striatal responses that rise concavely. Based on current neurotransmission principles and experimental FSCV data, we developed a novel, quantitative, neurobiological framework to interpret DA responses that assumes DA release decreases exponentially during stimulation and continues post-stimulation at a diminishing rate. Our model also incorporates dynamic M-M kinetics to describe DA reuptake as a process of decreasing reuptake efficiency. We demonstrate that this quantitative, neurobiological model is an extension of the traditional M-M model that can simulate heterogeneous regional DA responses following manipulation of stimulation duration, frequency, and DA pharmacology. The proposed model can advance the interpretive framework for future in vivo FSCV studies examining regional DA kinetics and their alteration by disease and DA pharmacology. Copyright © 2015 Elsevier B.V. All rights reserved.
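The traditional M-M interpretive framework referred to above is commonly written as a single balance equation for evoked DA; the standard form is reproduced below for orientation, with the authors' extension replacing the constant release term by an exponentially decaying one:

```latex
% Michaelis--Menten model of stimulated dopamine overflow:
\frac{d[\mathrm{DA}]}{dt} = f\,[\mathrm{DA}]_p
  - \frac{V_{\max}\,[\mathrm{DA}]}{K_m + [\mathrm{DA}]},
% with f the stimulation frequency, [DA]_p the (assumed constant) release per pulse,
% and V_{\max}, K_m the DAT uptake parameters; after stimulation only the uptake
% term remains.
```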
NASA Astrophysics Data System (ADS)
Avendaño-Valencia, Luis David; Fassois, Spilios D.
2017-07-01
The study focuses on vibration response based health monitoring for an operating wind turbine, which features time-dependent dynamics under environmental and operational uncertainty. A Gaussian Mixture Model Random Coefficient (GMM-RC) model based Structural Health Monitoring framework postulated in a companion paper is adopted and assessed. The assessment is based on vibration response signals obtained from a simulated offshore 5 MW wind turbine. The non-stationarity in the vibration signals originates from the continually evolving, due to blade rotation, inertial properties, as well as the wind characteristics, while uncertainty is introduced by random variations of the wind speed within the range of 10-20 m/s. Monte Carlo simulations are performed using six distinct structural states, including the healthy state and five types of damage/fault in the tower, the blades, and the transmission, with each one of them characterized by four distinct levels. Random vibration response modeling and damage diagnosis are illustrated, along with pertinent comparisons with state-of-the-art diagnosis methods. The results demonstrate consistently good performance of the GMM-RC model based framework, offering significant performance improvements over state-of-the-art methods. Most damage types and levels are shown to be properly diagnosed using a single vibration sensor.
Gunn, Christine M; Clark, Jack A; Battaglia, Tracy A; Freund, Karen M; Parker, Victoria A
2014-10-01
To determine how closely a published model of navigation reflects the practice of navigation in breast cancer patient navigation programs. Observational field notes describing patient navigator activities collected from 10 purposefully sampled, foundation-funded breast cancer navigation programs in 2008-2009. An exploratory study evaluated a model framework for patient navigation published by Harold Freeman by using an a priori coding scheme based on model domains. Field notes were compiled and coded. Inductive codes were added during analysis to characterize activities not included in the original model. Programs were consistent with individual-level principles representing tasks focused on individual patients. There was variation with respect to program-level principles that related to program organization and structure. Program characteristics such as the use of volunteer or clinical navigators were identified as contributors to patterns of model concordance. This research provides a framework for defining the navigator role as focused on eliminating barriers through the provision of individual-level interventions. The diversity observed at the program level in these programs was a reflection of implementation according to target population. Further guidance may be required to assist patient navigation programs to define and tailor goals and measurement to community needs. © Health Research and Educational Trust.
Gunn, Christine M; Clark, Jack A; Battaglia, Tracy A; Freund, Karen M; Parker, Victoria A
2014-01-01
Objective To determine how closely a published model of navigation reflects the practice of navigation in breast cancer patient navigation programs. Data Source Observational field notes describing patient navigator activities collected from 10 purposefully sampled, foundation-funded breast cancer navigation programs in 2008–2009. Study Design An exploratory study evaluated a model framework for patient navigation published by Harold Freeman by using an a priori coding scheme based on model domains. Data Collection Field notes were compiled and coded. Inductive codes were added during analysis to characterize activities not included in the original model. Principal Findings Programs were consistent with individual-level principles representing tasks focused on individual patients. There was variation with respect to program-level principles that related to program organization and structure. Program characteristics such as the use of volunteer or clinical navigators were identified as contributors to patterns of model concordance. Conclusions This research provides a framework for defining the navigator role as focused on eliminating barriers through the provision of individual-level interventions. The diversity observed at the program level in these programs was a reflection of implementation according to target population. Further guidance may be required to assist patient navigation programs to define and tailor goals and measurement to community needs. PMID:24820445
Dictionary-based fiber orientation estimation with improved spatial consistency.
Ye, Chuyang; Prince, Jerry L
2018-02-01
Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ1-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that FORNI+ produces FOs with better quality compared with competing methods. Copyright © 2017 Elsevier B.V. All rights reserved.
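The structure of the objective function described above can be illustrated with a minimal single-voxel Python sketch. The three ingredients (data fidelity, pairwise FO dissimilarity with neighbours, weighted ℓ1 sparsity) follow the abstract, but the specific functional forms, weights, and variable names are assumptions and do not reproduce the published FORNI+ objective.

import numpy as np

def forni_plus_like_cost(y, D, f, fo, neighbor_fos, w, lam_smooth=1.0, lam_l1=0.1):
    """Illustrative single-voxel cost with the three ingredients named in the abstract.
    y: observed diffusion signal, D: dictionary matrix, f: mixture fractions,
    fo: unit FO vector for this voxel, neighbor_fos: unit FO vectors of neighbours,
    w: per-atom l1 weights. Functional forms are assumptions only."""
    fidelity = np.sum((y - D @ f) ** 2)                              # data fidelity
    smooth = sum(1.0 - abs(float(fo @ nb)) for nb in neighbor_fos)   # FO dissimilarity
    sparsity = np.sum(w * np.abs(f))                                 # weighted l1 on fractions
    return fidelity + lam_smooth * smooth + lam_l1 * sparsity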
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuma, Christian; Sauer, Joachim, E-mail: js@chemie.hu-berlin.de
2015-09-14
A hybrid MP2:DFT (second-order Møller–Plesset perturbation theory–density functional theory) method that combines MP2 calculations for cluster models with DFT calculations for the full periodic structure is used to localize minima and transition structures for proton jumps at different Brønsted sites in different frameworks (chabazite, faujasite, ferrierite, and ZSM-5) and at different crystallographic positions of a given framework. The MP2 limit for the periodic structures is obtained by extrapolating the results of a series of cluster models of increasing size. A coupled-cluster (CCSD(T)) correction to MP2 energies is calculated for cluster models consisting of three tetrahedra. For the adsorption energies, this difference is small, between 0.1 and 0.9 kJ/mol, but for the intrinsic proton exchange barriers, this difference makes a significant (10.85 ± 0.25 kJ/mol) and almost constant contribution across different systems. The total values of the adsorption energies vary between 22 and 34 kJ/mol, whereas the total proton exchange energy barriers fall in the narrow range of 152–156 kJ/mol. After adding nuclear motion contributions (harmonic approximation, 298 K), intrinsic enthalpy barriers between 134 and 141 kJ/mol and apparent energy barriers between 105 and 118 kJ/mol are predicted for the different sites examined for the different frameworks. These predictions are consistent with experimental results available for faujasite, ferrierite, and ZSM-5.
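A minimal sketch of the hybrid-energy composition described above: the periodic DFT energy is corrected by the cluster-level MP2:DFT difference and, optionally, by a CCSD(T) increment. The function and the numbers in the example are illustrative only (the 10.85 kJ/mol figure simply reuses the magnitude quoted above; the other energies are placeholders).

def hybrid_energy(e_dft_periodic, e_mp2_cluster, e_dft_cluster, delta_ccsdt=0.0):
    """Hybrid MP2:DFT energy (kJ/mol): periodic DFT plus the cluster-level
    MP2 - DFT correction, optionally refined by a CCSD(T) increment."""
    return e_dft_periodic + (e_mp2_cluster - e_dft_cluster) + delta_ccsdt

# Placeholder numbers, not results from the study above:
barrier = hybrid_energy(e_dft_periodic=140.0, e_mp2_cluster=95.0,
                        e_dft_cluster=90.0, delta_ccsdt=10.85)
print("illustrative intrinsic barrier:", barrier, "kJ/mol")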
On the representability problem and the physical meaning of coarse-grained models
NASA Astrophysics Data System (ADS)
Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.; Voth, Gregory A.
2016-07-01
In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on CG coordinates. Often, in these cases it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and underlying FG models. Here we present and investigate these representability challenges and illustrate them via the bottom-up conceptual framework for several simple analytically tractable polymer models. The examples provide special focus on the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, and also discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons apply to top-down CG modeling as well, with crucial implications for simulation at constant pressure and surface tension and for the interpretations of structural and thermodynamic correlations for comparison to experiment.
NASA Astrophysics Data System (ADS)
Moriarty, Patrick; Sanz Rodrigo, Javier; Gancarski, Pawel; Chuchfield, Matthew; Naughton, Jonathan W.; Hansen, Kurt S.; Machefaux, Ewan; Maguire, Eoghan; Castellani, Francesco; Terzi, Ludovico; Breton, Simon-Philippe; Ueda, Yuko
2014-06-01
Researchers within the International Energy Agency (IEA) Task 31: Wakebench have created a framework for the evaluation of wind farm flow models operating at the microscale level. The framework consists of a model evaluation protocol integrated with a web-based portal for model benchmarking (www.windbench.net). This paper provides an overview of the building-block validation approach applied to wind farm wake models, including best practices for the benchmarking and data processing procedures for validation datasets from wind farm SCADA and meteorological databases. A hierarchy of test cases has been proposed for wake model evaluation, from similarity theory of the axisymmetric wake and idealized infinite wind farm, to single-wake wind tunnel (UMN-EPFL) and field experiments (Sexbierum), to wind farm arrays in offshore (Horns Rev, Lillgrund) and complex terrain conditions (San Gregorio). A summary of results from the axisymmetric wake, Sexbierum, Horns Rev and Lillgrund benchmarks are used to discuss the state-of-the-art of wake model validation and highlight the most relevant issues for future development.
Global continental and ocean basin reconstructions since 200 Ma
NASA Astrophysics Data System (ADS)
Seton, M.; Müller, R. D.; Zahirovic, S.; Gaina, C.; Torsvik, T.; Shephard, G.; Talsma, A.; Gurnis, M.; Turner, M.; Maus, S.; Chandler, M.
2012-07-01
Global plate motion models provide a spatial and temporal framework for geological data and have been effective tools for exploring processes occurring at the earth's surface. However, published models either have insufficient temporal coverage or fail to treat tectonic plates in a self-consistent manner. They usually consider the motions of selected features attached to tectonic plates, such as continents, but generally do not explicitly account for the continuous evolution of plate boundaries through time. In order to explore the coupling between the surface and mantle, plate models are required that extend over at least a few hundred million years and treat plates as dynamic features with dynamically evolving plate boundaries. We have constructed a new type of global plate motion model consisting of a set of continuously-closing topological plate polygons with associated plate boundaries and plate velocities since the break-up of the supercontinent Pangea. Our model is underpinned by plate motions derived from reconstructing the seafloor-spreading history of the ocean basins and motions of the continents and utilizes a hybrid absolute reference frame, based on a moving hotspot model for the last 100 Ma, and a true-polar wander corrected paleomagnetic model for 200 to 100 Ma. Detailed regional geological and geophysical observations constrain plate boundary inception or cessation, and time-dependent geometry. Although our plate model is primarily designed as a reference model for a new generation of geodynamic studies by providing the surface boundary conditions for the deep earth, it is also useful for studies in disparate fields when a framework is needed for analyzing and interpreting spatio-temporal data.
Self-consistent approach for neutral community models with speciation
NASA Astrophysics Data System (ADS)
Haegeman, Bart; Etienne, Rampal S.
2010-03-01
Hubbell’s neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event in a birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical data of distributions of species’ abundances surprisingly well. More realistic speciation models have been proposed such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.
A Framework to Survey the Energy Efficiency of Installed Motor Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Prakash; Hasanbeigi, Ali; McKane, Aimee
2013-08-01
While motors are ubiquitous throughout the globe, there is insufficient data to properly assess their level of energy efficiency across regional boundaries. Furthermore, many of the existing data sets focus on motor efficiency and neglect the connected drive and system. Without a comprehensive survey of the installed motor system base, a baseline energy efficiency of a country or region's motor systems cannot be developed. The lack of data impedes government agencies, utilities, manufacturers, distributors, and energy managers when identifying where to invest resources to capture potential energy savings, creating programs aimed at reducing electrical energy consumption, or quantifying the impacts of such programs. This paper will outline a data collection framework for use when conducting a survey under a variety of execution models to characterize motor system energy efficiency within a country or region. The framework is intended to standardize the data collected, ensuring consistency across independently conducted surveys. Consistency allows for the surveys to be leveraged against each other, enabling comparisons to motor system energy efficiencies from other regions. In creating the framework, an analysis of various motor driven systems, including compressed air, pumping, and fan systems, was conducted and relevant parameters characterizing the efficiency of these systems were identified. A database using the framework will enable policymakers and industry to better assess the improvement potential of their installed motor system base, particularly with respect to other regions, assisting in efforts to promote improvements to the energy efficiency of motor driven systems.
Reid, Marianne; Botma, Yvonne
2012-06-01
The study undertook the development of a framework for expanding the public services available to children with biomedical healthcare needs related to HIV in South Africa. The study consisted of various component projects which were depicted as phases. The first phase was a descriptive quantitative analysis of healthcare services for children exposed to or infected by HIV, as rendered by the public health sector in the Free State Province. The second stage was informed by health policy research: a nominal group technique with stakeholders was used to identify strategies for expanding the healthcare services available to these children. The third phase consisted of workshops with stakeholders in order to devise and validate a framework for the expansion. The theory of change logic model served as the theoretical underpinning of the draft framework. Triangulated data from the literature and the preceding two phases of the study provided the empirical foundation. The problem identified was that of fragmented care delivered to children exposed to or infected with HIV, due to the 'over-verticalization' of programmes. A workshop was held during which the desired results, the possible factors that could influence the results, as well as the suggested strategies to expand and integrate the public services available to HIV-affected children were confirmed. Thus the framework was finalised during the validation workshop by the researchers in collaboration with the stakeholders.
A Biophysical Modeling Framework for Assessing the Environmental Impact of Biofuel Production
NASA Astrophysics Data System (ADS)
Zhang, X.; Izaurradle, C.; Manowitz, D.; West, T. O.; Post, W. M.; Thomson, A. M.; Nichols, J.; Bandaru, V.; Williams, J. R.
2009-12-01
Long-term sustainability of a biofuel economy necessitates environmentally friendly biofuel production systems. We describe a biophysical modeling framework developed to understand and quantify the environmental value and impact (e.g. water balance, nutrients balance, carbon balance, and soil quality) of different biomass cropping systems. This modeling framework consists of three major components: 1) a Geographic Information System (GIS) based data processing system, 2) a spatially-explicit biophysical modeling approach, and 3) a user friendly information distribution system. First, we developed a GIS to manage the large amount of geospatial data (e.g. climate, land use, soil, and hydrography) and extract input information for the biophysical model. Second, the Environmental Policy Integrated Climate (EPIC) biophysical model is used to predict the impact of various cropping systems and management intensities on productivity, water balance, and biogeochemical variables. Finally, a geo-database is developed to distribute the results of ecosystem service variables (e.g. net primary productivity, soil carbon balance, soil erosion, nitrogen and phosphorus losses, and N2O fluxes) simulated by EPIC for each spatial modeling unit online using PostgreSQL. We applied this framework in a Regional Intensive Management Area (RIMA) of 9 counties in Michigan. A total of 4,833 spatial units with relatively homogeneous biophysical properties were derived using SSURGO, Crop Data Layer, County, and 10-digit watershed boundaries. For each unit, EPIC was executed from 1980 to 2003 under 54 cropping scenarios (e.g., corn, switchgrass, and hybrid poplar). The simulation results were compared with historical crop yields from USDA NASS. Spatial mapping of the results shows high variability among different cropping scenarios in terms of the simulated ecosystem services variables. Overall, the framework developed in this study enables the incorporation of environmental factors into economic and life-cycle analysis in order to optimize biomass cropping production scenarios.
A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.
2004-12-01
The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve, and the overall conceptual framework is refined. The development of the conceptual framework becomes an on-going process. We will describe the current state of this framework and the open questions that have to be addressed in the future.
Tuo, Rui; Jeff Wu, C. F.
2016-07-19
Calibration parameters in deterministic computer experiments are those attributes that cannot be measured or are otherwise unavailable in physical experiments. Here, we present an approach to estimate them by using data from physical experiments and computer simulations. A theoretical framework is given which allows us to study the issues of parameter identifiability and estimation. We define the L2-consistency for calibration as a justification for calibration methods. It is shown that a simplified version of the original KO method leads to asymptotically L2-inconsistent calibration. This L2-inconsistency can be remedied by modifying the original estimation procedure. A novel calibration method, called the L2 calibration, is proposed and proven to be L2-consistent and to enjoy an optimal convergence rate. Furthermore, a numerical example and some mathematical analysis are used to illustrate the source of the L2-inconsistency problem.
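A toy Python sketch of the calibration setting described above: a calibration parameter is chosen to minimise the (discretised) L2 discrepancy between physical observations and computer-model output. The simulator, data, and optimiser settings are illustrative assumptions, not the cited L2 calibration procedure itself.

import numpy as np
from scipy.optimize import minimize_scalar

# Toy "physical" data generated from an unknown truth plus noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y_phys = np.sin(2.0 * x) + rng.normal(0.0, 0.05, x.size)

def computer_model(x, theta):
    # stand-in deterministic simulator; theta is the calibration parameter
    return np.sin(theta * x)

# L2-style calibration: minimise the integrated squared discrepancy,
# approximated here by a mean squared error over the design points
res = minimize_scalar(lambda th: np.mean((y_phys - computer_model(x, th)) ** 2),
                      bounds=(0.5, 4.0), method="bounded")
print("calibrated theta:", round(res.x, 3))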
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
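A minimal Python sketch of the tree idea described above: leaf nodes wrap individual process equations and an internal node applies a coupling strategy to its children (here a simple operator-split weak coupling). The class and variable names are hypothetical and do not reflect the Arcos API.

class Leaf:
    """A single process: advances one equation given a shared state dict."""
    def __init__(self, name, advance):
        self.name, self.advance = name, advance
    def step(self, state, dt):
        self.advance(state, dt)

class WeakCoupler:
    """Internal node: operator-split (weak) coupling of its children."""
    def __init__(self, children):
        self.children = children
    def step(self, state, dt):
        for child in self.children:
            child.step(state, dt)

# Two toy processes coupled weakly through a shared state
def energy(state, dt): state["T"] += dt * (-0.1 * state["T"])      # cooling
def flow(state, dt):   state["q"] += dt * (0.5 - 0.2 * state["q"])  # relaxation to steady flow

tree = WeakCoupler([Leaf("energy", energy), Leaf("flow", flow)])
state = {"T": 300.0, "q": 0.0}
for _ in range(10):
    tree.step(state, dt=0.1)
print(state)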
Assessing Inter-Sectoral Climate Change Risks: The Role of ISIMIP
NASA Technical Reports Server (NTRS)
Rosenzweig, Cynthia; Arnell, Nigel W.; Ebi, Kristie L.; Lotze-Campen, Hermann; Raes, Frank; Rapley, Chris; Smith, Mark Stafford; Cramer, Wolfgang; Frieler, Katja; Reyer, Christopher P. O.;
2017-01-01
The aims of the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) are to provide a framework for the intercomparison of global and regional-scale risk models within and across multiple sectors and to enable coordinated multi-sectoral assessments of different risks and their aggregated effects. The overarching goal is to use the knowledge gained to support adaptation and mitigation decisions that require regional or global perspectives within the context of facilitating transformations to enable sustainable development, despite inevitable climate shifts and disruptions. ISIMIP uses community-agreed sets of scenarios with standardized climate variables and socioeconomic projections as inputs for projecting future risks and associated uncertainties, within and across sectors. The results are consistent multi-model assessments of sectoral risks and opportunities that enable studies that integrate across sectors, providing support for implementation of the Paris Agreement under the United Nations Framework Convention on Climate Change.
Modeling Urban Scenarios & Experiments: Fort Indiantown Gap Data Collections Summary and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Daniel E.; Bandstra, Mark S.; Davidson, Gregory G.
This report summarizes experimental radiation detector, contextual sensor, weather, and global positioning system (GPS) data collected to inform and validate a comprehensive, operational radiation transport modeling framework to evaluate radiation detector system and algorithm performance. This framework will be used to study the influence of systematic effects (such as geometry, background activity, background variability, environmental shielding, etc.) on detector responses and algorithm performance using synthetic time series data. This work consists of performing data collection campaigns at a canonical, controlled environment for complete radiological characterization to help construct and benchmark a high-fidelity model with quantified system geometries, detector response functions, and source terms for background and threat objects. This data also provides an archival, benchmark dataset that can be used by the radiation detection community. The data reported here spans four data collection campaigns conducted between May 2015 and September 2016.
Automatic classification of animal vocalizations
NASA Astrophysics Data System (ADS)
Clemins, Patrick J.
2005-11-01
Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing for bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
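A hedged Python sketch of the general pipeline described above, training one hidden Markov model per call type and classifying a recording by maximum log-likelihood. Because gPLP features are not available in standard libraries, MFCCs are used as a stand-in; the librosa and hmmlearn calls, the number of states, and the file-path layout are illustrative assumptions rather than the dissertation's configuration.

import numpy as np
import librosa
from hmmlearn.hmm import GaussianHMM

def features(path, sr=16000, n_mfcc=13):
    y, sr = librosa.load(path, sr=sr)
    # frames x coefficients; MFCCs stand in here for species-tuned gPLP features
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def train_models(labelled_paths, n_states=5):
    """labelled_paths: dict mapping call type -> list of training wav paths (hypothetical).
    For simplicity, training sequences are concatenated; hmmlearn also accepts a lengths argument."""
    models = {}
    for label, paths in labelled_paths.items():
        X = np.vstack([features(p) for p in paths])
        models[label] = GaussianHMM(n_components=n_states, covariance_type="diag").fit(X)
    return models

def classify(models, path):
    X = features(path)
    return max(models, key=lambda label: models[label].score(X))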
Model-independent cosmological constraints from growth and expansion
NASA Astrophysics Data System (ADS)
L'Huillier, Benjamin; Shafieloo, Arman; Kim, Hyungjin
2018-05-01
Reconstructing the expansion history of the Universe from Type Ia supernovae data, we fit the growth rate measurements and put model-independent constraints on some key cosmological parameters, namely, Ωm, γ, and σ8. The constraints are consistent with those from the concordance model within the framework of general relativity, but the current quality of the data is not sufficient to rule out modified gravity models. Adding the condition that dark energy density should be positive at all redshifts, independently of its equation of state, further constrains the parameters and interestingly supports the concordance model.
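For context, the growth-rate parameterization commonly used when fitting such measurements can be sketched as f(z) ≈ Ωm(z)^γ with fσ8(z) = f(z) σ8 D(z)/D(0). The Python snippet below implements that textbook relation for a flat background with constant dark energy; it is not the paper's model-independent reconstruction from supernovae, and the parameter values are placeholders.

import numpy as np

def omega_m_z(z, om0):
    """Matter fraction vs redshift for a flat universe with constant dark energy."""
    return om0 * (1 + z) ** 3 / (om0 * (1 + z) ** 3 + 1 - om0)

def f_sigma8(z_grid, om0=0.3, gamma=0.55, sigma8=0.8):
    """Growth rate f = Om(z)^gamma and fsigma8(z) = f(z) * sigma8 * D(z)/D(0),
    with D obtained by integrating dlnD/dlna = f over the scale factor."""
    a = 1.0 / (1.0 + z_grid[::-1])                 # increasing scale factor
    f = omega_m_z(1.0 / a - 1.0, om0) ** gamma
    lnD = np.concatenate(([0.0], np.cumsum(0.5 * (f[1:] + f[:-1]) * np.diff(np.log(a)))))
    D = np.exp(lnD - lnD[-1])                      # normalise D(z=0) = 1
    return (f * sigma8 * D)[::-1]                  # back to the original z ordering

z = np.linspace(0.0, 2.0, 201)
print("fsigma8 at z=0:", round(float(f_sigma8(z)[0]), 3))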
Csiszar, Susan A; Meyer, David E; Dionisio, Kathie L; Egeghy, Peter; Isaacs, Kristin K; Price, Paul S; Scanlon, Kelly A; Tan, Yu-Mei; Thomas, Kent; Vallero, Daniel; Bare, Jane C
2016-11-01
Life Cycle Assessment (LCA) is a decision-making tool that accounts for multiple impacts across the life cycle of a product or service. This paper presents a conceptual framework to integrate human health impact assessment with risk screening approaches to extend LCA to include near-field chemical sources (e.g., those originating from consumer products and building materials) that have traditionally been excluded from LCA. A new generation of rapid human exposure modeling and high-throughput toxicity testing is transforming chemical risk prioritization and provides an opportunity for integration of screening-level risk assessment (RA) with LCA. The combined LCA and RA approach considers environmental impacts of products alongside risks to human health, which is consistent with regulatory frameworks addressing RA within a sustainability mindset. A case study is presented to juxtapose LCA and risk screening approaches for a chemical used in a consumer product. The case study demonstrates how these new risk screening tools can be used to inform toxicity impact estimates in LCA and highlights needs for future research. The framework provides a basis for developing tools and methods to support decision making on the use of chemicals in products.
Bures, Vladimír; Otcenásková, Tereza; Cech, Pavel; Antos, Karel
2012-11-01
Biological incidents jeopardising public health require decision-making that consists of one dominant feature: complexity. Therefore, public health decision-makers require appropriate support. Based on the analogy with business intelligence (BI) principles, the contextual analysis of the environment and available data resources, and conceptual modelling within systems and knowledge engineering, this paper proposes a general framework for computer-based decision support in the case of a biological incident. At the outset, the analysis of potential inputs to the framework is conducted and several resources such as demographic information, strategic documents, environmental characteristics, agent descriptors and surveillance systems are considered. Consequently, three prototypes were developed, tested and evaluated by a group of experts. Their selection was based on the overall framework scheme. Subsequently, an ontology prototype linked with an inference engine, a multi-agent-based model focusing on the simulation of an environment, and expert-system prototypes were created. All prototypes proved to be utilisable support tools for decision-making in the field of public health. Nevertheless, the research revealed further issues and challenges that might be investigated by both public health focused researchers and practitioners.
A Working Framework for Enabling International Science Data System Interoperability
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.
2016-07-01
For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.
Consistent multiphysics simulation of a central tower CSP plant as applied to ISTORE
NASA Astrophysics Data System (ADS)
Votyakov, Evgeny V.; Papanicolas, Costas N.
2017-06-01
We present a unified consistent multiphysics approach to model a central tower CSP plant. The framework for the model includes Monte Carlo ray tracing (RT) and computational fluid dynamics (CFD) components utilizing the OpenFOAM C++ software library. The RT part works effectively with complex surfaces of engineering design given in CAD formats. The CFD simulation, which is based on 3D Navier-Stokes equations, takes into account all possible heat transfer mechanisms: radiation, conduction, and convection. Utilizing this package, the solar field of the experimental Platform for Research, Observation, and TEchnological Applications in Solar Energy (PROTEAS) and the Integrated STOrage and Receiver (ISTORE), developed at the Cyprus Institute, are being examined.
Pressure calculation in hybrid particle-field simulations
NASA Astrophysics Data System (ADS)
Milano, Giuseppe; Kawakatsu, Toshihiro
2010-12-01
In the framework of a recently developed scheme for hybrid particle-field simulation techniques, in which self-consistent field (SCF) theory and particle models (molecular dynamics) are combined [J. Chem. Phys. 130, 214106 (2009)], we developed a general formulation for the calculation of the instantaneous pressure and stress tensor. The expressions have been derived from the statistical mechanical definition of the pressure, starting from the expression for the free energy functional in the SCF theory. An implementation of the derived formulation suitable for hybrid particle-field molecular dynamics-self-consistent field simulations is described. A series of test simulations on model systems are reported, comparing the calculated pressure with those obtained from standard molecular dynamics simulations based on pair potentials.
Validating a new methodology for strain estimation from cardiac cine MRI
NASA Astrophysics Data System (ADS)
Elnakib, Ahmed; Beache, Garth M.; Gimel'farb, Georgy; Inanc, Tamer; El-Baz, Ayman
2013-10-01
This paper focuses on validating a novel framework for estimating the functional strain from cine cardiac magnetic resonance imaging (CMRI). The framework consists of three processing steps. First, the left ventricle (LV) wall borders are segmented using a level-set based deformable model. Second, the points on the wall borders are tracked during the cardiac cycle based on solving the Laplace equation between the LV edges. Finally, the circumferential and radial strains are estimated at the inner, mid-wall, and outer borders of the LV wall. The proposed framework is validated using synthetic phantoms of the material strains that account for the physiological features and the LV response during the cardiac cycle. Experimental results on simulated phantom images confirm the accuracy and robustness of our method.
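A minimal sketch of the final step only (strain from tracked contour points), assuming circumferential strain is computed as the relative change in contour arc length between end-diastole and a later frame; the segmentation and Laplace-equation tracking steps are not shown, and the helper names are hypothetical.

import numpy as np

def contour_length(points):
    """Perimeter of a closed contour given as an (N, 2) array of x-y points."""
    d = np.diff(np.vstack([points, points[:1]]), axis=0)
    return np.sum(np.hypot(d[:, 0], d[:, 1]))

def circumferential_strain(contour_ed, contour_t):
    """Engineering strain (L - L0) / L0 between end-diastole and frame t."""
    l0, l = contour_length(contour_ed), contour_length(contour_t)
    return (l - l0) / l0

# Toy example: a circular contour contracting by 10 percent
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ring = np.c_[np.cos(theta), np.sin(theta)]
print(round(circumferential_strain(ring, 0.9 * ring), 3))   # approximately -0.1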
The Finite Strain Johnson Cook Plasticity and Damage Constitutive Model in ALEGRA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanchez, Jason James
A finite strain formulation of the Johnson Cook plasticity and damage model and its numerical implementation into the ALEGRA code is presented. The goal of this work is to improve the predictive material failure capability of the Johnson Cook model. The new implementation consists of a coupling of damage and the stored elastic energy as well as the minimum failure strain criteria for spall included in the original model development. This effort establishes the necessary foundation for a thermodynamically consistent and complete continuum solid material model, for which all intensive properties derive from a common energy. The motivation for developing such a model is to improve upon ALEGRA's present combined model framework. Several applications of the new Johnson Cook implementation are presented. Deformation driven loading paths demonstrate the basic features of the new model formulation. Use of the model produces good comparisons with experimental Taylor impact data. Localized deformation leading to fragmentation is produced for expanding ring and exploding cylinder applications.
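For reference, the widely used (small-strain) Johnson-Cook flow stress and failure-strain expressions can be sketched as below; damage is typically accumulated as D = Σ Δεp/εf and failure declared at D = 1. This is the textbook form, not the finite-strain ALEGRA implementation described above, and the material constants shown are placeholders rather than values from the report.

import math

def jc_flow_stress(eps_p, eps_rate, T, A=350e6, B=275e6, n=0.36, C=0.022, m=1.0,
                   eps_rate0=1.0, T_room=293.0, T_melt=1356.0):
    """Johnson-Cook flow stress (Pa): (A + B*eps^n)(1 + C*ln(rate*))(1 - T*^m)."""
    t_star = max((T - T_room) / (T_melt - T_room), 0.0)
    rate_star = max(eps_rate / eps_rate0, 1e-12)
    return (A + B * eps_p ** n) * (1.0 + C * math.log(rate_star)) * (1.0 - t_star ** m)

def jc_failure_strain(sigma_star, eps_rate, T, D=(0.54, 4.89, -3.03, 0.014, 1.12),
                      eps_rate0=1.0, T_room=293.0, T_melt=1356.0):
    """Johnson-Cook damage model failure strain; sigma_star is the stress triaxiality."""
    t_star = max((T - T_room) / (T_melt - T_room), 0.0)
    d1, d2, d3, d4, d5 = D
    return ((d1 + d2 * math.exp(d3 * sigma_star))
            * (1.0 + d4 * math.log(max(eps_rate / eps_rate0, 1e-12)))
            * (1.0 + d5 * t_star))

print(round(jc_flow_stress(eps_p=0.1, eps_rate=1e3, T=400.0) / 1e6, 1), "MPa")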
A model for the pilot's use of motion cues in roll-axis tracking tasks
NASA Technical Reports Server (NTRS)
Levison, W. H.; Junker, A. M.
1977-01-01
Simulated target-following and disturbance-regulation tasks were explored with subjects using visual-only and combined visual and motion cues. The effects of motion cues on task performance and pilot response behavior were appreciably different for the two task configurations and were consistent with data reported in earlier studies for similar task configurations. The optimal-control model for pilot/vehicle systems provided a task-independent framework for accounting for the pilot's use of motion cues. Specifically, the availability of motion cues was modeled by augmenting the set of perceptual variables to include position, rate, acceleration, and acceleration-rate of the motion simulator, and results were consistent with the hypothesis of attention-sharing between visual and motion variables. This straightforward informational model allowed accurate model predictions of the effects of motion cues on a variety of response measures for both the target-following and disturbance-regulation tasks.
Chatterjee, Abhijit; Vlachos, Dionisios G
2007-07-21
While recently derived continuum mesoscopic equations successfully bridge the gap between microscopic and macroscopic physics, so far they have been derived only for simple lattice models. In this paper, general deterministic continuum mesoscopic equations are derived rigorously via nonequilibrium statistical mechanics to account for multiple interacting surface species and multiple processes on multiple site types and/or different crystallographic planes. Adsorption, desorption, reaction, and surface diffusion are modeled. It is demonstrated that contrary to conventional phenomenological continuum models, microscopic physics, such as the interaction potential, determines the final form of the mesoscopic equation. Models of single component diffusion and binary diffusion of interacting particles on single-type site lattice and of single component diffusion on complex microporous materials' lattices consisting of two types of sites are derived, as illustrations of the mesoscopic framework. Simplification of the diffusion mesoscopic model illustrates the relation to phenomenological models, such as the Fickian and Maxwell-Stefan transport models. It is demonstrated that the mesoscopic equations are in good agreement with lattice kinetic Monte Carlo simulations for several prototype examples studied.
Electric train energy consumption modeling
Wang, Jinghui; Rakha, Hesham A.
2017-05-01
For this paper we develop an electric train energy consumption modeling framework considering instantaneous regenerative braking efficiency in support of a rail simulation system. The model is calibrated with data from Portland, Oregon using an unconstrained non-linear optimization procedure, and validated using data from Chicago, Illinois by comparing model predictions against the National Transit Database (NTD) estimates. The results demonstrate that regenerative braking efficiency varies as an exponential function of the deceleration level, rather than an average constant as assumed in previous studies. The model predictions are demonstrated to be consistent with the NTD estimates, producing prediction errors of 1.87% and -2.31%. The paper demonstrates that energy recovery reduces the overall power consumption by 20% for the tested Chicago route. Furthermore, the paper demonstrates that the proposed modeling approach is able to capture energy consumption differences associated with train, route and operational parameters, and thus is applicable for project-level analysis. The model can be easily implemented in traffic simulation software, used in smartphone applications and eco-transit programs given its fast execution time and easy integration in complex frameworks.
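To illustrate the difference between a constant and a deceleration-dependent regenerative efficiency, the sketch below assumes a saturating-exponential form with made-up coefficients; the published functional form and calibrated parameters are not reproduced here.

import numpy as np

def regen_efficiency(decel, eta_max=0.65, k=1.3):
    """Assumed form: efficiency rises exponentially toward eta_max with |deceleration|
    in m/s^2. The exact published form and coefficients are not reproduced."""
    return eta_max * (1.0 - np.exp(-k * np.abs(decel)))

# Braking event: 20 m/s to rest at a constant 1.0 m/s^2, 40 t train (illustrative)
mass, v0, decel = 40_000.0, 20.0, 1.0
kinetic = 0.5 * mass * v0 ** 2                       # J available at the wheels
recovered_variable = regen_efficiency(decel) * kinetic
recovered_constant = 0.65 * kinetic                  # naive constant-efficiency estimate
print(f"deceleration-dependent recovery: {recovered_variable / 3.6e6:.2f} kWh")
print(f"constant-efficiency recovery:    {recovered_constant / 3.6e6:.2f} kWh")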
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram
Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.
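A toy sketch of the trial-function idea described above: a capacity-fade model combining an Arrhenius-rate, square-root-of-time calendar term with a linear cycling term is regressed to synthetic aging data with scipy. The functional forms, parameter values, and data are illustrative assumptions, not the report's down-selected model.

import numpy as np
from scipy.optimize import curve_fit

R_GAS = 8.314  # J/mol/K

def capacity_fade(X, k_ref, Ea, b, T_ref=298.0):
    """Relative capacity after t days at temperature T_K and n equivalent full cycles.
    Trial functions (assumed): sqrt-time calendar fade with Arrhenius temperature
    dependence referenced to T_ref, plus linear cycling fade."""
    t_days, T_K, n_cyc = X
    calendar = k_ref * np.exp(-Ea / R_GAS * (1.0 / T_K - 1.0 / T_ref)) * np.sqrt(t_days)
    cycling = b * n_cyc
    return 1.0 - calendar - cycling

# Synthetic "aging dataset" spanning three temperatures (illustrative only)
rng = np.random.default_rng(1)
t = rng.uniform(10, 400, 60)
T = rng.choice([298.0, 318.0, 333.0], 60)
n = rng.uniform(0, 500, 60)
q = capacity_fade((t, T, n), 2.0e-4, 4.0e4, 5.0e-5) + rng.normal(0, 0.002, 60)

popt, _ = curve_fit(capacity_fade, (t, T, n), q,
                    p0=[1.0e-4, 3.5e4, 1.0e-4], maxfev=20000)
print("fitted (k_ref, Ea, b):", np.round(popt, 6))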
Detailed assessment of global transport-energy models’ structures and projections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeh, Sonia; Mishra, Gouri Shankar; Fulton, Lew
This paper focuses on comparing the frameworks and projections from four major global transportation models with considerable transportation technology and behavioral detail. We analyze and compare the modeling frameworks, underlying data, assumptions, intermediate parameters, and projections to identify the sources of divergence or consistency, as well as key knowledge gaps. We find that there are significant differences in the base-year data and key parameters for future projections, especially for developing countries. These include passenger and freight activity, mode shares, vehicle ownership rates, and even energy consumption by mode, particularly for shipping, aviation and trucking. This may be due in part to a lack of previous efforts to do such consistency-checking and “bench-marking.” We find that the four models differ in terms of the relative roles of various mitigation strategies to achieve a 2°C / 450 ppm CO2e target: the economics-based integrated assessment models favor the use of low carbon fuels as the primary mitigation option followed by efficiency improvements, whereas transport-only and expert-based models favor efficiency improvements of vehicles followed by mode shifts. We offer recommendations for future modeling improvements focusing on (1) reducing data gaps; (2) translating the findings from this study into relevant policy implications such as feasibility of current policy goals, additional policy targets needed, regional vs. global reductions, etc.; (3) modeling strata of demographic groups to improve understanding of vehicle ownership levels, travel behavior, and urban vs. rural considerations; and (4) conducting coordinated efforts in aligning input assumptions and historical data, policy analysis, and modeling insights.
Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.
2007-01-01
Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
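A minimal sketch of the framework's central assumptions as stated above: spherical particles, one cylindrical pore per particle-size class with volume equal to the particle volume times the void ratio, and capillary inverse proportionality between pore radius and matric head. The pore length (taken here as 2R) and the example inputs are illustrative choices, not the paper's specific models.

import numpy as np

SIGMA = 0.072    # N/m, surface tension of water
RHO_G = 9810.0   # N/m^3, unit weight of water

def retention_from_texture(particle_radii_m, mass_fractions, void_ratio):
    """Map each particle-size class to a pore radius and a matric head (m),
    and pair it with a cumulative water content filled from the finest class up."""
    R = np.asarray(particle_radii_m, dtype=float)
    frac = np.asarray(mass_fractions, dtype=float)
    order = np.argsort(R)                                     # finest particles first
    R, frac = R[order], frac[order]
    pore_volume = (4.0 / 3.0) * np.pi * R ** 3 * void_ratio   # particle volume x void ratio
    pore_radius = np.sqrt(pore_volume / (2.0 * np.pi * R))    # cylinder of length 2R
    head_m = 2.0 * SIGMA / (RHO_G * pore_radius)              # capillary inverse proportionality
    porosity = void_ratio / (1.0 + void_ratio)
    # at head_m[i], pores of class i and finer are still water-filled
    theta = porosity * np.cumsum(frac)
    return head_m, theta

h, theta = retention_from_texture([1e-6, 1e-5, 1e-4], [0.2, 0.3, 0.5], void_ratio=0.6)
print(np.round(h, 3), np.round(theta, 3))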
Spatially explicit land-use and land-cover scenarios for the Great Plains of the United States
Sohl, Terry L.; Sleeter, Benjamin M.; Sayler, Kristi L.; Bouchard, Michelle A.; Reker, Ryan R.; Bennett, Stacie L.; Sleeter, Rachel R.; Kanengieter, Ronald L.; Zhu, Zhi-Liang
2012-01-01
The Great Plains of the United States has undergone extensive land-use and land-cover change in the past 150 years, with much of the once vast native grasslands and wetlands converted to agricultural crops, and much of the unbroken prairie now heavily grazed. Future land-use change in the region could have dramatic impacts on ecological resources and processes. A scenario-based modeling framework is needed to support the analysis of potential land-use change in an uncertain future, and to mitigate potentially negative future impacts on ecosystem processes. We developed a scenario-based modeling framework to analyze potential future land-use change in the Great Plains. A unique scenario construction process, using an integrated modeling framework, historical data, workshops, and expert knowledge, was used to develop quantitative demand for future land-use change for four IPCC scenarios at the ecoregion level. The FORE-SCE model ingested the scenario information and produced spatially explicit land-use maps for the region at relatively fine spatial and thematic resolutions. Spatial modeling of the four scenarios provided spatial patterns of land-use change consistent with underlying assumptions and processes associated with each scenario. Economically oriented scenarios were characterized by significant loss of natural land covers and expansion of agricultural and urban land uses. Environmentally oriented scenarios experienced changes ranging from modest declines to slight increases in natural land covers. Model results were assessed for quantity and allocation disagreement between each scenario pair. In conjunction with the U.S. Geological Survey's Biological Carbon Sequestration project, the scenario-based modeling framework used for the Great Plains is now being applied to the entire United States.
Kristensen, Finn Børlum; Lampe, Kristian; Wild, Claudia; Cerbo, Marina; Goettsch, Wim; Becla, Lidia
2017-02-01
The HTA Core Model ® as a science-based framework for assessing dimensions of value was developed as a part of the European network for Health Technology Assessment project in the period 2006 to 2008 to facilitate production and sharing of health technology assessment (HTA) information, such as evidence on efficacy and effectiveness and patient aspects, to inform decisions. It covers clinical value as well as organizational, economic, and patient aspects of technologies and has been field-tested in two consecutive joint actions in the period 2010 to 2016. A large number of HTA institutions were involved in the work. The model has undergone revisions and improvement after iterations of piloting and can be used in a local, national, or international context to produce structured HTA information that can be taken forward by users into their own frameworks to fit their specific needs when informing decisions on technology. The model has a broad scope and offers a common ground to various stakeholders through offering a standard structure and a transparent set of proposed HTA questions. It consists of three main components: 1) the HTA ontology, 2) methodological guidance, and 3) a common reporting structure. It covers domains such as effectiveness, safety, and economics, and also includes domains covering organizational, patient, social, and legal aspects. There is a full model and a focused rapid relative effectiveness assessment model, and a third joint action is to continue till 2020. The HTA Core Model is now available for everyone around the world as a framework for assessing value. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Turnbull, Wayne; Burton, Diana; Mullins, Pat
2008-01-01
The UK higher education sector is grounded in an academic culture protective of its autonomy in the exercise of academic judgement within a flexible and internally validated tradition. However, the socio-political demands placed upon this sector articulate an outcomes-based, transparent and consistent model of higher education provision, as…
Contributions of cultural services to the ecosystem services agenda
Terry C. Daniel; Andreas Muhar; Arne Arnberger; Olivier Aznar; James W. Boyd; Kai M.A. Chan; Robert Costanza; Thomas Elmqvist; Courtney G. Flint; Paul H. Gobster; A. Gret-Regamey; R. Lave; S. Muhar; M. Penker; R.G. Ribe; T. Schauppenlehner; T. Sikor; I. Soloviy; M. Spierenburg; K. Taczanowska; J. Tam; A. von der Dunk
2012-01-01
Cultural ecosystem services (ES) are consistently recognized but not yet adequately defined or integrated within the ES framework. A substantial body of models, methods, and data relevant to cultural services has been developed within the social and behavioral sciences before and outside of the ES approach. A selective review of work in landscape aesthetics, cultural...
Users' Interaction with World Wide Web Resources: An Exploratory Study Using a Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Hawk, William B.; Tenopir, Carol
2000-01-01
Presents results of a study that explores factors of user-Web interaction in finding factual information, develops a conceptual framework for studying user-Web interaction, and applies a process-tracing method for conducting holistic user-Web studies. Describes measurement techniques and proposes a model consisting of the user, interface, and the…
DOT National Transportation Integrated Search
1997-08-01
This document provides guidelines for the review of state ITS/CVO business plans by the FHWA. The purpose of the review is to ensure that the state business plans have been developed in a manner consistent with the FHWA's intent in awarding ITS/CVO...
Kimberley K. Ayre; Wayne G. Landis
2012-01-01
We present a Bayesian network model based on the ecological risk assessment framework to evaluate potential impacts to habitats and resources resulting from wildfire, grazing, forest management activities, and insect outbreaks in a forested landscape in northeastern Oregon. The Bayesian network structure consisted of three tiers of nodes: landscape disturbances,...
NASA Astrophysics Data System (ADS)
Carniel, Roberto; Di Cecca, Mauro; Jaquet, Olivier
2006-05-01
In the framework of the EU-funded project "Multi-disciplinary monitoring, modelling and forecasting of volcanic hazard" (MULTIMO), multiparametric data have been recorded at the MULTIMO station in Montserrat. Moreover, several other long time series, recorded at Montserrat and at other volcanoes, have been acquired in order to test stochastic and deterministic methodologies under development. Creating a general framework to handle data efficiently is a considerable task even for homogeneous data. In the case of heterogeneous data, this becomes a major issue. A need for a consistent way of browsing such a heterogeneous dataset in a user-friendly way therefore arose. Additionally, a framework for applying the calculation of the developed dynamical parameters on the data series was also needed in order to easily keep these parameters under control, e.g. for monitoring, research or forecasting purposes. The solution which we present is completely based on Open Source software, including Linux operating system, MySql database management system, Apache web server, Zope application server, Scilab math engine, Plone content management framework, Unified Modelling Language. From the user point of view the main advantage is the possibility of browsing through datasets recorded on different volcanoes, with different instruments, with different sampling frequencies, stored in different formats, all via a consistent, user- friendly interface that transparently runs queries to the database, gets the data from the main storage units, generates the graphs and produces dynamically generated web pages to interact with the user. The involvement of third parties for continuing the development in the Open Source philosophy and/or extending the application fields is now sought.
Modeling dynamics of large tabular icebergs submerged in the ocean
NASA Astrophysics Data System (ADS)
Adcroft, A.; Stern, A. A.; Sergienko, O. V.
2017-12-01
Large tabular icebergs account for a major fraction of the ice calved from the Antarctic ice shelves, and have long lifetimes due to their size. They drift for long distances, interacting with the local ocean circulation, impacting bottom-water formation, sea-ice formation, and biological productivity in the vicinity of the icebergs. However, due to their large horizontal extent and mass, it is challenging to consistently represent large tabular icebergs in global ocean circulation models and so large tabular icebergs are not currently represented in climate models. In this study we develop a novel framework to model large tabular icebergs submerged in the ocean. In this framework, a tabular iceberg is represented by a collection of Lagrangian elements that are linked through rigid bonds. The Lagrangian elements are finite-area modifications of the point-particles used in previous studies to represent small icebergs. These elements interact with the ocean by exerting pressure on the ocean surface, and through melt water and momentum exchange. A breaking of the rigid bonds allows the model to emulate calving events (i.e. detachment of a tabular iceberg from an ice shelf), and to emulate the breaking up of tabular icebergs into smaller pieces. Idealized simulations of the calving of a tabular iceberg, subsequent drift and breakup, demonstrate the capabilities of the new framework with a promise that climate models may soon be able to represent large tabular icebergs.
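A minimal sketch of the bonded-element idea follows, with stiff elastic springs standing in for the paper's rigid bonds; the element masses, bond stiffness, drag law, and time stepping are illustrative assumptions rather than the published model.

```python
import numpy as np

# Toy bonded-element iceberg: a 3x2 grid of finite-area Lagrangian elements
# linked to their neighbours. Rigid bonds are approximated by stiff springs.
nx, ny = 3, 2
dx = 1.0e3                       # element spacing (m), illustrative
pos = np.array([[i * dx, j * dx] for j in range(ny) for i in range(nx)], float)
vel = np.zeros_like(pos)
mass = 1.0e9                     # kg per element, illustrative
k_bond = 5.0e4                   # N/m, stand-in for a rigid bond
c_drag = 2.0e5                   # kg/s, linear drag against the ocean flow

bonds = [(a, b) for a in range(nx * ny) for b in range(a + 1, nx * ny)
         if np.isclose(np.linalg.norm(pos[a] - pos[b]), dx)]
rest = {b: dx for b in bonds}

ocean_u = np.array([0.2, 0.05])  # ambient ocean velocity (m/s), illustrative

dt = 10.0
for step in range(1000):
    force = c_drag * (ocean_u - vel)          # drag towards the ocean flow
    for (a, b) in bonds:
        d = pos[b] - pos[a]
        L = np.linalg.norm(d)
        f = k_bond * (L - rest[(a, b)]) * d / L
        force[a] += f
        force[b] -= f
    vel += dt * force / mass
    pos += dt * vel

print("centre-of-mass drift (km):", pos.mean(axis=0) / 1e3)
```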
Halliday, David M; Senik, Mohd Harizal; Stevenson, Carl W; Mason, Rob
2016-08-01
The ability to infer network structure from multivariate neuronal signals is central to computational neuroscience. Directed network analyses typically use parametric approaches based on auto-regressive (AR) models, where networks are constructed from estimates of AR model parameters. However, the validity of using low order AR models for neurophysiological signals has been questioned. A recent article introduced a non-parametric approach to estimate directionality in bivariate data; non-parametric approaches are free from concerns over model validity. We extend the non-parametric framework to include measures of directed conditional independence, using scalar measures that decompose the overall partial correlation coefficient summatively by direction, and a set of functions that decompose the partial coherence summatively by direction. A time domain partial correlation function allows both time and frequency views of the data to be constructed. The conditional independence estimates are conditioned on a single predictor. The framework is applied to simulated cortical neuron networks and mixtures of Gaussian time series data with known interactions. It is applied to experimental data consisting of local field potential recordings from bilateral hippocampus in anaesthetised rats. The framework offers a novel, non-parametric alternative for estimating directed interactions in multivariate neuronal recordings, with increased flexibility in dealing with both spike train and time series data. Copyright © 2016 Elsevier B.V. All rights reserved.
VO2 estimation using 6-axis motion sensor with sports activity classification.
Nagata, Takashi; Nakamura, Naoteru; Miyatake, Masato; Yuuki, Akira; Yomo, Hiroyuki; Kawabata, Takashi; Hara, Shinsuke
2016-08-01
In this paper, we focus on oxygen consumption (VO2) estimation using a 6-axis motion sensor (3-axis accelerometer and 3-axis gyroscope) for people playing sports with diverse intensities. The VO2 estimated with a small motion sensor can be used to calculate the energy expenditure; however, its accuracy depends on the intensities of various types of activities. In order to achieve high accuracy over a wide range of intensities, we employ an estimation framework that first classifies activities with a simple machine-learning based classification algorithm. We prepare different coefficients of a linear regression model for different types of activities, which are determined with training data obtained by experiments. The best-suited model is used for each type of activity when VO2 is estimated. The accuracy of the employed framework depends on the trade-off between the degradation due to classification errors and the improvement brought by applying a separate, optimum model to VO2 estimation. Taking this trade-off into account, we evaluate the accuracy of the employed estimation framework by using a set of experimental data consisting of VO2 and motion data of people with a wide range of intensities of exercises, which were measured by a VO2 meter and motion sensor, respectively. Our numerical results show that the employed framework can improve the estimation accuracy in comparison to a reference method that uses a common regression model for all types of activities.
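The two-stage idea (classify the activity, then apply an activity-specific regression) can be sketched as follows; the feature construction, classifier choice, and synthetic data are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic training data: per-window motion features and measured VO2,
# labelled by activity type (0 = walking, 1 = running), purely illustrative.
n = 400
activity = rng.integers(0, 2, n)
features = rng.normal(loc=activity[:, None] * 2.0, scale=1.0, size=(n, 6))
vo2 = 10 + 5 * activity + features[:, 0] + rng.normal(0, 0.5, n)

# Stage 1: activity classification from the motion features.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(features, activity)

# Stage 2: one linear VO2 model per activity class.
models = {a: LinearRegression().fit(features[activity == a], vo2[activity == a])
          for a in np.unique(activity)}

# Inference: classify first, then apply the class-specific regression.
x_new = rng.normal(loc=2.0, scale=1.0, size=(1, 6))
a_hat = int(clf.predict(x_new)[0])
print("predicted activity:", a_hat, "estimated VO2:", float(models[a_hat].predict(x_new)[0]))
```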
From Solvent-Free to Dilute Electrolytes: Essential Components for a Continuum Theory.
Gavish, Nir; Elad, Doron; Yochelis, Arik
2018-01-04
The increasing number of experimental observations on highly concentrated electrolytes and ionic liquids show qualitative features that are distinct from dilute or moderately concentrated electrolytes, such as self-assembly, multiple-time relaxation, and underscreening, which all impact the emergence of fluid/solid interfaces, and the transport in these systems. Because these phenomena are not captured by existing mean-field models of electrolytes, there is a paramount need for a continuum framework for highly concentrated electrolytes and ionic liquid mixtures. In this work, we present a self-consistent spatiotemporal framework for a ternary composition that comprises ions and solvent employing a free energy that consists of short- and long-range interactions, along with an energy dissipation mechanism obtained by Onsager's relations. We show that the model can describe multiple bulk and interfacial morphologies at steady-state. Thus, the dynamic processes in the emergence of distinct morphologies become equally as important as the interactions that are specified by the free energy. The model equations not only provide insights into transport mechanisms beyond the Stokes-Einstein-Smoluchowski relations but also enable qualitative recovery of three distinct regions in the full range of the nonmonotonic electrical screening length that has been recently observed in experiments in which organic solvent is used to dilute ionic liquids.
NASA Astrophysics Data System (ADS)
Priyadi, Y.; Prasetio, A.
2018-03-01
This research developed an e-SCM application for a small-scale group of fish farmers at Ulekan Market, Bandung, built on Open Source technology and combining the implementation of e-SCM with data management. The supply chain business application was then developed by combining the Business Model Canvas with the Waterfall framework. The business process reengineering design produced a context diagram, called e-SCM SME Fish, consisting of five entities directly involved with the system: fish shop supervisor, fish shop retailer, employees, fish farmers, and customers. The context diagram was decomposed at Level 0 of e-SCM SME Fish, yielding a Level 1 Data Flow Diagram with four sub-processes: business partners, transactions, retailer stock, and documentation. The nine Business Model Canvas blocks for the e-SCM activity comprise the categories Priority 1, Priority 2, Direct, Indirect, Purchase/e-SCM, Transactional, Community, Asset Sale, Physical Asset, Human, Production, Strategic Alliance-Competitors, Coopetition, Buyer-Supplier Relationship, Fixed Cost, and Variable Cost. Data management is integrated on a localhost server using the address http://whyphi:8080, as a prototype that will soon be adopted by the fish farmers.
NASA Astrophysics Data System (ADS)
de Jong, Floor; van Hillegersberg, Jos; van Eck, Pascal; van der Kolk, Feiko; Jorissen, Rene
The lack of effective IT governance is widely recognized as a key inhibitor to successful global IT outsourcing relationships. In this study we present the development and application of a governance framework to improve outsourcing relationships. The approach used to developing an IT governance framework includes a meta model and a customization process to fit the framework to the target organization. The IT governance framework consists of four different elements (1) organisational structures, (2) joint processes between in- and outsourcer, (3) responsibilities that link roles to processes and (4) a diverse set of control indicators to measure the success of the relationship. The IT governance framework is put in practice in Shell GFIT BAM, a part of Shell that concluded to have a lack of management control over at least one of their outsourcing relationships. In a workshop the governance framework was used to perform a gap analysis between the current and desired governance. Several gaps were identified in the way roles and responsibilities are assigned and joint processes are set-up. Moreover, this workshop also showed the usefulness and usability of the IT governance framework in structuring, providing input and managing stakeholders in the discussions around IT governance.
Spielman, Stephanie J; Wilke, Claus O
2016-11-01
The mutation-selection model of coding sequence evolution has received renewed attention for its use in estimating site-specific amino acid propensities and selection coefficient distributions. Two computationally tractable mutation-selection inference frameworks have been introduced: One framework employs a fixed-effects, highly parameterized maximum likelihood approach, whereas the other employs a random-effects Bayesian Dirichlet Process approach. While both implementations follow the same model, they appear to make distinct predictions about the distribution of selection coefficients. The fixed-effects framework estimates a large proportion of highly deleterious substitutions, whereas the random-effects framework estimates that all substitutions are either nearly neutral or weakly deleterious. It remains unknown, however, how accurately each method infers evolutionary constraints at individual sites. Indeed, selection coefficient distributions pool all site-specific inferences, thereby obscuring a precise assessment of site-specific estimates. Therefore, in this study, we use a simulation-based strategy to determine how accurately each approach recapitulates the selective constraint at individual sites. We find that the fixed-effects approach, despite its extensive parameterization, consistently and accurately estimates site-specific evolutionary constraint. By contrast, the random-effects Bayesian approach systematically underestimates the strength of natural selection, particularly for slowly evolving sites. We also find that, despite the strong differences between their inferred selection coefficient distributions, the fixed- and random-effects approaches yield surprisingly similar inferences of site-specific selective constraint. We conclude that the fixed-effects mutation-selection framework provides the more reliable software platform for model application and future development. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perdikaris, Paris, E-mail: parisp@mit.edu; Grinberg, Leopold, E-mail: leopoldgrinberg@us.ibm.com; Karniadakis, George Em, E-mail: george-karniadakis@brown.edu
The aim of this work is to present an overview of recent advances in multi-scale modeling of brain blood flow. In particular, we present some approaches that enable the in silico study of multi-scale and multi-physics phenomena in the cerebral vasculature. We discuss the formulation of continuum and atomistic modeling approaches, present a consistent framework for their concurrent coupling, and list some of the challenges that one needs to overcome in achieving a seamless and scalable integration of heterogeneous numerical solvers. The effectiveness of the proposed framework is demonstrated in a realistic case involving modeling the thrombus formation process taking place on the wall of a patient-specific cerebral aneurysm. This highlights the ability of multi-scale algorithms to resolve important biophysical processes that span several spatial and temporal scales, potentially yielding new insight into the key aspects of brain blood flow in health and disease. Finally, we discuss open questions in multi-scale modeling and emerging topics of future research.
a Framework for Architectural Heritage Hbim Semantization and Development
NASA Astrophysics Data System (ADS)
Brusaporci, S.; Maiezza, P.; Tata, A.
2018-05-01
Despite the recognized advantages of the use of BIM in the field of architecture and engineering, the extension of this procedure to the architectural heritage is neither immediate nor critical. The uniqueness and irregularity of historical architecture, on the one hand, and the great quantity of information necessary for the knowledge of architectural heritage, on the other, require appropriate reflections. The aim of this paper is to define a general framework for the use of BIM procedures for architectural heritage. The proposed methodology consists of three different Level of Development (LoD), depending on the characteristics of the building and the objectives of the study: a simplified model with a low geometric accuracy and a minimum quantity of information (LoD 200); a model nearer to the reality but, however, with a high deviation between virtual and real model (LoD 300); a detailed BIM model that reproduce as much as possible the geometric irregularities of the building and is enriched by the maximum quantity of information available (LoD 400).
Jafari, Mohieddin; Ansari-Pour, Naser; Azimzadeh, Sadegh; Mirzaie, Mehdi
2017-01-01
It is nearly half a century past the age of the introduction of the Central Dogma (CD) of molecular biology. This biological axiom has been developed and currently appears to be all the more complex. In this study, we modified CD by adding further species to the CD information flow and mathematically expressed CD within a dynamic framework by using Boolean network based on its present-day and 1965 editions. We show that the enhancement of the Dogma not only now entails a higher level of complexity, but it also shows a higher level of robustness, thus far more consistent with the nature of biological systems. Using this mathematical modeling approach, we put forward a logic-based expression of our conceptual view of molecular biology. Finally, we show that such biological concepts can be converted into dynamic mathematical models using a logic-based approach and thus may be useful as a framework for improving static conceptual models in biology. PMID:29267315
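The paper's extended Central Dogma network has its own species and update rules; the sketch below only illustrates the general mechanics of a synchronous Boolean network, using a made-up three-node rule set and enumerating its attractors.

```python
from itertools import product

# Toy synchronous Boolean network (DNA -> mRNA -> protein), illustrative rules
# only; the paper's extended Central Dogma network has more species and rules.
def update(state):
    dna, mrna, protein = state
    return (
        dna,     # DNA persists
        dna,     # mRNA is made while DNA is present
        mrna,    # protein is made while mRNA is present
    )

# Enumerate the state space and find attractors (cycles of the dynamics).
attractors = set()
for state in product([0, 1], repeat=3):
    seen = []
    while state not in seen:
        seen.append(state)
        state = update(state)
    cycle = seen[seen.index(state):]
    attractors.add(tuple(sorted(cycle)))

print("attractors:", attractors)
```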
Exploring Evolving Media Discourse Through Event Cueing.
Lu, Yafeng; Steptoe, Michael; Burke, Sarah; Wang, Hong; Tsai, Jiun-Yi; Davulcu, Hasan; Montgomery, Douglas; Corman, Steven R; Maciejewski, Ross
2016-01-01
Online news, microblogs and other media documents all contain valuable insight regarding events and responses to events. Underlying these documents is the concept of framing, a process in which communicators act (consciously or unconsciously) to construct a point of view that encourages facts to be interpreted by others in a particular manner. As media discourse evolves, how topics and documents are framed can undergo change, shifting the discussion to different viewpoints or rhetoric. What causes these shifts can be difficult to determine directly; however, by linking secondary datasets and enabling visual exploration, we can enhance the hypothesis generation process. In this paper, we present a visual analytics framework for event cueing using media data. As discourse develops over time, our framework applies a time series intervention model which tests to see if the level of framing is different before or after a given date. If the model indicates that the times before and after are statistically significantly different, this cues an analyst to explore related datasets to help enhance their understanding of what (if any) events may have triggered these changes in discourse. Our framework consists of entity extraction and sentiment analysis as lenses for data exploration and uses two different models for intervention analysis. To demonstrate the usage of our framework, we present a case study on exploring potential relationships between climate change framing and conflicts in Africa.
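One common form of time series intervention analysis is an interrupted-time-series regression with a step dummy at the candidate event date; the sketch below assumes that form and uses synthetic data, so it illustrates the cueing logic rather than the paper's exact model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic daily "framing level" series with a level shift after day 60,
# standing in for a framing score extracted from media documents.
t = np.arange(120)
event_day = 60
framing = 0.5 + 0.8 * (t >= event_day) + rng.normal(0, 0.3, t.size)

# Interrupted time-series regression: intercept, trend, and a step dummy
# that switches on at the candidate event date.
X = sm.add_constant(np.column_stack([t, (t >= event_day).astype(float)]))
fit = sm.OLS(framing, X).fit()

step_coef, step_p = fit.params[2], fit.pvalues[2]
print(f"level shift = {step_coef:.2f}, p = {step_p:.3g}")
if step_p < 0.05:
    print("framing differs before/after the date -> cue the analyst to linked datasets")
```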
Parameterized post-Newtonian cosmology
NASA Astrophysics Data System (ADS)
Sanghai, Viraj A. A.; Clifton, Timothy
2017-03-01
Einstein’s theory of gravity has been extensively tested on solar system scales, and for isolated astrophysical systems, using the perturbative framework known as the parameterized post-Newtonian (PPN) formalism. This framework is designed for use in the weak-field and slow-motion limit of gravity, and can be used to constrain a large class of metric theories of gravity with data collected from the aforementioned systems. Given the potential of future surveys to probe cosmological scales to high precision, it is a topic of much contemporary interest to construct a similar framework to link Einstein’s theory of gravity and its alternatives to observations on cosmological scales. Our approach to this problem is to adapt and extend the existing PPN formalism for use in cosmology. We derive a set of equations that use the same parameters to consistently model both weak fields and cosmology. This allows us to parameterize a large class of modified theories of gravity and dark energy models on cosmological scales, using just four functions of time. These four functions can be directly linked to the background expansion of the universe, first-order cosmological perturbations, and the weak-field limit of the theory. They also reduce to the standard PPN parameters on solar system scales. We illustrate how dark energy models and scalar-tensor and vector-tensor theories of gravity fit into this framework, which we refer to as ‘parameterized post-Newtonian cosmology’ (PPNC).
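For orientation, the standard weak-field PPN metric keeping only the β and γ parameters can be written as below (units with G = c = 1); the paper's cosmological construction promotes such parameters to four functions of time, which is not reproduced here.

```latex
% Weak-field PPN metric with only \beta and \gamma (G = c = 1);
% \beta = \gamma = 1 recovers general relativity.
\[
\begin{aligned}
  ds^{2} &= -\bigl(1 - 2U + 2\beta U^{2}\bigr)\,dt^{2}
            + \bigl(1 + 2\gamma U\bigr)\,\delta_{ij}\,dx^{i}\,dx^{j},\\
  U(\mathbf{x},t) &= \int \frac{\rho(\mathbf{x}',t)}{\lvert\mathbf{x}-\mathbf{x}'\rvert}\,d^{3}x' .
\end{aligned}
\]
```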
Lalys, Florent; Riffaud, Laurent; Bouget, David; Jannin, Pierre
2012-01-01
The need for a better integration of the new generation of Computer-Assisted-Surgical (CAS) systems has been recently emphasized. One necessity to achieve this objective is to retrieve data from the Operating Room (OR) with different sensors, then to derive models from these data. Recently, the use of videos from cameras in the OR has demonstrated its efficiency. In this paper, we propose a framework to assist in the development of systems for the automatic recognition of high level surgical tasks using microscope video analysis. We validated its use on cataract procedures. The idea is to combine state-of-the-art computer vision techniques with time series analysis. The first step of the framework consisted in the definition of several visual cues for extracting semantic information, therefore characterizing each frame of the video. Five different image-based classifiers were therefore implemented. A step of pupil segmentation was also applied for dedicated visual cue detection. Time series classification algorithms were then applied to model time-varying data. Dynamic Time Warping (DTW) and Hidden Markov Models (HMM) were tested. This association combined the advantages of all methods for better understanding of the problem. The framework was finally validated through various studies. Six binary visual cues were chosen along with 12 phases to detect, obtaining accuracies of 94%. PMID:22203700
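As an illustration of the time-series component, the classic dynamic-programming form of DTW between two cue sequences is sketched below; the binary cue series are invented and this is not the authors' implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two binary visual-cue time series (e.g., "pupil visible" per frame) that are
# time-warped versions of each other; the cue itself is illustrative.
reference = np.array([0, 0, 1, 1, 1, 0, 0, 1, 1, 0])
observed  = np.array([0, 1, 1, 0, 0, 0, 1, 1, 1, 0])
print("DTW distance:", dtw_distance(reference, observed))
```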
A Simple Demonstration of Concrete Structural Health Monitoring Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei
Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environment, and mechanical degradation. The proposed framework consists of four elements: damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University's Laboratory for Systems Integrity and Reliability is used to research effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation data, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.
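The Bayesian calibration step can be illustrated with a toy random-walk Metropolis sampler fitting one parameter of an assumed freeze-thaw damage law to synthetic measurements; the damage model, noise level, and data are stand-ins, not the project's models or monitoring data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy damage model: crack-depth growth with freeze-thaw cycles, d(n) = k * sqrt(n).
def damage_model(k, cycles):
    return k * np.sqrt(cycles)

# Synthetic "monitoring" data (thermography/DIC/ultrasound would supply these).
cycles = np.arange(1, 21)
true_k, noise_sd = 0.8, 0.2
obs = damage_model(true_k, cycles) + rng.normal(0, noise_sd, cycles.size)

def log_posterior(k):
    if k <= 0:                                   # flat prior on k > 0
        return -np.inf
    resid = obs - damage_model(k, cycles)
    return -0.5 * np.sum((resid / noise_sd) ** 2)

# Random-walk Metropolis sampling of the damage parameter k.
samples, k = [], 1.0
for _ in range(5000):
    prop = k + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(k):
        k = prop
    samples.append(k)

post = np.array(samples[1000:])
print(f"posterior k: mean={post.mean():.3f}, sd={post.std():.3f} (true {true_k})")
```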
Geometrothermodynamic model for the evolution of the Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruber, Christine; Quevedo, Hernando, E-mail: christine.gruber@correo.nucleares.unam.mx, E-mail: quevedo@nucleares.unam.mx
Using the formalism of geometrothermodynamics to derive a fundamental thermodynamic equation, we construct a cosmological model in the framework of relativistic cosmology. In a first step, we describe a system without thermodynamic interaction, and show it to be equivalent to the standard ΛCDM paradigm. The second step includes thermodynamic interaction and produces a model consistent with the main features of inflation. With the proposed fundamental equation we are thus able to describe all the known epochs in the evolution of our Universe, starting from the inflationary phase.
Modeling myosin VI stepping dynamics
NASA Astrophysics Data System (ADS)
Tehver, Riina
Myosin VI is a molecular motor that transports intracellular cargo as well as acts as an anchor. The motor has been measured to have unusually large step size variation and it has been reported to make both long forward and short inchworm-like forward steps, as well as step backwards. We have been developing a model that incorporates this diverse stepping behavior in a consistent framework. Our model allows us to predict the dynamics of the motor under different conditions and investigate the evolutionary advantages of the large step size variation.
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
Joint Multi-Fiber NODDI Parameter Estimation and Tractography Using the Unscented Information Filter
Reddy, Chinthala P.; Rathi, Yogesh
2016-01-01
Tracing white matter fiber bundles is an integral part of analyzing brain connectivity. An accurate estimate of the underlying tissue parameters is also paramount in several neuroscience applications. In this work, we propose to use a joint fiber model estimation and tractography algorithm that uses the NODDI (neurite orientation dispersion diffusion imaging) model to estimate fiber orientation dispersion consistently and smoothly along the fiber tracts along with estimating the intracellular and extracellular volume fractions from the diffusion signal. While the NODDI model has been used in earlier works to estimate the microstructural parameters at each voxel independently, for the first time, we propose to integrate it into a tractography framework. We extend this framework to estimate the NODDI parameters for two crossing fibers, which is imperative to trace fiber bundles through crossings as well as to estimate the microstructural parameters for each fiber bundle separately. We propose to use the unscented information filter (UIF) to accurately estimate the model parameters and perform tractography. The proposed approach has significant computational performance improvements as well as numerical robustness over the unscented Kalman filter (UKF). Our method not only estimates the confidence in the estimated parameters via the covariance matrix, but also provides the Fisher-information matrix of the state variables (model parameters), which can be quite useful to measure model complexity. Results from in-vivo human brain data sets demonstrate the ability of our algorithm to trace through crossing fiber regions, while estimating orientation dispersion and other biophysical model parameters in a consistent manner along the tracts. PMID:27147956
A log-normal distribution model for the molecular weight of aquatic fulvic acids
Cabaniss, S.E.; Zhou, Q.; Maurice, P.A.; Chin, Y.-P.; Aiken, G.R.
2000-01-01
The molecular weight of humic substances influences their proton and metal binding, organic pollutant partitioning, adsorption onto minerals and activated carbon, and behavior during water treatment. We propose a log-normal model for the molecular weight distribution in aquatic fulvic acids to provide a conceptual framework for studying these size effects. The normal curve mean and standard deviation are readily calculated from measured Mn and Mw and vary from 2.7 to 3 for the means and from 0.28 to 0.37 for the standard deviations for typical aquatic fulvic acids. The model is consistent with several types of molecular weight data, including the shapes of high-pressure size-exclusion chromatography (HP-SEC) peaks. Applications of the model to electrostatic interactions, pollutant solubilization, and adsorption are explored in illustrative calculations.
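For a log-normal number distribution, the log10 mean and standard deviation follow directly from Mn and Mw (Mn = exp(μ + σ²/2), Mw = exp(μ + 3σ²/2)); the snippet below encodes those standard relations, with example Mn and Mw values that are illustrative rather than taken from the paper.

```python
import numpy as np

def lognormal_params(Mn, Mw):
    """log10 mean and log10 standard deviation of a log-normal MW distribution.

    For a log-normal number distribution, Mn = exp(mu + s2/2) and
    Mw = exp(mu + 3*s2/2), so s2 = ln(Mw/Mn) and mu = ln(Mn) - s2/2.
    """
    s2 = np.log(Mw / Mn)
    mu = np.log(Mn) - 0.5 * s2
    return mu / np.log(10), np.sqrt(s2) / np.log(10)

# Illustrative values typical of aquatic fulvic acids (not taken from the paper).
mean_log10, sd_log10 = lognormal_params(Mn=620.0, Mw=900.0)
print(f"log10 mean = {mean_log10:.2f}, log10 sd = {sd_log10:.2f}")
```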
NASA Astrophysics Data System (ADS)
Berloff, P. S.
2016-12-01
This work aims at developing a framework for dynamically consistent parameterization of mesoscale eddy effects for use in non-eddy-resolving ocean circulation models. The proposed eddy parameterization framework is successfully tested on the classical, wind-driven double-gyre model, which is solved both with explicitly resolved vigorous eddy field and in the non-eddy-resolving configuration with the eddy parameterization replacing the eddy effects. The parameterization focuses on the effect of the stochastic part of the eddy forcing that backscatters and induces eastward jet extension of the western boundary currents and its adjacent recirculation zones. The parameterization locally approximates transient eddy flux divergence by spatially localized and temporally periodic forcing, referred to as the plunger, and focuses on the linear-dynamics flow solution induced by it. The nonlinear self-interaction of this solution, referred to as the footprint, characterizes and quantifies the induced eddy forcing exerted on the large-scale flow. We find that spatial pattern and amplitude of each footprint strongly depend on the underlying large-scale flow, and the corresponding relationships provide the basis for the eddy parameterization and its closure on the large-scale flow properties. Dependencies of the footprints on other important parameters of the problem are also systematically analyzed. The parameterization utilizes the local large-scale flow information, constructs and scales the corresponding footprints, and then sums them up over the gyres to produce the resulting eddy forcing field, which is interactively added to the model as an extra forcing. Thus, the assumed ensemble of plunger solutions can be viewed as a simple model for the cumulative effect of the stochastic eddy forcing. The parameterization framework is implemented in the simplest way, but it provides a systematic strategy for improving the implementation algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
NASA Astrophysics Data System (ADS)
Papagiannopoulou, Christina; Decubber, Stijn; Miralles, Diego; Demuzere, Matthias; Dorigo, Wouter; Verhoest, Niko; Waegeman, Willem
2017-04-01
Satellite data provide an abundance of information about crucial climatic and environmental variables. These data - consisting of global records, spanning up to 35 years and having the form of multivariate time series with different spatial and temporal resolutions - enable the study of key climate-vegetation interactions. Although methods which are based on correlations and linear models are typically used for this purpose, their assumptions for linearity about the climate-vegetation relationships are too simplistic. Therefore, we adopt a recently proposed non-linear Granger causality analysis [1], in which we incorporate spatial information, concatenating data from neighboring pixels and training a joint model on the combined data. Experimental results based on global data sets show that considering non-linear relationships leads to a higher explained variance of past vegetation dynamics, compared to simple linear models. Our approach consists of several steps. First, we compile an extensive database [1], which includes multiple data sets for land surface temperature, near-surface air temperature, surface radiation, precipitation, snow water equivalents and surface soil moisture. Based on this database, high-level features are constructed and considered as predictors in our machine-learning framework. These high-level features include (de-trended) seasonal anomalies, lagged variables, past cumulative variables, and extreme indices, all calculated based on the raw climatic data. Second, we apply a spatiotemporal non-linear Granger causality framework - in which the linear predictive model is substituted for a non-linear machine learning algorithm - in order to assess which of these predictor variables Granger-cause vegetation dynamics at each 1° pixel. We use the de-trended anomalies of Normalized Difference Vegetation Index (NDVI) to characterize vegetation, being the target variable of our framework. Experimental results indicate that climate strongly (Granger-)causes vegetation dynamics in most regions globally. More specifically, water availability is the most dominant vegetation driver, being the dominant vegetation driver in 54% of the vegetated surface. Furthermore, our results show that precipitation and soil moisture have prolonged impacts on vegetation in semiarid regions, with up to 10% of additional explained variance on the vegetation dynamics occurring three months later. Finally, hydro-climatic extremes seem to have a remarkable impact on vegetation, since they also explain up to 10% of additional variance of vegetation in certain regions despite their infrequent occurrence. References [1] Papagiannopoulou, C., Miralles, D. G., Verhoest, N. E. C., Dorigo, W. A., and Waegeman, W.: A non-linear Granger causality framework to investigate climate-vegetation dynamics, Geosci. Model Dev. Discuss., doi:10.5194/gmd-2016-266, in review, 2016.
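The core of the non-linear Granger test is comparing the explained variance of a baseline model (lagged target only) against a full model that adds lagged climate predictors, with a machine-learning regressor in place of the linear model; the sketch below assumes a random forest and synthetic series, not the authors' database or exact procedure.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic monthly series: NDVI anomalies driven non-linearly by lagged
# water availability, purely illustrative of the framework's logic.
n = 600
water = rng.normal(size=n)
ndvi = np.tanh(1.5 * np.roll(water, 2)) + 0.3 * rng.normal(size=n)

def lagged(x, lags):
    return np.column_stack([np.roll(x, k) for k in lags])

lags = [1, 2, 3]
X_base = lagged(ndvi, lags)[max(lags):]                       # lagged target only
X_full = np.column_stack([X_base, lagged(water, lags)[max(lags):]])
y = ndvi[max(lags):]

def explained_variance(X):
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

gain = explained_variance(X_full) - explained_variance(X_base)
print(f"extra explained variance from climate predictors: {gain:.2f}")
print("non-linear Granger causality suggested" if gain > 0 else "no evidence")
```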
NASA Astrophysics Data System (ADS)
Marketin, T.; Huther, L.; Martínez-Pinedo, G.
2016-02-01
Background: r -process nucleosynthesis models rely, by necessity, on nuclear structure models for input. Particularly important are β -decay half-lives of neutron-rich nuclei. At present only a single systematic calculation exists that provides values for all relevant nuclei making it difficult to test the sensitivity of nucleosynthesis models to this input. Additionally, even though there are indications that their contribution may be significant, the impact of first-forbidden transitions on decay rates has not been systematically studied within a consistent model. Purpose: Our goal is to provide a table of β -decay half-lives and β -delayed neutron emission probabilities, including first-forbidden transitions, calculated within a fully self-consistent microscopic theoretical framework. The results are used in an r -process nucleosynthesis calculation to asses the sensitivity of heavy element nucleosynthesis to weak interaction reaction rates. Method: We use a fully self-consistent covariant density functional theory (CDFT) framework. The ground state of all nuclei is calculated with the relativistic Hartree-Bogoliubov (RHB) model, and excited states are obtained within the proton-neutron relativistic quasiparticle random phase approximation (p n -RQRPA). Results: The β -decay half-lives, β -delayed neutron emission probabilities, and the average number of emitted neutrons have been calculated for 5409 nuclei in the neutron-rich region of the nuclear chart. We observe a significant contribution of the first-forbidden transitions to the total decay rate in nuclei far from the valley of stability. The experimental half-lives are in general well reproduced for even-even, odd-A , and odd-odd nuclei, in particular for short-lived nuclei. The resulting data table is included with the article as Supplemental Material. Conclusions: In certain regions of the nuclear chart, first-forbidden transitions constitute a large fraction of the total decay rate and must be taken into account consistently in modern evaluations of half-lives. Both the β -decay half-lives and β -delayed neutron emission probabilities have a noticeable impact on the results of heavy element nucleosynthesis models.
NASA Astrophysics Data System (ADS)
Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi
2015-07-01
In this study, we reported the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures to treat chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute (solute water and formaldehyde) systems and a simple SN2 reaction (Cl- + CH3Cl → ClCH3 + Cl-) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreements with those of 3D-RISM-SCF.
Description of transitional nuclei in the sdg boson model
NASA Astrophysics Data System (ADS)
Lac, V.-S.; Kuyucak, S.
1992-03-01
We study the transitional nuclei in the framework of the sdg boson model. This extension is necessitated by recent measurements of E2 and E4 transitions in the Pt and Os isotopes which cannot be explained in the sd boson models. We show how γ-unstable and triaxial shapes arise from special choices of sdg model Hamiltonians and discuss ways of limiting the number of free parameters through consistency and coherence conditions. A satisfactory description of E2 and E4 properties is obtained for the Pt and Os nuclei, which also predicts dynamic shape transitions in these nuclei.
Bretscher, P A
2014-01-01
The establishment of central tolerance to most self-antigens results in a repertoire of mature peripheral lymphocytes specific for foreign and peripheral self-antigens. The framework that single, mature lymphocytes are inactivated by antigen, whereas their activation requires lymphocyte cooperation, accounts for diverse observations and incorporates a mechanism of peripheral tolerance. This framework accounts for the generalizations that the sustained activation by antigen of most B cells and CD8 T cells requires CD4 T helper cells; in the absence of CD4 T cells, antigen can inactivate these B cells and CD8 T cells. In this sense, CD4 T cells are the guardians of the fate of most B cells and CD8 T cells when they encounter antigen. I argue here that the single-lymphocyte/multiple-lymphocyte framework for the inactivation/activation of lymphocytes also applies to CD4 T cells. I consider within this framework a model for the activation/inactivation of CD4 T cells that is consistent with the large majority of contemporary observations, including significant clinical observations. I outline the grounds why I feel this model is more plausible than the contemporary and predominant pathogen-associated molecular pattern (PAMP) and Danger Models for CD4 T cell activation. These models are based upon what I consider the radical premise that self–nonself discrimination does not exist at the level of mature CD4 T cells. I explain why I feel this feature renders the PAMP and Danger Models somewhat implausible. The model I propose, in contrast, is conservative in that it embodies such a process of self–nonself discrimination. PMID:24684567
Van Dijk-de Vries, Anneke N; Duimel-Peeters, Inge G P; Muris, Jean W; Wesseling, Geertjan J; Beusmans, George H M I; Vrijhoef, Hubertus J M
2016-04-08
Teamwork between healthcare providers is a precondition for the delivery of integrated care. This study aimed to assess the usefulness of the conceptual framework Integrated Team Effectiveness Model for the development and testing of the Integrated Team Effectiveness Instrument. Focus groups with healthcare providers in an integrated care setting for people with chronic obstructive pulmonary disease (COPD) were conducted to examine the recognisability of the conceptual framework and to explore critical success factors for collaborative COPD practice out of this framework. The resulting items were transposed into a pilot instrument. This was reviewed by expert opinion and completed 153 times by healthcare providers. The underlying structure and internal consistency of the instrument were verified by factor analysis and Cronbach's alpha. The conceptual framework turned out to be comprehensible for discussing teamwork effectiveness. The pilot instrument measures 25 relevant aspects of teamwork in integrated COPD care. Factor analysis suggested three reliable components: teamwork effectiveness, team processes and team psychosocial traits (Cronbach's alpha between 0.76 and 0.81). The conceptual framework Integrated Team Effectiveness Model is relevant in developing a practical full-spectrum instrument to facilitate discussing teamwork effectiveness. The Integrated Team Effectiveness Instrument provides a well-founded basis to self-evaluate teamwork effectiveness in integrated COPD care by healthcare providers. Recommendations are provided for the improvement of the instrument.
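Cronbach's alpha for a sub-scale can be computed with the standard formula, sketched below on synthetic Likert responses; the item counts and data are illustrative, not the instrument's actual responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(4)
# Synthetic Likert responses for one sub-scale (e.g., "team processes"):
# a shared latent trait plus item noise, discretised to 1-5.
latent = rng.normal(size=(153, 1))
responses = np.clip(np.round(3 + latent + 0.8 * rng.normal(size=(153, 8))), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```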
The Hartree-Fock calculation of the magnetic properties of molecular solutes
NASA Astrophysics Data System (ADS)
Cammi, R.
1998-08-01
In this paper we set the formal bases for the calculation of the magnetic susceptibility and of the nuclear magnetic shielding tensors for molecular solutes described within the framework of the polarizable continuum model (PCM). The theory has been developed at the self-consistent field (SCF) level and adapted to be used within the framework of some of the computational procedures of larger use, i.e., the gauge invariant atomic orbital method (GIAO) and the continuous set gauge transformation method (CSGT). The numerical results relative to the magnetizabilities and chemical shielding of acetonitrile and nitromethane in various solvents computed with the PCM-CSGT method are also presented.
Tripathy, Shreepada; Miller, Karen H; Berkenbosch, John W; McKinley, Tara F; Boland, Kimberly A; Brown, Seth A; Calhoun, Aaron W
2016-06-01
Controversy exists in the simulation community as to the emotional and educational ramifications of mannequin death due to learner action or inaction. No theoretical framework to guide future investigations of learner actions currently exists. The purpose of our study was to generate a model of the learner experience of mannequin death using a mixed methods approach. The study consisted of an initial focus group phase composed of 11 learners who had previously experienced mannequin death due to action or inaction on the part of learners as defined by Leighton (Clin Simul Nurs. 2009;5(2):e59-e62). Transcripts were analyzed using grounded theory to generate a list of relevant themes that were further organized into a theoretical framework. With the use of this framework, a survey was generated and distributed to additional learners who had experienced mannequin death due to action or inaction. Results were analyzed using a mixed methods approach. Forty-one clinicians completed the survey. A correlation was found between the emotional experience of mannequin death and degree of presession anxiety (P < 0.001). Debriefing was found to significantly reduce negative emotion and enhance satisfaction. Sixty-nine percent of respondents indicated that mannequin death enhanced learning. These results were used to modify our framework. Using the previous approach, we created a model of the effect of mannequin death on the educational and psychological state of learners. We offer the final model as a guide to future research regarding the learner experience of mannequin death.
NASA Astrophysics Data System (ADS)
Latypov, Marat I.; Kalidindi, Surya R.
2017-10-01
There is a critical need for the development and verification of practically useful multiscale modeling strategies for simulating the mechanical response of multiphase metallic materials with heterogeneous microstructures. In this contribution, we present data-driven reduced order models for effective yield strength and strain partitioning in such microstructures. These models are built employing the recently developed framework of Materials Knowledge Systems that employ 2-point spatial correlations (or 2-point statistics) for the quantification of the heterostructures and principal component analyses for their low-dimensional representation. The models are calibrated to a large collection of finite element (FE) results obtained for a diverse range of microstructures with various sizes, shapes, and volume fractions of the phases. The performance of the models is evaluated by comparing the predictions of yield strength and strain partitioning in two-phase materials with the corresponding predictions from a classical self-consistent model as well as results of full-field FE simulations. The reduced-order models developed in this work show an excellent combination of accuracy and computational efficiency, and therefore present an important advance towards computationally efficient microstructure-sensitive multiscale modeling frameworks.
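The MKS-style workflow can be sketched as: 2-point autocorrelations via FFT, principal component analysis for a low-dimensional representation, and a simple regression to the effective property; the microstructures, property values, and linear link below are synthetic assumptions, not the calibrated models of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)

def two_point_autocorr(m):
    """Periodic 2-point autocorrelation of a binary phase indicator via FFT."""
    F = np.fft.fftn(m)
    return np.real(np.fft.ifftn(F * np.conj(F))) / m.size

# Synthetic two-phase microstructures (32x32) with varying volume fraction,
# and a made-up "effective yield strength" mixing the two phase strengths.
n_samples, L = 200, 32
stats, strength = [], []
for _ in range(n_samples):
    vf = rng.uniform(0.2, 0.8)
    m = (rng.random((L, L)) < vf).astype(float)
    stats.append(two_point_autocorr(m).ravel())
    strength.append(200 * (1 - m.mean()) + 600 * m.mean() + rng.normal(0, 5))

# Low-dimensional representation of the statistics, then a reduced-order model.
scores = PCA(n_components=5).fit_transform(np.array(stats))
model = LinearRegression().fit(scores[:150], strength[:150])
print("held-out R^2:", model.score(scores[150:], strength[150:]))
```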
Li, Bin; Shin, Hyunjin; Gulbekyan, Georgy; Pustovalova, Olga; Nikolsky, Yuri; Hope, Andrew; Bessarabova, Marina; Schu, Matthew; Kolpakova-Hart, Elona; Merberg, David; Dorner, Andrew; Trepicchio, William L.
2015-01-01
Development of drug responsive biomarkers from pre-clinical data is a critical step in drug discovery, as it enables patient stratification in clinical trial design. Such translational biomarkers can be validated in early clinical trial phases and utilized as a patient inclusion parameter in later stage trials. Here we present a study on building accurate and selective drug sensitivity models for Erlotinib or Sorafenib from pre-clinical in vitro data, followed by validation of individual models on corresponding treatment arms from patient data generated in the BATTLE clinical trial. A Partial Least Squares Regression (PLSR) based modeling framework was designed and implemented, using a special splitting strategy and canonical pathways to capture robust information for model building. Erlotinib and Sorafenib predictive models could be used to identify a sub-group of patients that respond better to the corresponding treatment, and these models are specific to the corresponding drugs. The model derived signature genes reflect each drug’s known mechanism of action. Also, the models predict each drug’s potential cancer indications consistent with clinical trial results from a selection of globally normalized GEO expression datasets. PMID:26107615
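A minimal PLSR sensitivity model of the kind described can be sketched with scikit-learn; the expression features, response values, and responder threshold below are synthetic assumptions, not the BATTLE data or the authors' splitting strategy.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)

# Synthetic pre-clinical data: pathway-level expression features for cell lines
# and a continuous drug-sensitivity readout (e.g., -log IC50), illustrative only.
n_lines, n_pathways = 120, 300
X = rng.normal(size=(n_lines, n_pathways))
signature = rng.normal(size=10)
y = X[:, :10] @ signature + rng.normal(0, 0.5, n_lines)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("held-out R^2:", pls.score(X_te, y_te))

# Responders could then be called by thresholding the predicted sensitivity.
predicted = pls.predict(X_te).ravel()
print("predicted responders:", int((predicted > np.median(predicted)).sum()))
```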
Chan, Connie V.; Kaufman, David R.
2009-01-01
Health information technologies (HIT) have great potential to advance health care globally. In particular, HIT can provide innovative approaches and methodologies to overcome the range of access and resource barriers specific to developing countries. However, there is a paucity of models and empirical evidence informing the technology selection process in these settings. We propose a framework for selecting patient-oriented technologies in developing countries. The selection guidance process is structured by a set of filters that impose particular constraints and serve to narrow the space of possible decisions. The framework consists of three levels of factors: 1) situational factors, 2) the technology and its relationship with health interventions and with target patients, and 3) empirical evidence. We demonstrate the utility of the framework in the context of mobile phones for behavioral health interventions to reduce risk factors for cardiovascular disease. This framework can be applied to health interventions across health domains to explore how and whether available technologies can support delivery of the associated types of interventions and with the target populations. PMID:19796709
Moore, William B; Webb, A Alexander G
2013-09-26
The heat transport and lithospheric dynamics of early Earth are currently explained by plate tectonic and vertical tectonic models, but these do not offer a global synthesis consistent with the geologic record. Here we use numerical simulations and comparison with the geologic record to explore a heat-pipe model in which volcanism dominates surface heat transport. These simulations indicate that a cold and thick lithosphere developed as a result of frequent volcanic eruptions that advected surface materials downwards. Declining heat sources over time led to an abrupt transition to plate tectonics. Consistent with model predictions, the geologic record shows rapid volcanic resurfacing, contractional deformation, a low geothermal gradient across the bulk of the lithosphere and a rapid decrease in heat-pipe volcanism after initiation of plate tectonics. The heat-pipe Earth model therefore offers a coherent geodynamic framework in which to explore the evolution of our planet before the onset of plate tectonics.
Software systems for modeling articulated figures
NASA Technical Reports Server (NTRS)
Phillips, Cary B.
1989-01-01
Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet ensure that future tools have a consistent and friendly user interface. Jack is a system that provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.
A coupled modeling framework for sustainable watershed management in transboundary river basins
NASA Astrophysics Data System (ADS)
Furqan Khan, Hassaan; Yang, Y. C. Ethan; Xie, Hua; Ringler, Claudia
2017-12-01
There is a growing recognition among water resource managers that sustainable watershed management needs to not only account for the diverse ways humans benefit from the environment, but also incorporate the impact of human actions on the natural system. Coupled natural-human system modeling through explicit modeling of both natural and human behavior can help reveal the reciprocal interactions and co-evolution of the natural and human systems. This study develops a spatially scalable, generalized agent-based modeling (ABM) framework consisting of a process-based semi-distributed hydrologic model (SWAT) and a decentralized water system model to simulate the impacts of water resource management decisions that affect the food-water-energy-environment (FWEE) nexus at a watershed scale. Agents within a river basin are geographically delineated based on both political and watershed boundaries and represent key stakeholders of ecosystem services. Agents decide about the priority across three primary water uses: food production, hydropower generation and ecosystem health within their geographical domains. Agents interact with the environment (streamflow) through the SWAT model and interact with other agents through a parameter representing willingness to cooperate. The innovative two-way coupling between the water system model and SWAT enables this framework to fully explore the feedback of human decisions on the environmental dynamics and vice versa. To support non-technical stakeholder interactions, a web-based user interface has been developed that allows for role-play and participatory modeling. The generalized ABM framework is also tested in two key transboundary river basins, the Mekong River basin in Southeast Asia and the Niger River basin in West Africa, where water uses for ecosystem health compete with growing human demands on food and energy resources. We present modeling results for crop production, energy generation and violation of eco-hydrological indicators at both the agent and basin-wide levels to shed light on holistic FWEE management policies in these two basins.
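As a toy sketch of the agent logic described above (the class, attribute names, and numbers are hypothetical; the actual framework couples agents to SWAT-simulated streamflow rather than to the scalar stub used here):

from dataclasses import dataclass

@dataclass
class BasinAgent:
    # priority weights over food production, hydropower, and ecosystem health
    w_food: float
    w_hydro: float
    w_eco: float
    cooperation: float  # 0 = purely self-interested, 1 = fully accommodates downstream demand

    def request_withdrawal(self, streamflow: float, downstream_demand: float) -> float:
        # Withdrawal is driven by food and hydropower priorities, tempered by the
        # ecosystem-health priority and by willingness to cooperate with downstream agents.
        own_share = (self.w_food + self.w_hydro) * (1.0 - self.w_eco)
        reserved = self.cooperation * downstream_demand
        return max(0.0, min(own_share * streamflow, streamflow - reserved))

agent = BasinAgent(w_food=0.5, w_hydro=0.3, w_eco=0.2, cooperation=0.6)
print(agent.request_withdrawal(streamflow=100.0, downstream_demand=40.0))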
Brown, Ramsay A; Swanson, Larry W
2013-09-01
Systematic description and the unambiguous communication of findings and models remain among the unresolved fundamental challenges in systems neuroscience. No common descriptive frameworks exist to describe systematically the connective architecture of the nervous system, even at the grossest level of observation. Furthermore, the accelerating volume of novel data generated on neural connectivity outpaces the rate at which this data is curated into neuroinformatics databases to synthesize digitally systems-level insights from disjointed reports and observations. To help address these challenges, we propose the Neural Systems Language (NSyL). NSyL is a modeling language to be used by investigators to encode and communicate systematically reports of neural connectivity from neuroanatomy and brain imaging. NSyL engenders systematic description and communication of connectivity irrespective of the animal taxon described, experimental or observational technique implemented, or nomenclature referenced. As a language, NSyL is internally consistent, concise, and comprehensible to both humans and computers. NSyL is a promising development for systematizing the representation of neural architecture, effectively managing the increasing volume of data on neural connectivity and streamlining systems neuroscience research. Here we present similar precedent systems, how NSyL extends existing frameworks, and the reasoning behind NSyL's development. We explore NSyL's potential for balancing robustness and consistency in representation by encoding previously reported assertions of connectivity from the literature as examples. Finally, we propose and discuss the implications of a framework for how NSyL will be digitally implemented in the future to streamline curation of experimental results and bridge the gaps among anatomists, imagers, and neuroinformatics databases. Copyright © 2013 Wiley Periodicals, Inc.
A mathematical framework for modelling cambial surface evolution using a level set method
Sellier, Damien; Plank, Michael J.; Harrington, Jonathan J.
2011-01-01
Background and Aims During their lifetime, tree stems take a series of successive nested shapes. Individual tree growth models traditionally focus on apical growth and architecture. However, cambial growth, which is distributed over a surface layer wrapping the whole organism, equally contributes to plant form and function. This study aims at providing a framework to simulate how organism shape evolves as a result of a secondary growth process that occurs at the cellular scale. Methods The development of the vascular cambium is modelled as an expanding surface using the level set method. The surface consists of multiple compartments following distinct expansion rules. Growth behaviour can be formulated as a mathematical function of surface state variables and independent variables to describe biological processes. Key Results The model was coupled to an architectural model and to a forest stand model to simulate cambium dynamics and wood formation at the scale of the organism. The model is able to simulate competition between cambia, surface irregularities and local features. Predicting the shapes associated with arbitrarily complex growth functions does not add complexity to the numerical method itself. Conclusions Despite their slenderness, it is sometimes useful to conceive of trees as expanding surfaces. The proposed mathematical framework provides a way to integrate through time and space the biological and physical mechanisms underlying cambium activity. It can be used either to test growth hypotheses or to generate detailed maps of wood internal structure. PMID:21470972
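For reference, the level-set formulation such a model rests on (written in generic notation, not the paper's) evolves the cambial surface as the zero level set of a function \phi moving with growth speed F along the surface normal:

\frac{\partial \phi}{\partial t} + F \, |\nabla \phi| = 0, \qquad \Gamma(t) = \{\mathbf{x} : \phi(\mathbf{x}, t) = 0\}

Because F may be any function of surface state variables and independent variables, arbitrarily complex growth rules can be imposed without changing the numerical method, which is the property the abstract highlights.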
On the representability problem and the physical meaning of coarse-grained models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.
2016-07-28
In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on CG coordinates. Often, in these cases it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and underlying FG models. Here we present and investigate these representability challenges and illustrate them via the bottom-up conceptual framework for several simple analytically tractable polymer models. The examples provide special focus on the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, as well as discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons apply to top-down CG modeling also, with crucial implications for simulation at constant pressure and surface tension and for the interpretations of structural and thermodynamic correlations for comparison to experiment.
Quark-lepton flavor democracy and the nonexistence of the fourth generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cvetic, G.; Kim, C.S.
1995-01-01
In the standard model with two Higgs doublets (type II), which, unlike the minimal standard model, shows a consistent trend toward a flavor gauge theory and its related flavor democracy in the quark and leptonic sectors as the energy of the probes increases, we impose mixed quark-lepton flavor democracy at a high "transition" energy and assume the usual seesaw mechanism, and consequently find that the existence of a fourth generation of fermions in this framework is practically ruled out.
Margaret Carreiro; Wayne Zipperer
2011-01-01
The responses of urban park woodlands to large disturbances provide the opportunity to identify and examine linkages in social-ecological systems in urban landscapes. We propose that the Panarchy model consisting of hierarchically nested adaptive cycles provides a useful framework to evaluate those linkages. We use two case studies as examples: Cherokee Park in...
ERIC Educational Resources Information Center
Rama, D. V., Ed.
This volume is part of a series of 18 monographs on service learning and the academic disciplines. It is designed to (1) develop a theoretical framework for service learning in accounting consistent with the goals identified by accounting educators and the recent efforts toward curriculum reform, and (2) describe specific active learning…
2006-01-01
enabling technologies such as built-in test, advanced health monitoring algorithms, reliability and component aging models, prognostics methods, and... deployment and acceptance. This framework and vision are consistent with onboard PHM (Prognostic and Health Management) as well as advanced... monitored. In addition to the prognostic forecasting capabilities provided by monitoring system power, multiple confounding errors by electronic
ERIC Educational Resources Information Center
Peltier, Corey; Vannest, Kimberly J.
2018-01-01
Mr. Buxton is a perplexed elementary mathematics teacher. He co-teaches a second-grade classroom, with Ms. Snyder. In their classroom they have 25 students; five are identified as academically at risk, and three receive special education services. In the past Mr. Buxton successfully used an instructional approach consisting of (a) modeling, (b)…
ERIC Educational Resources Information Center
Zechner, Klaus; Chen, Lei; Davis, Larry; Evanini, Keelan; Lee, Chong Min; Leong, Chee Wee; Wang, Xinhao; Yoon, Su-Youn
2015-01-01
This research report presents a summary of research and development efforts devoted to creating scoring models for automatically scoring spoken item responses of a pilot administration of the Test of English-for-Teaching (TEFT™) within the ELTeach™ framework. The test consists of items for all four language modalities:…
David N. Wear; Robert Huggett
2011-01-01
This chapter describes how forest type and age distributions might be expected to change in the Appalachian-Cumberland portions of the Central Hardwood Region over the next 50 years. Forecasting forest conditions requires accounting for a number of biophysical and socioeconomic dynamics within an internally consistent modeling framework. We used the US Forest...
Development and Validation of the Meaning of Work Inventory among French Workers
ERIC Educational Resources Information Center
Arnoux-Nicolas, Caroline; Sovet, Laurent; Lhotellier, Lin; Bernaud, Jean-Luc
2017-01-01
The purpose of this study was to validate a psychometric instrument among French workers for assessing the meaning of work. Following an empirical framework, a two-step procedure consisted of exploring and then validating the scale among distinctive samples. The consequent Meaning of Work Inventory is a 15-item scale based on a four-factor model,…
A model to evaluate quality and effectiveness of disease management.
Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R
2008-12-01
Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.
OpenDrift - an open source framework for ocean trajectory modeling
NASA Astrophysics Data System (ADS)
Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn
2016-04-01
We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at Norwegian Meteorological Institute in cooperation with Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotation and scaling of vectors and model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search and rescue objects.
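A minimal usage sketch of the kind of workflow OpenDrift supports (based on the project's public examples; module and method names may differ between versions, and the forcing file name is a placeholder):

from datetime import datetime, timedelta
from opendrift.readers import reader_netCDF_CF_generic
from opendrift.models.oceandrift import OceanDrift

o = OceanDrift(loglevel=20)                                      # basic passive-drift model
reader = reader_netCDF_CF_generic.Reader('ocean_forcing.nc')     # placeholder CF-compliant current/wind file
o.add_reader(reader)                                             # a "reader module" supplying forcing fields
o.seed_elements(lon=4.5, lat=60.0, radius=1000, number=500, time=datetime(2016, 4, 1))
o.run(duration=timedelta(hours=48), time_step=900)
o.plot()                                                         # quick-look trajectory map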
Active and Passive 3D Vector Radiative Transfer with Preferentially-Aligned Ice Particles
NASA Astrophysics Data System (ADS)
Adams, I. S.; Munchak, S. J.; Pelissier, C.; Kuo, K. S.; Heymsfield, G. M.
2017-12-01
To support the observation of clouds and precipitation using combinations of radars and radiometers, a forward model capable of representing diverse sensing geometries for active and passive instruments is necessary for correctly interpreting and consistently combining multi-sensor measurements from ground-based, airborne, and spaceborne platforms. As such, the Atmospheric Radiative Transfer Simulator (ARTS) uses Monte Carlo integration to produce radar reflectivities and radiometric brightness temperatures for three-dimensional cloud and precipitation input fields. This radiative transfer framework is capable of efficiently sampling Gaussian antenna beams and fully accounting for multiple scattering. By relying on common ray-tracing tools, gaseous absorption models, and scattering properties, the model reproduces accurate and consistent radar and radiometer observables. While such a framework is an important component for simulating remote sensing observables, the key driver for self-consistent radiative transfer calculations of clouds and precipitation is scattering data. Research over the past decade has demonstrated that spheroidal models of frozen hydrometeors cannot accurately reproduce all necessary scattering properties at all desired frequencies. The discrete dipole approximation offers flexibility in calculating scattering for arbitrary particle geometries, but at great computational expense. When considering scattering for certain pristine ice particles, the Extended Boundary Condition Method, or T-Matrix, is much more computationally efficient; however, convergence for T-Matrix calculations fails at large size parameters and high aspect ratios. To address these deficiencies, we implemented the Invariant Imbedding T-Matrix Method (IITM). A brief overview of ARTS and IITM will be given, including details for handling preferentially-aligned hydrometeors. Examples highlighting the performance of the model for simulating space-based and airborne measurements will be offered, and some case studies showing the response to particle type and orientation will be presented. Simulations of polarized radar (Z, LDR, ZDR) and radiometer (Stokes I and Q) quantities will be used to demonstrate the capabilities of the model.
Retinal artery-vein classification via topology estimation
Estrada, Rolando; Allingham, Michael J.; Mettu, Priyatham S.; Cousins, Scott W.; Tomasi, Carlo; Farsiu, Sina
2015-01-01
We propose a novel, graph-theoretic framework for distinguishing arteries from veins in a fundus image. We make use of the underlying vessel topology to better classify small and midsized vessels. We extend our previously proposed tree topology estimation framework by incorporating expert, domain-specific features to construct a simple, yet powerful global likelihood model. We efficiently maximize this model by iteratively exploring the space of possible solutions consistent with the projected vessels. We tested our method on four retinal datasets and achieved classification accuracies of 91.0%, 93.5%, 91.7%, and 90.9%, outperforming existing methods. Our results show the effectiveness of our approach, which is capable of analyzing the entire vasculature, including peripheral vessels, in wide field-of-view fundus photographs. This topology-based method is a potentially important tool for diagnosing diseases with retinal vascular manifestation. PMID:26068204
Theory of the Origin, Evolution, and Nature of Life
Andrulis, Erik D.
2011-01-01
Life is an inordinately complex unsolved puzzle. Despite significant theoretical progress, experimental anomalies, paradoxes, and enigmas have revealed paradigmatic limitations. Thus, the advancement of scientific understanding requires new models that resolve fundamental problems. Here, I present a theoretical framework that economically fits evidence accumulated from examinations of life. This theory is based upon a straightforward and non-mathematical core model and proposes unique yet empirically consistent explanations for major phenomena including, but not limited to, quantum gravity, phase transitions of water, why living systems are predominantly CHNOPS (carbon, hydrogen, nitrogen, oxygen, phosphorus, and sulfur), homochirality of sugars and amino acids, homeoviscous adaptation, triplet code, and DNA mutations. The theoretical framework unifies the macrocosmic and microcosmic realms, validates predicted laws of nature, and solves the puzzle of the origin and evolution of cellular life in the universe. PMID:25382118
A study of microindentation hardness tests by mechanism-based strain gradient plasticity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Y.; Xue, Z.; Gao, H.
2000-08-01
We recently proposed a theory of mechanism-based strain gradient (MSG) plasticity to account for the size dependence of plastic deformation at micron- and submicron-length scales. The MSG plasticity theory connects micron-scale plasticity to dislocation theories via a multiscale, hierarchical framework linking Taylor's dislocation hardening model to strain gradient plasticity. Here we show that the theory of MSG plasticity, when used to study micro-indentation, indeed reproduces the linear dependence observed in experiments, thus providing an important self-consistent check of the theory. The effects of pileup, sink-in, and the radius of the indenter tip have been taken into account in the indentation model. In accomplishing this objective, we have generalized the MSG plasticity theory to include the elastic deformation in the hierarchical framework. (c) 2000 Materials Research Society.
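For orientation, the Taylor dislocation hardening relation that MSG plasticity builds on can be written in standard (not paper-specific) notation as

\tau = \alpha \mu b \sqrt{\rho}, \qquad \rho = \rho_S + \rho_G, \qquad \rho_G \propto \frac{\eta}{b}

where \tau is the shear flow stress, \alpha a dimensionless coefficient of order 0.3, \mu the shear modulus, b the Burgers vector, \rho_S the statistically stored dislocation density, and \rho_G the geometrically necessary dislocation density set by the effective plastic strain gradient \eta. It is the \rho_G term that introduces the micron-scale size dependence probed by indentation.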
3D numerical simulations of oblique droplet impact onto a deep liquid pool
NASA Astrophysics Data System (ADS)
Gelderblom, Hanneke; Reijers, Sten A.; Gielen, Marise; Sleutel, Pascal; Lohse, Detlef; Xie, Zhihua; Pain, Christopher C.; Matar, Omar K.
2017-11-01
We study the fluid dynamics of three-dimensional oblique droplet impact, which results in phenomena that include splashing and cavity formation. An adaptive, unstructured mesh modelling framework is employed here, which can modify and adapt unstructured meshes to better represent the underlying physics of droplet dynamics, and reduce computational effort without sacrificing accuracy. The numerical framework consists of a mixed control-volume and finite-element formulation, a volume-of-fluid-type method for the interface-capturing based on a compressive control-volume advection method. The framework also features second-order finite-element methods, and a force-balanced algorithm for the surface tension implementation, minimising the spurious velocities often found in many simulations involving capillary-driven flows. The numerical results generated using this framework are compared with high-speed images of the interfacial shapes of the deformed droplet, and the cavity formed upon impact, yielding good agreement. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
[Research on tumor information grid framework].
Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing
2013-10-01
In order to realize tumor disease information sharing and unified management, we used grid technology to effectively integrate the data and software resources distributed across various medical institutions, making the heterogeneous resources consistent and interoperable in both semantic and syntactic terms. This article describes the tumor grid framework, in which service types are described in the Web Service Description Language (WSDL) and XML Schema Definition (XSD), and clients use the serialized documents to operate on the distributed resources. Service objects can be modeled in the Unified Modeling Language (UML) as middleware to create application programming interfaces. All grid resources are registered in the index and released as Web Services based on the Web Services Resource Framework (WSRF). Using the system, a multi-center, large-sample, networked tumor disease resource-sharing framework can be built to improve both the research capacity of medical institutions and patients' quality of life.
Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.
Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan
2012-01-01
The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps for integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.
On the thermodynamic framework of generalized coupled thermoelastic-viscoplastic-damage modeling
NASA Technical Reports Server (NTRS)
Arnold, S. M.; Saleeb, A. F.
1991-01-01
A complete potential-based framework using internal state variables is put forth for the derivation of reversible and irreversible constitutive equations. In this framework, the existence of the total (integrated) form of either the (Helmholtz) free energy or the (Gibbs) complementary free energy is assumed a priori. Two options for describing the flow and evolutionary equations are described, wherein option one (the fully coupled form) is shown to be overly restrictive while the second option (the decoupled form) provides significant flexibility. As a consequence of the decoupled form, a new operator, i.e., the Compliance operator, is defined which provides a link between the assumed Gibbs function and the complementary dissipation potential and ensures a number of desirable numerical features, for example the symmetry of the resulting consistent tangent stiffness matrix. An important conclusion reached is that although many theories in the literature do not conform to the general potential framework outlined, it is still possible in some cases, by slight modifications of the forms used, to restore the complete potential structure.
NASA Astrophysics Data System (ADS)
Walker, David M.; Allingham, David; Lee, Heung Wing Joseph; Small, Michael
2010-02-01
Small world network models have been effective in capturing the variable behaviour of reported case data of the SARS coronavirus outbreak in Hong Kong during 2003. Simulations of these models have previously been realized using informed “guesses” of the proposed model parameters and tested for consistency with the reported data by surrogate analysis. In this paper we attempt to provide statistically rigorous parameter distributions using Approximate Bayesian Computation sampling methods. We find that such sampling schemes are a useful framework for fitting parameters of stochastic small world network models where simulation of the system is straightforward but expressing a likelihood is cumbersome.
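A minimal rejection-ABC sketch of the sampling scheme described above (the small-world epidemic simulator, summary statistics, prior, and tolerance used in the study are not reproduced here; the toy example only shows the mechanics):

import numpy as np

def abc_rejection(simulate, summarize, observed, prior_sampler, n_draws=20000, tol=0.2):
    # Keep parameter draws whose simulated summary lies within tol of the observed summary.
    s_obs = np.atleast_1d(summarize(observed))
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        s_sim = np.atleast_1d(summarize(simulate(theta)))
        if np.linalg.norm(s_sim - s_obs) < tol:
            accepted.append(theta)
    return np.array(accepted)

# Toy check with a known answer: recover the mean of a Gaussian from its sample mean.
rng = np.random.default_rng(1)
obs = rng.normal(2.0, 1.0, size=200)
post = abc_rejection(simulate=lambda m: rng.normal(m, 1.0, size=200),
                     summarize=np.mean,
                     observed=obs,
                     prior_sampler=lambda: rng.uniform(-5, 5))
print(post.mean())   # approximate posterior mean, close to 2.0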
Architecture for reactive planning of robot actions
NASA Astrophysics Data System (ADS)
Riekki, Jukka P.; Roening, Juha
1995-01-01
In this article, a reactive system for planning robot actions is described. The described hierarchical control system architecture consists of planning-executing-monitoring-modelling elements (PEMM elements). A PEMM element is a goal-oriented, combined processing and data element. It includes a planner, an executor, a monitor, a modeler, and a local model. The elements form a tree-like structure. An element receives tasks from its ancestor and sends subtasks to its descendants. The model knowledge is distributed into the local models, which are connected to each other. The elements can be synchronized. The PEMM architecture is strictly hierarchical. It integrated planning, sensing, and modelling into a single framework. A PEMM-based control system is reactive, as it can cope with asynchronous events and operate under time constraints. The control system is intended to be used primarily to control mobile robots and robot manipulators in dynamic and partially unknown environments. It is suitable especially for applications consisting of physically separated devices and computing resources.
A stochastic differential equation model of diurnal cortisol patterns
NASA Technical Reports Server (NTRS)
Brown, E. N.; Meehan, P. M.; Dempster, A. P.
2001-01-01
Circadian modulation of episodic bursts is recognized as the normal physiological pattern of diurnal variation in plasma cortisol levels. The primary physiological factors underlying these diurnal patterns are the ultradian timing of secretory events, circadian modulation of the amplitude of secretory events, infusion of the hormone from the adrenal gland into the plasma, and clearance of the hormone from the plasma by the liver. Each measured plasma cortisol level has an error arising from the cortisol immunoassay. We demonstrate that all of these physiological factors can be succinctly summarized in a single stochastic differential equation plus measurement error model and show that physiologically consistent ranges of the model parameters can be determined from published reports. We summarize the model parameters in terms of the multivariate Gaussian probability density and establish the plausibility of the model with a series of simulation studies. Our framework makes possible a sensitivity analysis in which all model parameters are allowed to vary simultaneously. The model offers an approach for simultaneously representing cortisol's ultradian, circadian, and kinetic properties. Our modeling paradigm provides a framework for simulation studies and data analysis that should be readily adaptable to the analysis of other endocrine hormone systems.
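As an illustration of how a stochastic differential equation plus measurement-error model of this kind can be simulated (the functional form, parameter values, and names below are assumptions for demonstration only, not the fitted model from the study):

import numpy as np

def simulate_cortisol(hours=24.0, dt=1.0/60, clearance=1.0, sigma=0.1, assay_sd=0.05, seed=0):
    # Euler-Maruyama integration of a mean-reverting diffusion driven by a circadian-modulated
    # input, with additive immunoassay error on the observations.
    rng = np.random.default_rng(seed)
    n = int(hours / dt)
    t = np.arange(n) * dt
    infusion = 1.0 + 0.8 * np.sin(2 * np.pi * (t - 8.0) / 24.0)   # assumed circadian drive
    c = np.empty(n)
    c[0] = 1.0
    for k in range(n - 1):
        drift = infusion[k] - clearance * c[k]                     # secretion/infusion minus clearance
        c[k + 1] = c[k] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    observed = c + assay_sd * rng.standard_normal(n)               # measurement error
    return t, c, observed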
NASA Astrophysics Data System (ADS)
Razavi, Saman; Gupta, Hoshin
2015-04-01
Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex with increasingly more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based in different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem, (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficiency we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of metric 'response surfaces' encountered when performing SA on EES models. Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', and that commonly-used SA approaches (e.g., Sobol, Morris, etc.) are actually limiting cases of our approach under specific conditions. Multiple case studies are used to demonstrate the value of the new framework. The results show that the new framework provides a fundamental understanding of the underlying sensitivities for any given problem, while requiring orders of magnitude fewer model runs.
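For context, the variance-based (Sobol) indices that the abstract treats as a limiting case of the proposed framework are conventionally defined as

S_i = \frac{\mathrm{Var}_{X_i}\left[\mathbb{E}(Y \mid X_i)\right]}{\mathrm{Var}(Y)}, \qquad S_{T_i} = 1 - \frac{\mathrm{Var}_{X_{\sim i}}\left[\mathbb{E}(Y \mid X_{\sim i})\right]}{\mathrm{Var}(Y)}

where Y is the model response, X_i a single input factor, and X_{\sim i} all factors except X_i. Monte Carlo estimation of these conditional variances is what drives the computational cost the authors aim to reduce.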
Fast image interpolation via random forests.
Huang, Jun-Jie; Siu, Wan-Chi; Liu, Tian-Rui
2015-10-01
This paper proposes a two-stage framework for fast image interpolation via random forests (FIRF). The proposed FIRF method gives high accuracy while requiring low computation. The underlying idea of this work is to apply random forests to classify the natural image patch space into numerous subspaces and to learn a linear regression model for each subspace that maps a low-resolution image patch to a high-resolution image patch. The FIRF framework consists of two stages. Stage 1 of the framework removes most of the ringing and aliasing artifacts in the initial bicubic interpolated image, while Stage 2 further refines the Stage 1 interpolated image. By varying the number of decision trees in the random forests and the number of stages applied, the proposed FIRF method can realize computationally scalable image interpolation. Extensive experimental results show that the proposed FIRF(3, 2) method achieves more than 0.3 dB improvement in peak signal-to-noise ratio over the state-of-the-art nonlocal autoregressive modeling (NARM) method. Moreover, the proposed FIRF(1, 1) obtains similar or better results than NARM while taking only 0.3% of its computational time.
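A simplified sketch of the partition-then-regress idea behind FIRF (an illustration of the structure under assumed patch arrays, not the authors' implementation; feature extraction, the two-stage cascade, and training details are omitted):

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge

def fit_leafwise_regressors(X_lr, Y_hr, n_trees=8, max_leaf_nodes=64):
    # The forest is used only to partition the low-resolution patch space; the first
    # high-resolution pixel serves as a convenient proxy target for growing the trees.
    forest = RandomForestRegressor(n_estimators=n_trees, max_leaf_nodes=max_leaf_nodes, random_state=0)
    forest.fit(X_lr, Y_hr[:, 0])
    leaves = forest.apply(X_lr)                          # (n_samples, n_trees) leaf indices
    models = {}
    for t in range(n_trees):
        for leaf in np.unique(leaves[:, t]):
            idx = leaves[:, t] == leaf
            models[(t, leaf)] = Ridge(alpha=1e-3).fit(X_lr[idx], Y_hr[idx])   # one linear map per leaf
    return forest, models

def predict_hr(forest, models, X_lr):
    leaves = forest.apply(X_lr)
    out = []
    for i in range(X_lr.shape[0]):
        per_tree = [models[(t, leaves[i, t])].predict(X_lr[i:i + 1])[0]
                    for t in range(leaves.shape[1])]
        out.append(np.mean(per_tree, axis=0))            # average the per-leaf linear predictions
    return np.array(out)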
Nemesis Autonomous Test System
NASA Technical Reports Server (NTRS)
Barltrop, Kevin J.; Lee, Cin-Young; Horvath, Gregory A.; Clement, Bradley J.
2012-01-01
A generalized framework has been developed for systems validation that can be applied to both traditional and autonomous systems. The framework consists of an automated test case generation and execution system called Nemesis that rapidly and thoroughly identifies flaws or vulnerabilities within a system. By applying genetic optimization and goal-seeking algorithms on the test equipment side, a "war game" is conducted between a system and its complementary nemesis. The end result of the war games is a collection of scenarios that reveals any undesirable behaviors of the system under test. The software provides a reusable framework to evolve test scenarios with genetic algorithms, using an operational model of the system under test. It can automatically generate and execute test cases that reveal flaws in behaviorally complex systems. Genetic algorithms focus the exploration of tests on the set of test cases that most effectively reveals the flaws and vulnerabilities of the system under test. It leverages advances in state- and model-based engineering, which are essential in defining the behavior of autonomous systems. It also uses goal networks to describe test scenarios.
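A toy sketch of the evolutionary search loop described above (the scenario encoding, scoring function, and parameters are placeholders, not Nemesis interfaces):

import random

def evolve_tests(score, n_params=6, pop_size=40, generations=50, mut_rate=0.2):
    # Evolve test scenarios (parameter vectors) that maximize a flaw-revealing score
    # returned by a model of the system under test.
    pop = [[random.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)                # fittest scenarios first
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)
            child = a[:cut] + b[cut:]                    # one-point crossover
            if random.random() < mut_rate:
                child[random.randrange(n_params)] = random.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

# Toy scoring function: scenarios whose parameters sum to 3 are treated as most revealing.
print(evolve_tests(score=lambda s: -abs(sum(s) - 3.0)))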
A review of predictive nonlinear theories for multiscale modeling of heterogeneous materials
NASA Astrophysics Data System (ADS)
Matouš, Karel; Geers, Marc G. D.; Kouznetsova, Varvara G.; Gillman, Andrew
2017-02-01
Since the beginning of the industrial age, material performance and design have been in the midst of innovation of many disruptive technologies. Today's electronics, space, medical, transportation, and other industries are enriched by development, design and deployment of composite, heterogeneous and multifunctional materials. As a result, materials innovation is now considerably outpaced by other aspects from component design to product cycle. In this article, we review predictive nonlinear theories for multiscale modeling of heterogeneous materials. Deeper attention is given to multiscale modeling in space and to computational homogenization in addressing challenging materials science questions. Moreover, we discuss a state-of-the-art platform in predictive image-based, multiscale modeling with co-designed simulations and experiments that executes on the world's largest supercomputers. Such a modeling framework consists of experimental tools, computational methods, and digital data strategies. Once fully completed, this collaborative and interdisciplinary framework can be the basis of Virtual Materials Testing standards and aids in the development of new material formulations. Moreover, it will decrease the time to market of innovative products.
Hayes, Holly; Parchman, Michael L.; Howard, Ray
2012-01-01
Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441
Multimodeling Framework for Predicting Water Quality in Fragmented Agriculture-Forest Ecosystems
NASA Astrophysics Data System (ADS)
Rose, J. B.; Guber, A.; Porter, W. F.; Williams, D.; Tamrakar, S.; Dechen Quinn, A.
2012-12-01
Both livestock and wildlife are major contributors to nonpoint pollution of surface water bodies. The interactions among them can substantially increase the chance of contamination, especially in fragmented agriculture-forest landscapes, where wildlife (e.g., white-tailed deer) can transmit diseases between remote farms. Unfortunately, models currently available for predicting fate and transport of microorganisms in these ecosystems do not account for such interactions. The objectives of this study are to develop and test a multimodeling framework that assesses the risk of microbial contamination of surface water caused by wildlife-livestock interactions in fragmented agriculture-forest ecosystems. The framework consists of a modified Soil Water Assessment Tool (SWAT), the KINematic Runoff and EROSion model (KINEROS2) with the add-on module STWIR (Microorganism Transport with Infiltration and Runoff), RAMAS GIS, an SIR compartmental model, and a Quantitative Microbial Risk Assessment model (QMRA). The watershed-scale model SWAT simulates plant biomass growth, wash-off of microorganisms from foliage and soil, overland and in-stream microbial transport, microbial growth, and die-off in foliage and soil. The RAMAS GIS model predicts the most probable habitat and subsequent population of white-tailed deer based on land use and crop biomass. KINEROS-STWIR simulates overland transport of microorganisms released from soil, surface-applied manure, and fecal deposits during runoff events at high temporal and spatial resolutions. KINEROS-STWIR and RAMAS GIS provide input for the SIR compartmental model, which simulates disease transmission within and between deer groups. This information is used in the SWAT model to account for transmission and deposition of pathogens by white-tailed deer in stream water, foliage, and soil. The QMRA approach extends to microorganisms inactivated in forage and water consumed by deer. Probabilities of deer infection and numbers of infected animals are computed based on a dose-response approach, including beta-Poisson and maximum-risk models, which take into account pathogen variation in infectivity. An example of the multimodeling framework's performance for a fragmented agriculture-forest ecosystem will be shown in the presentation.
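For reference, the approximate beta-Poisson dose-response relation commonly used in QMRA, and referred to above, gives the probability of infection for an ingested dose d as (standard form; the parameter values used in the study are not reproduced here)

P_{\mathrm{inf}}(d) = 1 - \left(1 + \frac{d}{\beta}\right)^{-\alpha}

where \alpha and \beta are pathogen-specific parameters fitted to dose-response data; a smaller \beta or larger \alpha implies higher infectivity at a given dose.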
Alkhatib, Omar J; Abdou, Alaa
2018-04-01
The construction industry is usually characterized as a fragmented system of multiple-organizational entities in which members from different technical backgrounds and moral values join together to develop a particular business or project. The greatest challenge in the construction process for the achievement of a successful practice is the development of an outstanding reputation, which is built on identifying and applying an ethical framework. This framework should reflect a common ethical ground for myriad people involved in this process to survive and compete ethically in today's turbulent construction market. This study establishes a framework for ethical judgment of behavior and actions conducted in the construction process. The framework was primarily developed based on the essential attributes of business management identified in the literature review and subsequently incorporates additional attributes identified to prevent breaches in the construction industry and common ethical values related to professional engineering. The proposed judgment framework is based primarily on the ethical dimension of professional responsibility. The Ethical Judgment Framework consists of descriptive approaches involving technical, professional, administrative, and miscellaneous terms. The framework provides the basis for judging actions as either ethical or unethical. Furthermore, the framework can be implemented as a form of preventive ethics, which would help avoid ethical dilemmas and moral allegations. The framework can be considered a decision-making model to guide actions and improve the ethical reasoning process that would help individuals think through possible implications and consequences of ethical dilemmas in the construction industry.
NASA Astrophysics Data System (ADS)
Pan, Ming; Troy, Tara; Sahoo, Alok; Sheffield, Justin; Wood, Eric
2010-05-01
Documentation of the water cycle and its evolution over time is a primary scientific goal of the Global Energy and Water Cycle Experiment (GEWEX) and fundamental to assessing global change impacts. In developed countries, observation systems that include in-situ, remote sensing and modeled data can provide long-term, consistent and generally high quality datasets of water cycle variables. The export of these technologies to less developed regions has been rare, but it is these regions where information on water availability and change is probably most needed in the face of regional environmental change due to climate, land use and water management. In these data sparse regions, in-situ data alone are insufficient to develop a comprehensive picture of how the water cycle is changing, and strategies that merge in-situ, model and satellite observations within a framework that results in consistent water cycle records are essential. Such an approach is envisaged by the Global Earth Observation System of Systems (GEOSS), but has yet to be applied. The goal of this study is to quantify the variation and changes in the global water cycle over the past 50 years. We evaluate the global water cycle using a variety of independent large-scale datasets of hydrologic variables that are used to bridge the gap between sparse in-situ observations, including remote-sensing based retrievals, observation-forced hydrologic modeling, and weather model reanalyses. A data assimilation framework that blends these disparate sources of information together in a consistent fashion with attention to budget closure is applied to make best estimates of the global water cycle and its variation. The framework consists of a constrained Kalman filter applied to the water budget equation. With imperfect estimates of the water budget components, the equation additionally has an error residual term that is redistributed across the budget components using error statistics, which are estimated from the uncertainties among data products. The constrained Kalman filter treats the budget closure constraint as a perfect observation within the assimilation framework. Precipitation is estimated using gauge observations, reanalysis products, and remote sensing products for regions below 50°N. Evapotranspiration is estimated in a number of ways: from the VIC land surface hydrologic model forced with a hybrid reanalysis-observation global forcing dataset, from remote sensing retrievals based on a suite of energy balance and process based models, and from an atmospheric water budget approach using reanalysis products for the atmospheric convergence and storage terms and our best estimate for precipitation. Terrestrial water storage changes, including surface and subsurface changes, are estimated using estimates from both VIC and the GRACE remote sensing retrievals. From these components, discharge can then be calculated as a residual of the water budget and compared with gauge observations to evaluate the closure of the water budget. Through the use of these largely independent data products, we estimate both the mean seasonal cycle of the water budget components and their uncertainties for a set of 20 large river basins across the globe.
We particularly focus on three regions of interest in global change studies: the Northern Eurasian region, which is experiencing rapid change in terrestrial processes; the Amazon, which is a central part of the global water, energy and carbon budgets; and Africa, which is predicted to face some of the most critical challenges for water and food security in the coming decades.
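In equation form, the closure constraint at the heart of the assimilation framework (written generically; the symbols are standard rather than the authors' notation) is

P - ET - Q - \frac{dS}{dt} = \epsilon

where P is precipitation, ET evapotranspiration, Q discharge, S terrestrial water storage, and \epsilon the budget residual arising from errors in the merged products. Treating \epsilon = 0 as a perfect observation in the constrained Kalman filter redistributes the residual across the component estimates in proportion to their error variances.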
A cognitive perspective on health systems integration: results of a Canadian Delphi study.
Evans, Jenna M; Baker, G Ross; Berta, Whitney; Barnsley, Jan
2014-05-19
Ongoing challenges to healthcare integration point toward the need to move beyond structural and process issues. While we know what needs to be done to achieve integrated care, there is little that informs us as to how. We need to understand how diverse organizations and professionals develop shared knowledge and beliefs - that is, we need to generate knowledge about normative integration. We present a cognitive perspective on integration, based on shared mental model theory, that may enhance our understanding and ability to measure and influence normative integration. The aim of this paper is to validate and improve the Mental Models of Integrated Care (MMIC) Framework, which outlines important knowledge and beliefs whose convergence or divergence across stakeholder groups may influence inter-professional and inter-organizational relations. We used a two-stage web-based modified Delphi process to test the MMIC Framework against expert opinion using a random sample of participants from Canada's National Symposium on Integrated Care. Respondents were asked to rate the framework's clarity, comprehensiveness, usefulness, and importance using seven-point ordinal scales. Spaces for open comments were provided. Descriptive statistics were used to describe the structured responses, while open comments were coded and categorized using thematic analysis. The Kruskall-Wallis test was used to examine cross-group agreement by level of integration experience, current workplace, and current role. In the first round, 90 individuals responded (52% response rate), representing a wide range of professional roles and organization types from across the continuum of care. In the second round, 68 individuals responded (75.6% response rate). The quantitative and qualitative feedback from experts was used to revise the framework. The re-named "Integration Mindsets Framework" consists of a Strategy Mental Model and a Relationships Mental Model, comprising a total of nineteen content areas. The Integration Mindsets Framework draws the attention of researchers and practitioners to how various stakeholders think about and conceptualize integration. A cognitive approach to understanding and measuring normative integration complements dominant cultural approaches and allows for more fine-grained analyses. The framework can be used by managers and leaders to facilitate the interpretation, planning, implementation, management and evaluation of integration initiatives.
Bessette, Douglas L; Campbell-Arvai, Victoria; Arvai, Joseph
2016-05-01
This article presents research aimed at developing and testing an online, multistakeholder decision-aiding framework for informing multiattribute risk management choices associated with energy development and climate change. The framework was designed to provide necessary background information and facilitate internally consistent choices, or choices that are in line with users' prioritized objectives. In order to test different components of the decision-aiding framework, a six-part, 2 × 2 × 2 factorial experiment was conducted, yielding eight treatment scenarios. The three factors included: (1) whether or not users could construct their own alternatives; (2) the level of detail regarding the composition of alternatives users would evaluate; and (3) the way in which a final choice between users' own constructed (or highest-ranked) portfolio and an internally consistent portfolio was presented. Participants' self-reports revealed the framework was easy to use and providing an opportunity to develop one's own risk-management alternatives (Factor 1) led to the highest knowledge gains. Empirical measures showed the internal consistency of users' decisions across all treatments to be lower than expected and confirmed that providing information about alternatives' composition (Factor 2) resulted in the least internally consistent choices. At the same time, those users who did not develop their own alternatives and were not shown detailed information about the composition of alternatives believed their choices to be the most internally consistent. These results raise concerns about how the amount of information provided and the ability to construct alternatives may inversely affect users' real and perceived internal consistency. © 2015 Society for Risk Analysis.
A conceptual framework for achieving performance enhancing drug compliance in sport.
Donovan, Robert J; Egger, Garry; Kapernick, Vicki; Mendoza, John
2002-01-01
There has been, and continues to be, widespread international concern about athletes' use of banned performance enhancing drugs (PEDs). This concern culminated in the formation of the World Anti-Doping Agency (WADA) in November 1999. To date, the main focus on controlling the use of PEDs has been on testing athletes and the development of tests to detect usage. Although athletes' beliefs and values are known to influence whether or not an athlete will use drugs, little is known about athletes' beliefs and attitudes, and the limited empirical literature shows little use of behavioural science frameworks to guide research methodology, results interpretation, and intervention implications. Mindful of this in preparing its anti-doping strategy for the 2000 Olympics, the Australian Sports Drug Agency (ASDA) in 1997 commissioned a study to assess the extent to which models of attitude-behaviour change in the public health/injury prevention literature had useful implications for compliance campaigns in the sport drug area. A preliminary compliance model was developed from three behavioural science frameworks: social cognition models; threat (or fear) appeals; and instrumental and normative approaches. A subsequent review of the performance enhancing drug literature confirmed that the overall framework was consistent with known empirical data, and therefore had at least face validity if not construct validity. The overall model showed six major inputs to an athlete's attitudes and intentions with respect to performance enhancing drug usage: personality factors, threat appraisal, benefit appraisal, reference group influences, personal morality and legitimacy. The model demonstrated that a comprehensive, fully integrated programme is necessary for maximal effect, and provides anti-doping agencies with a structured framework for strategic planning and implementing interventions. Programmes can be developed in each of the six major areas, with allocation of resources to each area based on needs-assessment research with athletes and other relevant groups.
A general framework for predicting delayed responses of ecological communities to habitat loss.
Chen, Youhua; Shen, Tsung-Jen
2017-04-20
Although the biodiversity crisis at different spatial scales is well recognised, the phenomena of extinction debt and immigration credit in a cross-scale context remain, at best, unclear. Based on two community patterns, the regional species abundance distribution (SAD) and the spatial abundance distribution (SAAD), Kitzes and Harte (2015) presented a macroecological framework for predicting post-disturbance delayed extinction patterns in the entire ecological community. In this study, we further expand this basic framework to predict diverse time-lagged effects of habitat destruction on local communities. Specifically, our generalisation of KH's model can address questions that could not be answered previously: (1) How many species are subject to delayed extinction in a local community when habitat is destroyed in other areas? (2) How do rare or endemic species contribute to the extinction debt or immigration credit of the local community? (3) How will species composition differ between two local areas? In demonstrations using two SAD models (single-parameter lognormal and logseries), the predicted patterns of debt, credit, and change in the fraction of unique species vary, but in consistent ways that depend on several factors. The general framework deepens understanding of the theoretical effects of habitat loss on community dynamics in local samples.
Scott, Michael J.; Daly, Don S.; Hejazi, Mohamad I.; ...
2016-02-06
Here, one of the most important interactions between humans and climate is in the demand and supply of water. Humans withdraw, use, and consume water and return waste water to the environment for a variety of socioeconomic purposes, including domestic, commercial, and industrial use, production of energy resources and cooling thermal-electric power plants, and growing food, fiber, and chemical feed stocks for human consumption. Uncertainties in the future human demand for water interact with future impacts of climatic change on water supplies to impinge on water management decisions at the international, national, regional, and local level, but until recently tools were not available to assess the uncertainties surrounding these decisions. This paper demonstrates the use of a multi-model framework in a structured sensitivity analysis to project and quantify the sensitivity of future deficits in surface water in the context of climate and socioeconomic change for all U.S. states and sub-basins. The framework treats all sources of water demand and supply consistently from the world to local level. The paper illustrates the capabilities of the framework with sample results for a river sub-basin in the U.S. state of Georgia.
Angelis, Aris; Kanavos, Panos
2017-09-01
Escalating drug prices have catalysed the generation of numerous "value frameworks" with the aim of informing payers, clinicians and patients on the assessment and appraisal process of new medicines for the purpose of coverage and treatment selection decisions. Although this is an important step towards a more inclusive Value Based Assessment (VBA) approach, aspects of these frameworks are based on weak methodologies and could potentially result in misleading recommendations or decisions. In this paper, a Multiple Criteria Decision Analysis (MCDA) methodological process, based on Multi Attribute Value Theory (MAVT), is adopted for building a multi-criteria evaluation model. A five-stage model-building process is followed, using a top-down "value-focused thinking" approach, involving literature reviews and expert consultations. A generic value tree is structured capturing decision-makers' concerns for assessing the value of new medicines in the context of Health Technology Assessment (HTA) and in alignment with decision theory. The resulting value tree (Advance Value Tree) consists of three levels of criteria (top level criteria clusters, mid-level criteria, bottom level sub-criteria or attributes) relating to five key domains that can be explicitly measured and assessed: (a) burden of disease, (b) therapeutic impact, (c) safety profile (d) innovation level and (e) socioeconomic impact. A number of MAVT modelling techniques are introduced for operationalising (i.e. estimating) the model, for scoring the alternative treatment options, assigning relative weights of importance to the criteria, and combining scores and weights. Overall, the combination of these MCDA modelling techniques for the elicitation and construction of value preferences across the generic value tree provides a new value framework (Advance Value Framework) enabling the comprehensive measurement of value in a structured and transparent way. Given its flexibility to meet diverse requirements and become readily adaptable across different settings, the Advance Value Framework could be offered as a decision-support tool for evaluators and payers to aid coverage and reimbursement of new medicines.
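For readers unfamiliar with the additive MAVT machinery mentioned above, a minimal scoring sketch follows; the criteria echo the five domains listed, but the weights, performance levels and linear partial value functions are invented placeholders rather than the Advance Value Framework's elicited values.

```python
# Additive multi-attribute value model: V(option) = sum_k w_k * v_k(x_k).
# Weights and partial value functions below are purely illustrative.

def linear_value(x, worst, best):
    """Map a raw performance level onto a 0-100 partial value scale."""
    return 100.0 * (x - worst) / (best - worst)

weights = {"burden_of_disease": 0.25, "therapeutic_impact": 0.35,
           "safety_profile": 0.20, "innovation_level": 0.10,
           "socioeconomic_impact": 0.10}

# Hypothetical raw performance of one treatment on each criterion, together with
# the worst/best plausible levels used to normalise it.
performance = {"burden_of_disease": (7, 0, 10), "therapeutic_impact": (6, 0, 10),
               "safety_profile": (8, 0, 10), "innovation_level": (4, 0, 10),
               "socioeconomic_impact": (5, 0, 10)}

overall = sum(w * linear_value(*performance[c]) for c, w in weights.items())
print(f"overall value score: {overall:.1f} / 100")
```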
Martinek, Janna; Wendelin, Timothy; Ma, Zhiwen
2018-04-05
Concentrating solar power (CSP) plants can provide dispatchable power with a thermal energy storage capability for increased renewable-energy grid penetration. Particle-based CSP systems permit higher temperatures, and thus, potentially higher solar-to-electric efficiency than state-of-the-art molten-salt heat-transfer systems. This paper describes a detailed numerical analysis framework for estimating the performance of a novel, geometrically complex, enclosed particle receiver design. The receiver configuration uses arrays of small tubular absorbers to collect and subsequently transfer solar energy to a flowing particulate medium. The enclosed nature of the receiver design renders it amenable to either an inert heat-transfer medium, or a reactive heat-transfer medium that requires a controllable ambient environment. The numerical analysis framework described in this study is demonstrated for the case of thermal reduction of CaCr0.1Mn0.9O3-δ for thermochemical energy storage. The modeling strategy consists of Monte Carlo ray tracing for absorbed solar-energy distributions from a surround heliostat field, computational fluid dynamics modeling of small-scale local tubular arrays, surrogate response surfaces that approximately capture simulated tubular array performance, a quasi-two-dimensional reduced-order description of counter-flow reactive solids and purge gas, and a radiative exchange model applied to embedded-cavity structures at the size scale of the full receiver. In this work we apply the numerical analysis strategy to a single receiver configuration, but the framework can be generically applicable to alternative enclosed designs. In conclusion, we assess sensitivity of receiver performance to surface optical properties, heat-transfer coefficients, solids outlet temperature, and purge-gas feed rates, and discuss the significance of model assumptions and results for future receiver development.
NASA Technical Reports Server (NTRS)
Cortes, Gonzalo; Girotto, Manuela; Margulis, Steven
2016-01-01
A data assimilation framework was implemented with the objective of obtaining high resolution retrospective snow water equivalent (SWE) estimates over several Andean study basins. The framework integrates Landsat fractional snow covered area (fSCA) images, a land surface and snow depletion model, and the Modern Era Retrospective Analysis for Research and Applications (MERRA) reanalysis as a forcing data set. The outputs are SWE and fSCA fields (1985-2015) at a resolution of 90 m that are consistent with the observed depletion record. Verification using in-situ snow surveys showed significant improvements in the accuracy of the SWE estimates relative to forward model estimates, with increases in correlation (0.49-0.87) and reductions in root mean square error (0.316 m to 0.129 m) and mean error (-0.221 m to 0.009 m). A sensitivity analysis showed that the framework is robust to variations in physiography, fSCA data availability and a priori precipitation biases. Results from the application to the headwater basin of the Aconcagua River showed how the forward model versus the fSCA-conditioned estimate resulted in different quantifications of the relationship between runoff and SWE, and different correlation patterns between pixel-wise SWE and ENSO. The illustrative results confirm the influence that ENSO has on snow accumulation for Andean basins draining into the Pacific, with ENSO explaining approximately 25% of the variability in near-peak (1 September) SWE values. Our results show how the assimilation of fSCA data results in a significant improvement upon MERRA-forced modeled SWE estimates, further increasing the utility of the MERRA data for high-resolution snow modeling applications.
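As a generic illustration of the conditioning step such a reanalysis relies on, the sketch below applies simple importance weighting of a prior SWE ensemble against a single fSCA observation through a toy depletion relation; the forward model, prior distribution and error values are invented for illustration and are not the study's assimilation scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ens = 100

# Prior ensemble of peak SWE (m), drawn from an assumed lognormal distribution.
prior_swe = rng.lognormal(mean=-1.0, sigma=0.5, size=n_ens)

# Toy forward model: predicted fractional snow-covered area rises with SWE.
predicted_fsca = 1.0 - np.exp(-5.0 * prior_swe)

# Importance weights from a Gaussian likelihood of one observed fSCA value.
observed_fsca, obs_sigma = 0.8, 0.1
weights = np.exp(-0.5 * ((predicted_fsca - observed_fsca) / obs_sigma) ** 2)
weights /= weights.sum()

posterior_mean_swe = np.sum(weights * prior_swe)
print(prior_swe.mean(), posterior_mean_swe)   # prior vs. fSCA-conditioned estimate
```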
SCaLeM: A Framework for Characterizing and Analyzing Execution Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Manzano Franco, Joseph B.; Krishnamoorthy, Sriram
2014-10-13
As scalable parallel systems evolve towards more complex nodes with many-core architectures and larger trans-petascale & upcoming exascale deployments, there is a need to understand, characterize and quantify the underlying execution models being used on such systems. Execution models are a conceptual layer between applications & algorithms and the underlying parallel hardware and systems software on which those applications run. This paper presents the SCaLeM (Synchronization, Concurrency, Locality, Memory) framework for characterizing and analyzing execution models. SCaLeM consists of three basic elements: attributes, compositions and mapping of these compositions to abstract parallel systems. The fundamental Synchronization, Concurrency, Locality and Memory attributes are used to characterize each execution model, while the combinations of those attributes in the form of compositions are used to describe the primitive operations of the execution model. The mapping of the execution model’s primitive operations, described by compositions, to an underlying abstract parallel system can be evaluated quantitatively to determine its effectiveness. Finally, SCaLeM also enables the representation and analysis of applications in terms of execution models, for the purpose of evaluating the effectiveness of such mapping.
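A toy sketch of the kind of bookkeeping the abstract describes, with attributes characterizing an execution model and compositions standing in for its primitive operations; the class and example names are hypothetical and not taken from SCaLeM.

```python
from dataclasses import dataclass, field

ATTRIBUTES = ("synchronization", "concurrency", "locality", "memory")

@dataclass
class Composition:
    """A primitive operation of an execution model, expressed as the subset of
    SCaLeM-style attributes it exercises (illustrative only)."""
    name: str
    attributes: frozenset

@dataclass
class ExecutionModel:
    name: str
    compositions: list = field(default_factory=list)

    def attribute_profile(self):
        """Count how often each attribute appears across primitive operations."""
        return {a: sum(a in c.attributes for c in self.compositions)
                for a in ATTRIBUTES}

mpi_like = ExecutionModel("two-sided message passing", [
    Composition("send/recv", frozenset({"synchronization", "locality"})),
    Composition("collective", frozenset({"synchronization", "concurrency"})),
    Composition("local compute", frozenset({"memory", "locality"})),
])
print(mpi_like.attribute_profile())
```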
An Example-Based Brain MRI Simulation Framework.
He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L
2015-02-21
The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
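A heavily simplified sketch of patch-based regression from an anatomical model to MR intensities, in the spirit of the example-based approach described above; the k-nearest-neighbour regressor, patch size and synthetic 2-D data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def extract_patches(img, size=3):
    """Return flattened size x size patches centred on every pixel (edge-padded)."""
    r = size // 2
    padded = np.pad(img, r, mode="edge")
    patches = [padded[i:i + size, j:j + size].ravel()
               for i in range(img.shape[0]) for j in range(img.shape[1])]
    return np.asarray(patches)

rng = np.random.default_rng(0)
atlas_labels = rng.integers(0, 3, size=(32, 32)).astype(float)            # toy "anatomical model"
atlas_mri = 50.0 * atlas_labels + rng.normal(0, 2, atlas_labels.shape)    # toy atlas MR image

# Learn a patch -> intensity mapping from the atlas pair.
reg = KNeighborsRegressor(n_neighbors=5)
reg.fit(extract_patches(atlas_labels), atlas_mri.ravel())

# "Simulate" an MR image for a new anatomical model using the learned regression.
new_labels = rng.integers(0, 3, size=(32, 32)).astype(float)
simulated = reg.predict(extract_patches(new_labels)).reshape(new_labels.shape)
print(simulated.shape, simulated.mean())
```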
A novel medical image data-based multi-physics simulation platform for computational life sciences.
Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels
2013-04-06
Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.
Moving vehicles segmentation based on Gaussian motion model
NASA Astrophysics Data System (ADS)
Zhang, Wei; Fang, Xiang Z.; Lin, Wei Y.
2005-07-01
Moving object segmentation is a challenge in computer vision. This paper focuses on the segmentation of moving vehicles in dynamic scenes. We analyze the psychology of human vision and present a framework for segmenting moving vehicles on the highway. The proposed framework consists of two parts. Firstly, we propose an adaptive background update method in which the background is updated according to changes in illumination conditions and can therefore adapt sensitively to changing illumination. Secondly, we construct a Gaussian motion model to segment moving vehicles, in which the motion vectors of moving pixels are modeled as a Gaussian and an on-line EM algorithm is used to update the model. The Gaussian distribution of the adaptive model is evaluated to determine which motion vectors result from moving vehicles and which from other moving objects such as waving trees. Finally, the pixels whose motion vectors result from moving vehicles are segmented. Experimental results for several typical scenes show that the proposed model detects moving vehicles correctly and is immune to interference from other moving objects, such as waving trees, and from camera vibration.
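A stripped-down sketch of the second part: maintain a running Gaussian over motion vectors and classify a new vector by its Mahalanobis distance. The exponential-forgetting update below is a simple stand-in for the paper's on-line EM, and the threshold and data are arbitrary.

```python
import numpy as np

class OnlineGaussian:
    """Running estimate of a 2-D Gaussian over motion vectors, updated with an
    exponential forgetting factor (a crude substitute for on-line EM)."""
    def __init__(self, lr=0.05):
        self.lr = lr
        self.mean = np.zeros(2)
        self.cov = np.eye(2)

    def update(self, v):
        d = v - self.mean
        self.mean += self.lr * d
        self.cov = (1 - self.lr) * self.cov + self.lr * np.outer(d, d)

    def mahalanobis(self, v):
        d = v - self.mean
        return float(np.sqrt(d @ np.linalg.solve(self.cov, d)))

model = OnlineGaussian()
for v in np.random.default_rng(1).normal([4.0, 0.5], 0.3, size=(200, 2)):
    model.update(v)                       # motion vectors assumed to come from vehicles

candidate = np.array([0.1, 0.1])          # e.g. a waving-tree motion vector
is_vehicle = model.mahalanobis(candidate) < 3.0
print(is_vehicle)
```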
Integrated modelling of ecosystem services and energy systems research
NASA Astrophysics Data System (ADS)
Agarwala, Matthew; Lovett, Andrew; Bateman, Ian; Day, Brett; Agnolucci, Paolo; Ziv, Guy
2016-04-01
The UK Government is formally committed to reducing carbon emissions and protecting and improving natural capital and the environment. However, actually delivering on these objectives requires an integrated approach to addressing two parallel challenges: de-carbonising future energy system pathways; and safeguarding natural capital to ensure the continued flow of ecosystem services. Although both emphasise benefiting from natural resources, efforts to connect natural capital and energy systems research have been limited, meaning opportunities to improve management of natural resources and meet society's energy needs could be missed. The ecosystem services paradigm provides a consistent conceptual framework that applies in multiple disciplines across the natural and economic sciences, and facilitates collaboration between them. At the forefront of the field, integrated ecosystem service - economy models have guided public- and private-sector decision making at all levels. Models vary in sophistication from simple spreadsheet tools to complex software packages integrating biophysical, GIS and economic models and draw upon many fields, including ecology, hydrology, geography, systems theory, economics and the social sciences. They also differ in their ability to value changes in natural capital and ecosystem services at various spatial and temporal scales. Despite these differences, current models share a common feature: their treatment of energy systems is superficial at best. In contrast, energy systems research has no widely adopted, unifying conceptual framework that organises thinking about key system components and interactions. Instead, the literature is organised around modelling approaches, including life cycle analyses, econometric investigations, linear programming and computable general equilibrium models. However, some consistencies do emerge. First, energy system descriptions often contain a linear set of steps, from exploration to resource supply, fuel processing, conversion/generation, transmission, distribution and, finally, end energy use. Although each step clearly impacts upon natural capital, links to the natural environment are rarely identified or quantified within energy research. In short, the respective conceptual frameworks guiding ecosystem service and energy research are not well integrated. Major knowledge and research gaps appear at the system boundaries: while energy models may mention flows of residuals, exploring where exactly these flows enter the environment, and how they impact ecosystems and natural capital, is often considered to be 'outside the system boundary'. While integrated modelling represents the frontier of ecosystem service research, current efforts largely ignore the future energy pathways set out by energy systems models and government carbon targets. This disconnect means that policy-oriented research on how best to (i) maintain natural capital and (ii) meet specific climate targets may be poorly aligned, or worse, offer conflicting advice. We present a re-imagined version of the ecosystem services conceptual framework, in which emphasis is placed on interactions between energy systems and the natural environment. Using the UK as a case study, we employ a recent integrated environmental-economic ecosystem service model, TIM, developed by Bateman et al. (2014), and energy pathways developed by the UK Energy Research Centre and the UK Government Committee on Climate Change to illustrate how the new conceptual framework might apply in real world applications.
Ye, Xiaoduan; O'Neil, Patrick K; Foster, Adrienne N; Gajda, Michal J; Kosinski, Jan; Kurowski, Michal A; Bujnicki, Janusz M; Friedman, Alan M; Bailey-Kellogg, Chris
2004-12-01
Emerging high-throughput techniques for the characterization of protein and protein-complex structures yield noisy data with sparse information content, placing a significant burden on computation to properly interpret the experimental data. One such technique uses cross-linking (chemical or by cysteine oxidation) to confirm or select among proposed structural models (e.g., from fold recognition, ab initio prediction, or docking) by testing the consistency between cross-linking data and model geometry. This paper develops a probabilistic framework for analyzing the information content in cross-linking experiments, accounting for anticipated experimental error. This framework supports a mechanism for planning experiments to optimize the information gained. We evaluate potential experiment plans using explicit trade-offs among key properties of practical importance: discriminability, coverage, balance, ambiguity, and cost. We devise a greedy algorithm that considers those properties and, from a large number of combinatorial possibilities, rapidly selects sets of experiments expected to discriminate pairs of models efficiently. In an application to residue-specific chemical cross-linking, we demonstrate the ability of our approach to plan experiments effectively involving combinations of cross-linkers and introduced mutations. We also describe an experiment plan for the bacteriophage lambda Tfa chaperone protein in which we plan dicysteine mutants for discriminating threading models by disulfide formation. Preliminary results from a subset of the planned experiments are consistent and demonstrate the practicality of planning. Our methods provide the experimenter with a valuable tool (available from the authors) for understanding and optimizing cross-linking experiments.
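A schematic greedy selection loop of the kind described above; the gain function counts newly discriminated model pairs and the cost is uniform, both invented stand-ins for the paper's discriminability, coverage, balance, ambiguity and cost trade-off.

```python
def greedy_plan(candidates, budget, gain, cost):
    """Pick experiments one at a time, each time taking the candidate with the best
    marginal gain per unit cost, until the budget is exhausted or nothing helps."""
    chosen, spent = [], 0.0
    remaining = set(candidates)
    while remaining:
        best = max(remaining, key=lambda c: gain(chosen, c) / cost(c))
        if spent + cost(best) > budget or gain(chosen, best) <= 0:
            break
        chosen.append(best)
        spent += cost(best)
        remaining.remove(best)
    return chosen

# Toy usage: experiments are cross-linkers or introduced cysteine mutants; each
# "separates" a set of structural-model pairs (identified here by integers).
experiments = ["xlinkA", "xlinkB", "cysM12", "cysM47"]
separates = {"xlinkA": {1, 2}, "xlinkB": {2, 3}, "cysM12": {1, 4}, "cysM47": {4}}
plan = greedy_plan(
    experiments, budget=3.0,
    gain=lambda chosen, c: len(separates[c] - set().union(*(separates[x] for x in chosen))),
    cost=lambda c: 1.0)
print(plan)
```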
Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson
2017-02-01
Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity.
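A toy numerical illustration of the self-consistency idea for stationary rates: if the expected rate of a nonlinear Hawkes-type PP-GLM satisfies r = f(b + g r), with g the integrated history-kernel gain, a fixed point can be sought by iteration, and divergence of the iteration mimics a runaway firing rate. This mean-field caricature only conveys the flavour of the fixed-point analysis; it is not the quasi-renewal calculation in the paper.

```python
import numpy as np

def stationary_rate(f, baseline, kernel_gain, r0=1.0, n_iter=200):
    """Iterate r <- f(baseline + kernel_gain * r) to look for a self-consistent
    stationary rate of a nonlinear Hawkes-type model (mean-field toy)."""
    r = r0
    history = []
    for _ in range(n_iter):
        r = f(baseline + kernel_gain * r)
        history.append(r)
        if not np.isfinite(r) or r > 1e6:    # treat this as a diverging firing rate
            return None, history
    return r, history

softplus = lambda x: np.log1p(np.exp(np.clip(x, -50, 50)))

print(stationary_rate(softplus, baseline=-2.0, kernel_gain=0.5)[0])   # settles to a stable rate
print(stationary_rate(np.exp, baseline=0.5, kernel_gain=1.0)[0])      # None: runaway rate
```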
Q-space trajectory imaging for multidimensional diffusion MRI of the human brain
Westin, Carl-Fredrik; Knutsson, Hans; Pasternak, Ofer; Szczepankiewicz, Filip; Özarslan, Evren; van Westen, Danielle; Mattisson, Cecilia; Bogren, Mats; O’Donnell, Lauren; Kubicki, Marek; Topgaard, Daniel; Nilsson, Markus
2016-01-01
This work describes a new diffusion MR framework for imaging and modeling of microstructure that we call q-space trajectory imaging (QTI). The QTI framework consists of two parts: encoding and modeling. First we propose q-space trajectory encoding, which uses time-varying gradients to probe a trajectory in q-space, in contrast to traditional pulsed field gradient sequences that attempt to probe a point in q-space. Then we propose a microstructure model, the diffusion tensor distribution (DTD) model, which takes advantage of additional information provided by QTI to estimate a distributional model over diffusion tensors. We show that the QTI framework enables microstructure modeling that is not possible with the traditional pulsed gradient encoding as introduced by Stejskal and Tanner. In our analysis of QTI, we find that the well-known scalar b-value naturally extends to a tensor-valued entity, i.e., a diffusion measurement tensor, which we call the b-tensor. We show that b-tensors of rank 2 or 3 enable estimation of the mean and covariance of the DTD model in terms of a second order tensor (the diffusion tensor) and a fourth order tensor. The QTI framework has been designed to improve discrimination of the sizes, shapes, and orientations of diffusion microenvironments within tissue. We derive rotationally invariant scalar quantities describing intuitive microstructural features including size, shape, and orientation coherence measures. To demonstrate the feasibility of QTI on a clinical scanner, we performed a small pilot study comparing a group of five healthy controls with five patients with schizophrenia. The parameter maps derived from QTI were compared between the groups, and 9 out of the 14 parameters investigated showed differences between groups. The ability to measure and model the distribution of diffusion tensors, rather than a quantity that has already been averaged within a voxel, has the potential to provide a powerful paradigm for the study of complex tissue architecture. PMID:26923372
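For orientation, a small numerical sketch of how a measurement tensor follows from a gradient waveform under the standard q-space convention: q(t) is the gyromagnetic ratio times the running integral of the gradient, the b-tensor is the outer-product integral of q(t), and the conventional b-value is its trace. The waveform below is arbitrary and purely illustrative.

```python
import numpy as np

gamma = 2.675e8                      # 1H gyromagnetic ratio, rad s^-1 T^-1
dt = 1e-5                            # time step, s
t = np.arange(0.0, 0.05, dt)

# Arbitrary illustrative time-varying gradient waveform g(t) in T/m, shape (3, Nt).
g = np.vstack([3e-3 * np.sin(2 * np.pi * 40 * t),
               2e-3 * np.sin(2 * np.pi * 80 * t),
               np.zeros_like(t)])

q = gamma * np.cumsum(g, axis=1) * dt                   # q(t) = gamma * int_0^t g dt'
B = (q[:, None, :] * q[None, :, :]).sum(axis=2) * dt    # B = int q(t) q(t)^T dt, in s/m^2
b_value = np.trace(B)                                    # conventional scalar b-value

print(B * 1e-6)          # b-tensor in s/mm^2
print(b_value * 1e-6)    # scalar b in s/mm^2
```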
Coupling biology and oceanography in models.
Fennel, W; Neumann, T
2001-08-01
The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.
From Old to New: The Australian Qualifications Framework
ERIC Educational Resources Information Center
Wheelahan, Leesa
2011-01-01
The Australian Qualifications Framework (AQF) is a "first generation" qualifications framework that was established in 1995. Its purpose was to create "a comprehensive, nationally consistent yet flexible framework for all qualifications in post-compulsory education and training." It encompasses all post-compulsory…
Storm time plasma transport in a unified and inter-coupled global magnetosphere model
NASA Astrophysics Data System (ADS)
Ilie, R.; Liemohn, M. W.; Toth, G.
2014-12-01
We present results from the two-way self-consistent coupling between the kinetic Hot Electron and Ion Drift Integrator (HEIDI) model and the Space Weather Modeling Framework (SWMF). HEIDI solves the time-dependent, gyration- and bounce-averaged kinetic equation for the phase space density of different ring current species and computes full pitch angle distributions for all local times and radial distances. During geomagnetically active times the dipole approximation becomes unsuitable even in the inner magnetosphere. Therefore the HEIDI model was generalized to accommodate an arbitrary magnetic field and, through the coupling with SWMF, it obtains a magnetic field description throughout the HEIDI domain along with a plasma distribution at the model outer boundary from the Block Adaptive Tree Solar Wind Roe Upwind Scheme (BATS-R-US) magnetohydrodynamics (MHD) model within SWMF. Electric field self-consistency is assured by the passing of convection potentials from the Ridley Ionosphere Model (RIM) within SWMF. In this study we test the various levels of coupling between the three physics-based models, highlighting the role that the magnetic field, plasma sheet conditions and the cross polar cap potential play in the formation and evolution of the ring current. We show that the dynamically changing geospace environment itself plays a key role in determining the geoeffectiveness of the driver. The results of the self-consistent coupling between HEIDI, BATS-R-US and RIM during disturbed conditions emphasize the importance of a kinetic self-consistent approach to the description of geospace.
Study of impurity effects on CFETR steady-state scenario by self-consistent integrated modeling
NASA Astrophysics Data System (ADS)
Shi, Nan; Chan, Vincent S.; Jian, Xiang; Li, Guoqiang; Chen, Jiale; Gao, Xiang; Shi, Shengyu; Kong, Defeng; Liu, Xiaoju; Mao, Shifeng; Xu, Guoliang
2017-12-01
Impurity effects on the fusion performance of the China Fusion Engineering Test Reactor (CFETR) due to extrinsic seeding are investigated. An integrated 1.5D modeling workflow evolves the plasma equilibrium and all transport channels to steady state. The One Modeling Framework for Integrated Tasks (OMFIT) is used to couple the transport solver, MHD equilibrium solver, and source and sink calculations. A self-consistent impurity profile constructed using a steady-state background plasma, which satisfies quasi-neutrality and true steady state, is presented for the first time. Studies are performed based on an optimized fully non-inductive scenario with varying concentrations of argon (Ar) seeding. It is found that fusion performance improves before dropping off with increasing $Z_{\mathrm{eff}}$, while the confinement remains at a high level. Further analysis of transport for these plasmas shows that low-k ion temperature gradient modes dominate the turbulence. The decrease in linear growth rate and in the resultant fluxes of all channels with increasing $Z_{\mathrm{eff}}$ can be traced to the change of the impurity profile by transport. The improvement in confinement levels off at higher $Z_{\mathrm{eff}}$. Over the regime of study there is a competition between the suppressed transport and the increasing radiation that leads to a peak in the fusion performance at $Z_{\mathrm{eff}} \approx 2.78$ for CFETR. Extrinsic impurity seeding to control the divertor heat load will need to be optimized around this value for best fusion performance.
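For reference, the seeded-impurity scan is expressed in terms of the standard effective charge; with ion species densities $n_i$ and charges $Z_i$, quasi-neutrality and $Z_{\mathrm{eff}}$ read (standard tokamak definitions, not specific to this paper):

$$ n_e = \sum_i n_i Z_i, \qquad Z_{\mathrm{eff}} = \frac{\sum_i n_i Z_i^2}{\sum_i n_i Z_i}. $$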
NASA Astrophysics Data System (ADS)
Sądowski, Aleksander; Wielgus, Maciek; Narayan, Ramesh; Abarca, David; McKinney, Jonathan C.; Chael, Andrew
2017-04-01
We present a numerical method that evolves a two-temperature, magnetized, radiative accretion flow around a black hole, within the framework of general relativistic radiation magnetohydrodynamics. As implemented in the code KORAL, the gas consists of two sub-components - ions and electrons - which share the same dynamics but experience independent, relativistically consistent, thermodynamical evolution. The electrons and ions are heated independently according to a prescription from the literature for magnetohydrodynamical turbulent dissipation. Energy exchange between the particle species via Coulomb collisions is included. In addition, electrons gain and lose energy and momentum by absorbing and emitting synchrotron and bremsstrahlung radiation and through Compton scattering. All evolution equations are handled within a fully covariant framework in the relativistic fixed-metric space-time of the black hole. Numerical results are presented for five models of low-luminosity black hole accretion. In the case of a model with a mass accretion rate $\dot{M} \sim 4 \times 10^{-8}\,\dot{M}_{\mathrm{Edd}}$, we find that radiation has a negligible effect on either the dynamics or the thermodynamics of the accreting gas. In contrast, a model with a larger $\dot{M} \sim 4 \times 10^{-4}\,\dot{M}_{\mathrm{Edd}}$ behaves very differently. The accreting gas is much cooler and the flow is geometrically less thick, though it is not quite a thin accretion disc.
A framework for the selection and ensemble development of flood vulnerability models
NASA Astrophysics Data System (ADS)
Figueiredo, Rui; Schröter, Kai; Kreibich, Heidi; Martina, Mario
2017-04-01
Effective understanding and management of flood risk requires comprehensive risk assessment studies that consider not only the hazard component, but also the impacts that the phenomena may have on the built environment, economy and society. This integrated approach has gained importance over recent decades, and with it so has the scientific attention given to flood vulnerability models describing the relationships between flood intensity metrics and damage to physical assets, also known as flood loss models. Despite considerable progress in this field, many challenges persist. Flood damage mechanisms are complex and depend on multiple variables, which can have different degrees of importance depending on the application setting. In addition, data required for the development and validation of such models tend to be scarce, particularly in data poor regions. These issues are reflected in the large amount of flood vulnerability models that are available in the literature today, as well as in their high heterogeneity: they are built with different modelling approaches, in different geographic contexts, utilizing different explanatory variables, and with varying levels of complexity. Notwithstanding recent developments in this area, uncertainty remains high, and large disparities exist among models. For these reasons, identifying which model or models, given their properties, are appropriate for a given context is not straightforward. In the present study, we propose a framework that guides the structured selection of flood vulnerability models and enables ranking them according to their suitability for a certain application, based on expert judgement. The approach takes advantage of current state of the art and most up-to-date knowledge on flood vulnerability processes. Given the heterogeneity and uncertainty currently present in flood vulnerability models, we propose the use of a model ensemble. With this in mind, the proposed approach is based on a weighting scheme within a logic-tree framework that enables the generation of such ensembles in a logically consistent manner. We test and discuss the results by applying the framework to the case study of the 2002 floods along the Mulde River in Germany. Applications of individual models and model ensembles are compared and discussed.
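A minimal sketch of a weighted ensemble of depth-damage models of the kind such a logic-tree scheme would produce; the three model functions and the weights are invented placeholders, not models or weights derived in the study.

```python
import numpy as np

# Each vulnerability model maps water depth (m) to a relative damage fraction in [0, 1].
models = {
    "model_A": lambda depth: np.clip(0.25 * depth, 0, 1),
    "model_B": lambda depth: 1 - np.exp(-0.4 * depth),
    "model_C": lambda depth: np.clip(0.1 + 0.15 * depth, 0, 1),
}

# Logic-tree weights from expert judgement of model suitability (must sum to 1).
weights = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}

depth = np.array([0.5, 1.0, 2.0, 3.0])
ensemble_damage = sum(weights[m] * f(depth) for m, f in models.items())
print(ensemble_damage)
```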
Quantifying the drivers of ocean-atmosphere CO2 fluxes
NASA Astrophysics Data System (ADS)
Lauderdale, Jonathan M.; Dutkiewicz, Stephanie; Williams, Richard G.; Follows, Michael J.
2016-07-01
A mechanistic framework for quantitatively mapping the regional drivers of air-sea CO2 fluxes at a global scale is developed. The framework evaluates the interplay between (1) surface heat and freshwater fluxes that influence the potential saturated carbon concentration, which depends on changes in sea surface temperature, salinity and alkalinity, (2) a residual, disequilibrium flux influenced by upwelling and entrainment of remineralized carbon- and nutrient-rich waters from the ocean interior, as well as rapid subduction of surface waters, (3) carbon uptake and export by biological activity as both soft tissue and carbonate, and (4) the effect on surface carbon concentrations due to freshwater precipitation or evaporation. In a steady state simulation of a coarse-resolution ocean circulation and biogeochemistry model, the sum of the individually determined components is close to the known total flux of the simulation. The leading order balance, identified in different dynamical regimes, is between the CO2 fluxes driven by surface heat fluxes and a combination of biologically driven carbon uptake and disequilibrium-driven carbon outgassing. The framework is still able to reconstruct simulated fluxes when evaluated using monthly averaged data and takes a form that can be applied consistently in models of different complexity and observations of the ocean. In this way, the framework may reveal differences in the balance of drivers acting across an ensemble of climate model simulations or be applied to an analysis and interpretation of the observed, real-world air-sea flux of CO2.
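Written schematically, the decomposition described above amounts to splitting the net flux into the four listed contributions (a schematic restatement in compact notation, not the paper's exact formalism):

$$ F_{\mathrm{CO_2}} \approx F_{\mathrm{sat}}(T,\,S,\,A_T) + F_{\mathrm{diseq}} + F_{\mathrm{bio}} + F_{\mathrm{fw}}, $$

where $F_{\mathrm{sat}}$ tracks changes in the potential saturated carbon concentration driven by surface heat and freshwater fluxes (through temperature, salinity and alkalinity), $F_{\mathrm{diseq}}$ is the residual disequilibrium flux associated with upwelling, entrainment and subduction, $F_{\mathrm{bio}}$ is the biologically driven uptake and export as soft tissue and carbonate, and $F_{\mathrm{fw}}$ is the dilution or concentration of surface carbon by precipitation and evaporation.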
Thinking as the control of imagination: a conceptual framework for goal-directed systems.
Pezzulo, Giovanni; Castelfranchi, Cristiano
2009-07-01
This paper offers a conceptual framework which (re)integrates goal-directed control, motivational processes, and executive functions, and suggests a developmental pathway from situated action to higher level cognition. We first illustrate a basic computational (control-theoretic) model of goal-directed action that makes use of internal modeling. We then show that by adding the problem of selection among multiple action alternatives motivation enters the scene, and that the basic mechanisms of executive functions such as inhibition, the monitoring of progresses, and working memory, are required for this system to work. Further, we elaborate on the idea that the off-line re-enactment of anticipatory mechanisms used for action control gives rise to (embodied) mental simulations, and propose that thinking consists essentially in controlling mental simulations rather than directly controlling behavior and perceptions. We conclude by sketching an evolutionary perspective of this process, proposing that anticipation leveraged cognition, and by highlighting specific predictions of our model.
Modeling electrokinetic flows by consistent implicit incompressible smoothed particle hydrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Wenxiao; Kim, Kyungjoo; Perego, Mauro
2017-04-01
We present an efficient implicit incompressible smoothed particle hydrodynamics (I2SPH) discretization of the Navier-Stokes, Poisson-Boltzmann, and advection-diffusion equations subject to Dirichlet or Robin boundary conditions. It is applied to model various two- and three-dimensional electrokinetic flows in simple or complex geometries. The I2SPH's accuracy and convergence are examined via comparison with analytical solutions, grid-based numerical solutions, or empirical models. The new method provides a framework to explore broader applications of SPH in microfluidics and complex fluids with charged objects, such as colloids and biomolecules, in arbitrary complex geometries.
User's manual for the Composite HTGR Analysis Program (CHAP-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, J.S.; Secker, P.A. Jr.; Vigil, J.C.
1977-03-01
CHAP-1 is the first release version of an HTGR overall plant simulation program with both steady-state and transient solution capabilities. It consists of a model-independent systems analysis program and a collection of linked modules, each representing one or more components of the HTGR plant. Detailed instructions on the operation of the code and detailed descriptions of the HTGR model are provided. Information is also provided to allow the user to easily incorporate additional component modules, to modify or replace existing modules, or to incorporate a completely new simulation model into the CHAP systems analysis framework.
NASA Astrophysics Data System (ADS)
Floría, L. M.; Baesens, C.; Gómez-Gardeñes, J.
In the preface to his monograph on the structure of Evolutionary Theory [1], the late professor Stephen Jay Gould attributes to the philosopher Immanuel Kant the following aphorism in the philosophy of science: "Percepts without concepts are blind; concepts without percepts are empty". Using these Kantian terms with a bit of freedom, one would say that a scientific model is a framework (or network) of interrelated concepts and percepts within which experts build up scientifically consistent explanations of a given set of observations. Good models are those which are both conceptually simple and universal in their perceptions. Let us illustrate with examples the meaning of this statement.
Melting of genomic DNA: Predictive modeling by nonlinear lattice dynamics
NASA Astrophysics Data System (ADS)
Theodorakopoulos, Nikos
2010-08-01
The melting behavior of long, heterogeneous DNA chains is examined within the framework of the nonlinear lattice dynamics based Peyrard-Bishop-Dauxois (PBD) model. Data for the pBR322 plasmid and the complete T7 phage have been used to obtain model fits and determine parameter dependence on salt content. Melting curves predicted for the complete fd phage and the Y1 and Y2 fragments of the ϕX174 phage without any adjustable parameters are in good agreement with experiment. The calculated probabilities for single base-pair opening are consistent with values obtained from imino proton exchange experiments.
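For completeness, the PBD model referred to above assigns to each base pair a stretching coordinate $y_n$ and combines a Morse on-site potential with an anharmonic stacking coupling; in its standard form (standard PBD expressions, with $D_n$, $a_n$ depending on whether the pair is AT or GC, and $k$, $\rho$, $\alpha$ describing stacking):

$$ V = \sum_n \left[ D_n\left(e^{-a_n y_n} - 1\right)^2 + \frac{k}{2}\left(1 + \rho\, e^{-\alpha (y_n + y_{n-1})}\right)\left(y_n - y_{n-1}\right)^2 \right]. $$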
Indoorgml - a Standard for Indoor Spatial Modeling
NASA Astrophysics Data System (ADS)
Li, Ki-Joune
2016-06-01
With recent progress in mobile devices and indoor positioning technologies, it has become possible to provide location-based services in indoor space as well as outdoor space, either seamlessly across indoor and outdoor spaces or independently for indoor space alone. However, we cannot simply apply spatial models developed for outdoor space to indoor space because of their differences. For example, coordinate reference systems are employed to indicate a specific position in outdoor space, while a location in indoor space is typically specified by a cell identifier such as a room number. Unlike outdoor space, the distance between two points in indoor space is determined not by the length of the straight line between them but by the constraints imposed by indoor components such as walls, stairs, and doors. For this reason, we need to establish a new framework for indoor space, from its fundamental theoretical basis and indoor spatial data models to information systems that store, manage, and analyse indoor spatial data. To provide this framework, an international standard, called IndoorGML, has been developed and published by the OGC (Open Geospatial Consortium). The standard is based on a cellular notion of space, which considers an indoor space as a set of non-overlapping cells. It consists of two types of modules: a core module and extension modules. While the core module consists of four basic conceptual and implementation modeling components (a geometric model for cells, topology between cells, a semantic model of cells, and a multi-layered space model), extension modules may be defined on top of the core module to support a particular application area. As the first version of the standard, we provide an extension for indoor navigation.
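A toy illustration of the cellular notion of space described above: cells (rooms, corridors, stairwells) become graph nodes, shared openings become edges, and indoor "distance" is a constrained path through cells rather than a straight line. The cell names are invented and the snippet is not IndoorGML syntax.

```python
from collections import deque

# Cells and the doors connecting them (a node-relation view of indoor space).
adjacency = {
    "room101": ["corridorA"],
    "room102": ["corridorA"],
    "corridorA": ["room101", "room102", "stairwell"],
    "stairwell": ["corridorA", "corridorB"],
    "corridorB": ["stairwell", "room201"],
    "room201": ["corridorB"],
}

def cell_path(start, goal):
    """Breadth-first search over the cell graph: path with the fewest cell transitions."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(cell_path("room101", "room201"))
```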
ERIC Educational Resources Information Center
Butler, Norman L.
2004-01-01
This article is the product of the writer's deliberations about the impact of Poland's 1990 Bill on Schools of Higher Education using an information technology theoretical model consisting of three parts: (1) participation; (2) feedback; and (3) partnership. The main findings of the investigation revealed that: (1) there is wide participation in…
Pharmacokinetic Modeling of JP-8 Jet Fuel Components: II. A Conceptual Framework
2003-12-01
example, a single type of (simple) binary interaction between 300 components would require the specification of some 10^5 interaction coefficients. One...individual substances, via binary mechanisms, is enough to predict the interactions present in the mixture. Secondly, complex mixtures can often be...approximated as pseudo-binary systems, consisting of the compound of interest plus a single interacting complex vehicle with well-defined, composite
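The combinatorial point can be made concrete with a one-line count: among N components there are N(N-1)/2 unordered pairs, so even a single coefficient per simple binary interaction grows quadratically (the per-pair coefficient count below is an assumption for illustration only).

```python
n_components = 300
pairs = n_components * (n_components - 1) // 2       # 44,850 unordered pairs
coeffs_per_pair = 2                                   # assumed number of coefficients per pair
print(pairs, pairs * coeffs_per_pair)                 # roughly 4.5e4 pairs, i.e. order 10^5 coefficients
```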
ERIC Educational Resources Information Center
Gorlewski, Julie A., Ed.; Porfilio, Brad J., Ed.; Gorlewski, David A., Ed.
2012-01-01
This book overturns the typical conception of standards, empowering educators by providing concrete examples of how top-down models of assessment can be embraced and used in ways that are consistent with critical pedagogies. Although standards, as broad frameworks for setting learning targets, are not necessarily problematic, when they are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrawal, Prateek; Chacko, Zackaria; Fortes, Elaine C. F. S.
We explore a novel flavor structure in the interactions of dark matter with the Standard Model. We consider theories in which both the dark matter candidate, and the particles that mediate its interactions with the Standard Model fields, carry flavor quantum numbers. The interactions are skewed in flavor space, so that a dark matter particle does not directly couple to the Standard Model matter fields of the same flavor, but only to the other two flavors. This framework respects minimal flavor violation and is, therefore, naturally consistent with flavor constraints. We study the phenomenology of a benchmark model in which dark matter couples to right-handed charged leptons. In large regions of parameter space, the dark matter can emerge as a thermal relic, while remaining consistent with the constraints from direct and indirect detection. The collider signatures of this scenario include events with multiple leptons and missing energy. In conclusion, these events exhibit a characteristic flavor pattern that may allow this class of models to be distinguished from other theories of dark matter.
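To make the "skewed in flavor space" statement concrete, one schematic possibility consistent with the abstract (an illustration only, not the paper's Lagrangian; $\chi_i$ denotes the flavored dark-matter states, $e_{Rj}$ the right-handed charged leptons and $\phi$ a mediator) is a Yukawa-type coupling whose matrix has vanishing diagonal entries:

$$ \mathcal{L} \supset \sum_{i \neq j} \lambda_{ij}\, \bar{\chi}_i\, e_{Rj}\, \phi + \mathrm{h.c.}, \qquad \lambda = \begin{pmatrix} 0 & \lambda_{e\mu} & \lambda_{e\tau} \\ \lambda_{\mu e} & 0 & \lambda_{\mu\tau} \\ \lambda_{\tau e} & \lambda_{\tau\mu} & 0 \end{pmatrix}, $$

so that each dark-matter flavor couples only to the two lepton flavors other than its own.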