Science.gov

Sample records for quark-parton model framework

  1. The description of inclusive characteristics in p̄p interactions at 22.4 GeV/c in terms of the quark-parton model

    NASA Astrophysics Data System (ADS)

    Batyunya, B. V.; Boguslavsky, I. V.; Gramenitsky, I. M.; Lednický, R.; Levonian, S. V.; Tikhonova, L. A.; Valkárová, A.; Vrba, V.; Zlatanov, Z.; Boos, E. G.; Samoilov, V. V.; Takibaev, Zh. S.; Temiraliev, T.; Lichard, P.; Mašejová, A.; Dumbrajs, S.; Ervanne, J.; Hannula, E.; Villanen, P.; Dementiev, R. K.; Korzhavina, I. A.; Leikin, E. M.; Rud, V. I.; Herynek, I.; Reimer, P.; Řídký, J.; Sedlák, J.; Šimák, V.; Suk, M.; Khudzadze, A. M.; Kuratashvili, G. O.; Topuriya, T. P.; Tzintzadze, V. D.

    1980-03-01

    We compare the inclusive characteristics of p̄p interactions at 22.4 GeV/c with quark-parton model predictions in terms of collective variables. The model agrees qualitatively with the data, in contrast to the simple cylindrical phase-space and randomized-charge models. Directions for further development of the quark-parton model are proposed.

  2. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    DOE PAGES

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; ...

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W² > 4 GeV² and range in four-momentum transfer squared 2 < Q² < 4 (GeV/c)², and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, P_t² < 0.2 (GeV/c)². The invariant mass that goes undetected, M_x or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and P_t² dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged-pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π⁺ and π⁻) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  3. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    SciTech Connect

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W² > 4 GeV² and range in four-momentum transfer squared 2 < Q² < 4 (GeV/c)², and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, P_t² < 0.2 (GeV/c)². The invariant mass that goes undetected, M_x or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and P_t² dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged-pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π⁺ and π⁻) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  4. Pion and kaon valence-quark parton distribution functions

    SciTech Connect

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-15

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio u_K(x)/u_π(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  5. Pion and kaon valence-quark parton distribution functions.

    SciTech Connect

    Nguyen, T.; Bashir, A.; Roberts, C. D.; Tandy, P. C.

    2011-06-16

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio u_K(x)/u_π(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  6. Strange quark parton distribution functions and implications for Drell-Yan boson production at the LHC

    NASA Astrophysics Data System (ADS)

    Kusina, A.; Stavreva, T.; Berge, S.; Olness, F. I.; Schienbein, I.; Kovařík, K.; Ježo, T.; Yu, J. Y.; Park, K.

    2012-05-01

    Global analyses of parton distribution functions (PDFs) have provided incisive constraints on the up and down quark components of the proton, but constraining the other flavor degrees of freedom is more challenging. Higher-order theory predictions and new data sets have contributed to recent improvements. Despite these efforts, the strange quark parton distribution function has a sizable uncertainty, particularly in the small x region. We examine the constraints from experiment and theory, and investigate the impact of this uncertainty on LHC observables. In particular, we study W/Z production to see how the s quark uncertainty propagates to these observables, and examine the extent to which precise measurements at the LHC can provide additional information on the proton flavor structure.

  7. Dicyanometallates as Model Extended Frameworks

    PubMed Central

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  8. Geologic Framework Model (GFM2000)

    SciTech Connect

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content.
The grid spacing used in the

  9. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiv...

  10. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  11. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  12. CMAQ Model Evaluation Framework

    EPA Pesticide Factsheets

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  13. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
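The driver-and-controller pattern described in the SeMe abstract above can be illustrated with a minimal sketch. All class and method names below are hypothetical (the actual SeMe API is not shown in these records); the sketch only shows the idea of generic input, model, and output drivers stepped through a discrete time domain by a batch controller, with evaluation based on prior results plus new data.

```python
# Sketch of a sequential model-evaluation loop: generic input, model, and
# output drivers exchanging values while a batch controller steps through
# a discrete time domain. Names are illustrative, not the SeMe API.

class InputDriver:
    """Supplies one new observation per step (here: a fixed series)."""
    def __init__(self, series):
        self.series = list(series)

    def read(self, step):
        return self.series[step]

class Model:
    """Evaluates new data against prior results (a running mean)."""
    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def step(self, value):
        self.count += 1
        self.mean += (value - self.mean) / self.count
        # Flag the value as anomalous if it deviates far from history.
        return abs(value - self.mean) > 10.0

class OutputDriver:
    """Collects per-step results."""
    def __init__(self):
        self.flags = []

    def write(self, step, flag):
        self.flags.append((step, flag))

def run_batch(inp, model, out, n_steps):
    """Batch controller: steps the model and I/O through the time domain."""
    for step in range(n_steps):
        value = inp.read(step)
        flag = model.step(value)
        out.write(step, flag)
    return out.flags

# A quality-monitoring style run: the final outlier is flagged.
flags = run_batch(InputDriver([1, 2, 1, 2, 50]), Model(), OutputDriver(), 5)
```

Because the controller only talks to the three driver interfaces, a real-time controller could reuse the same model and output drivers unchanged, which is the separation the abstract attributes to the framework.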

  14. Framework for Modeling the Cognitive Process

    DTIC Science & Technology

    2005-06-16

    Yaworsky Air Force Research Laboratory/IFSB Rome, NY Keywords: Cognitive Process Modeling, Cognition, Conceptual Framework, Information...center of our conceptual framework and will distinguish our use of terms within the context of this framework. 3. A Conceptual Framework for...Modeling the Cognitive Process We will describe our conceptual framework using graphical examples to help illustrate main points. We form the two

  15. Geologic Framework Model Analysis Model Report

    SciTech Connect

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  16. Deriving Framework Usages Based on Behavioral Models

    NASA Astrophysics Data System (ADS)

    Zenmyo, Teruyoshi; Kobayashi, Takashi; Saeki, Motoshi

    One of the critical issues in framework-based software development is the high introduction cost caused by the technical gap between developers and users of frameworks. This paper proposes a technique for deriving framework usages to implement a given requirements specification. By using the derived usages, users can apply the frameworks without understanding them in detail. Requirements specifications that describe definite behavioral requirements cannot be related to frameworks as-is, since the frameworks do not have a definite control structure; users must customize them to suit given requirements specifications. To cope with this issue, a technique based on satisfiability problems (SAT) is employed to derive the control structures of the framework model. In the proposed technique, requirements specifications and frameworks are modeled as Labeled Transition Systems (LTSs) with branch conditions represented by predicates. Truth assignments of the branch conditions in the framework models are not given initially, representing the customizable control structure. Deriving the truth assignments of the branch conditions is cast as a SAT problem by assuming relations between the termination states of the requirements specification model and those of the framework model. This derivation technique is incorporated into a technique we have proposed previously for relating actions of requirements specifications to those of frameworks. Furthermore, this paper discusses a case study of typical use cases in e-commerce systems.
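The core idea in the abstract above, leaving branch conditions of the framework model undetermined and then searching for truth assignments whose runs reach the termination state required by the specification, can be sketched in miniature. The toy model, state names, and brute-force enumeration below are invented for illustration; the paper encodes this as a SAT problem for a solver rather than enumerating assignments.

```python
# Toy illustration: a framework LTS whose branch conditions c1, c2 have no
# fixed truth values. We search for assignments whose execution terminates
# in the state the requirements specification demands.
from itertools import product

def run_framework(assign):
    """Execute the tiny framework LTS under a given truth assignment."""
    # From 'start', branch condition c1 selects the next state.
    state = "mid" if assign["c1"] else "fail"
    # From 'mid', branch condition c2 selects a termination state.
    if state == "mid":
        state = "done" if assign["c2"] else "abort"
    return state

def derive_assignments(required_termination):
    """Brute-force 'SAT': enumerate assignments whose runs terminate
    in the state required by the specification."""
    solutions = []
    for c1, c2 in product([False, True], repeat=2):
        assign = {"c1": c1, "c2": c2}
        if run_framework(assign) == required_termination:
            solutions.append(assign)
    return solutions

# The specification requires termination in 'done'.
sols = derive_assignments("done")
```

With predicates instead of plain booleans, and CNF clauses relating specification and framework termination states, the same search becomes a standard SAT instance, which is the reduction the abstract describes.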

  17. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) cannot model framework flexibility and extendability adequately, due to the lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  18. The Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2011-01-01

    The G-DINA ("generalized deterministic inputs, noisy and gate") model is a generalization of the DINA model with more relaxed assumptions. In its saturated form, the G-DINA model is equivalent to other general models for cognitive diagnosis based on alternative link functions. When appropriate constraints are applied, several commonly used…

  19. Knowledge Encapsulation Framework for Collaborative Social Modeling

    SciTech Connect

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  20. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  1. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. 
The modeling information can also be exported to semantic web languages such

  2. Modelling Diffusion of a Personalized Learning Framework

    ERIC Educational Resources Information Center

    Karmeshu; Raman, Raghu; Nedungadi, Prema

    2012-01-01

    A new modelling approach for diffusion of personalized learning as an educational process innovation in social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…

  3. Multiple Mentor Model: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Burlew, Larry D.

    1991-01-01

    Focuses on developing a conceptual framework for the mentoring process. The model is based on the premise that mentoring is not a single event in the life of a worker but rather several events with several different levels of mentoring. (Author)

  4. CAN A MODEL TRANSFERABILITY FRAMEWORK IMPROVE ...

    EPA Pesticide Factsheets

    Budget constraints and policies that limit primary data collection have fueled a practice of transferring estimates (or models to generate estimates) of ecological endpoints from sites where primary data exist to sites where little to no primary data were collected. Whereas benefit transfer has been well studied, there is no comparable framework for evaluating whether model transfer between sites is justifiable. We developed and applied a transferability assessment framework to a case study involving forest carbon sequestration for soils in Tillamook Bay, Oregon. The carbon sequestration capacity of forested watersheds is an important ecosystem service in the effort to reduce atmospheric greenhouse gas emissions. We used our framework, incorporating three basic steps (model selection, defining context variables, assessing logistical constraints) for evaluating model transferability, to compare estimates of carbon storage capacity derived from two models, COMET-Farm and Yasso. We applied each model to Tillamook Bay and compared results to data extracted from the Soil Survey Geographic Database (SSURGO) using ArcGIS. Context variables considered were: geographic proximity to Tillamook, dominant tree species, climate and soil type. Preliminary analyses showed that estimates from COMET-Farm were more similar to SSURGO data, likely because model context variables (e.g. proximity to Tillamook and dominant tree species) were identical to those in Tillamook. In contras

  5. An Extensible Model and Analysis Framework

    DTIC Science & Technology

    2010-11-01

    of a pre-existing, open-source modeling and analysis framework known as Ptolemy II (http://ptolemy.org). The University of California, Berkeley...worked with the Air Force Research Laboratory, Rome Research Site on adapting Ptolemy II for modeling and simulation of large scale dynamics of Political...capabilities were prototyped in Ptolemy II and delivered via version control and software releases. Each of these capabilities specifically supports one or

  6. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that are directly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches have so far lacked.

  7. An evaluation framework for participatory modelling

    NASA Astrophysics Data System (ADS)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience, our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models, and we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics).
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  8. Improvements in the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Ridley, A. J.; Liemohn, M.; Dezeeuw, D.; Ilie, R.; Sokolov, I.; Toth, G.; Yu, Y.

    2008-12-01

    The magnetosphere within the Space Weather Modeling Framework (SWMF) has been represented by a global magnetosphere model (BATSRUS), an inner magnetosphere model (the Rice Convection Model) and a model of the ionospheric electrodynamics. We present significant improvements in the SWMF: (1) We have implemented a spherical grid within BATSRUS and have utilized this for modeling the magnetosphere; (2) We have significantly improved the physics of the auroral oval within the ionospheric electrodynamics code, modeling a self-consistent diffuse and discrete auroral oval; (3) We utilize the multifluid MHD code within BATSRUS to allow for more accurate specification and differentiation of the density within the magnetosphere; and (4) we have incorporated the Hot Electron and Ion Drift Integrator (HEIDI) ring current code within the SWMF. We will present these improvements and show the quantitative differences within the model results when comparing to a suite of measurements for a number of different intervals.

  9. A framework for benchmarking land models

    SciTech Connect

    Luo, Yiqi; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, Philippe; Dalmonech, D.; Fisher, J.B.; Fisher, R.; Friedlingstein, P.; Hibbard, Kathleen A.; Hoffman, F. M.; Huntzinger, Deborah; Jones, C.; Koven, C.; Lawrence, David M.; Li, D.J.; Mahecha, M.; Niu, S.L.; Norby, Richard J.; Piao, S.L.; Qi, X.; Peylin, P.; Prentice, I.C.; Riley, William; Reichstein, M.; Schwalm, C.; Wang, Y.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-09

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedbacks to climate change. Benchmarking is an emerging procedure for measuring the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluating land model performance and highlights major challenges at this early stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references against which to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate the exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks that effectively evaluate land model performance. A second challenge is to develop metrics for measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues to weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate.
Near-future research should focus on developing a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate the fundamental properties of land models.
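    The scoring idea described above, combining data–model mismatches across processes and checking them against a priori acceptability thresholds, can be sketched in a few lines. This is a minimal illustration; the processes, weights, threshold and data series below are hypothetical, not taken from the paper.

```python
import math

def nrmse(simulated, observed):
    """Normalized root-mean-square error between model output and benchmark data."""
    n = len(observed)
    mse = sum((s - o) ** 2 for s, o in zip(simulated, observed)) / n
    mean_obs = sum(observed) / n
    return math.sqrt(mse) / abs(mean_obs)

def benchmark_score(results, weights, threshold=0.5):
    """Combine per-process mismatches into one weighted score and flag
    processes whose error exceeds an a priori acceptability threshold."""
    score = sum(weights[p] * nrmse(sim, obs) for p, (sim, obs) in results.items())
    failing = [p for p, (sim, obs) in results.items() if nrmse(sim, obs) > threshold]
    return score, failing

# Hypothetical benchmark data: carbon flux matches well, soil moisture does not.
results = {
    "carbon_flux": ([2.0, 2.1, 1.9], [2.0, 2.0, 2.0]),
    "soil_moisture": ([0.9, 0.2, 0.8], [0.3, 0.3, 0.3]),
}
score, failing = benchmark_score(results, {"carbon_flux": 0.5, "soil_moisture": 0.5})
```

    A real implementation would replace the normalized RMSE with whatever mismatch metric the benchmarking community agrees on, and would aggregate across temporal and spatial scales rather than over flat series.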

  10. A framework for benchmarking land models

    SciTech Connect

    Luo, Yiqi; Randerson, James T.; Hoffman, Forrest; Norby, Richard J

    2012-01-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedbacks to climate change. Benchmarking is an emerging procedure for measuring the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluating land model performance and highlights major challenges at this early stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references against which to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate the exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks that effectively evaluate land model performance. A second challenge is to develop metrics for measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues to weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate.
Near-future research should focus on developing a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate the fundamental properties of land models.

  11. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources, and it captures both the welfare state of the economy and the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
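    The core mechanism of the abstract, maximizing entropy subject to expected-value constraints with the Lagrange multiplier emerging as a price, can be sketched for a single constraint. The quantities and target value below are illustrative, and the bisection solver is one simple numerical choice, not the authors' method.

```python
import math

def maxent_distribution(quantities, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over discrete allocations subject to a
    single expected-value constraint. The solution has the Gibbs form
    p_i ∝ exp(-lam * q_i); the multiplier lam acts as an endogenous price."""
    def mean_for(lam):
        w = [math.exp(-lam * q) for q in quantities]
        z = sum(w)
        return sum(q * wi for q, wi in zip(quantities, w)) / z

    # Bisection on the multiplier: mean_for is monotonically decreasing in lam.
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = (lo + hi) / 2
    w = [math.exp(-lam * q) for q in quantities]
    z = sum(w)
    return [wi / z for wi in w], lam

# Four possible consumption levels; the constraint pins their expected value.
probs, price = maxent_distribution([1.0, 2.0, 3.0, 4.0], target_mean=2.0)
```

    Since the unconstrained (uniform) mean here would be 2.5, enforcing a mean of 2.0 yields a positive multiplier, i.e. a positive shadow price on the constrained quantity.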

  12. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
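    The component structure described above, a vehicle model bundled with its environment models behind a well-defined interface, can be sketched as follows. This is a toy illustration, not the LaSRS++ API; the class names, the initialize/step/outputs interface, and the point-mass physics are all hypothetical.

```python
import math
from abc import ABC, abstractmethod

class EnvironmentModel(ABC):
    """Interface every environment model exposes to the component."""
    @abstractmethod
    def density(self, altitude_m):
        ...

class ExponentialAtmosphere(EnvironmentModel):
    """Toy exponential atmosphere (sea-level density 1.225 kg/m^3,
    8.5 km scale height), standing in for a real atmosphere model."""
    def density(self, altitude_m):
        return 1.225 * math.exp(-altitude_m / 8500.0)

class VehicleComponent:
    """Rehostable modeling component: encapsulates the vehicle model and its
    environment models behind a small initialize/step/outputs interface, so
    a host application never reaches into model internals."""
    def __init__(self, environment, mass_kg=1000.0, drag_coeff=0.01):
        self.env = environment
        self.mass = mass_kg
        self.cd = drag_coeff
        self.altitude = 0.0
        self.velocity = 0.0

    def initialize(self, altitude_m, velocity_mps):
        self.altitude = altitude_m
        self.velocity = velocity_mps

    def step(self, dt, thrust_n=0.0):
        # Drag always opposes the direction of motion (vertical point mass).
        rho = self.env.density(self.altitude)
        drag = 0.5 * rho * self.velocity ** 2 * self.cd
        accel = (thrust_n - math.copysign(drag, self.velocity)) / self.mass - 9.81
        self.velocity += accel * dt
        self.altitude += self.velocity * dt

    def outputs(self):
        return {"altitude": self.altitude, "velocity": self.velocity}

# A host application (simulator, analysis tool, batch study) drives the
# component only through its interface, never through the models directly.
vehicle = VehicleComponent(ExponentialAtmosphere())
vehicle.initialize(altitude_m=1000.0, velocity_mps=0.0)
vehicle.step(dt=0.1)
```

    Because the environment model travels inside the component, every host application reproduces identical vehicle behavior, which is the deployment problem the paper addresses.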

  13. A framework for multi-scale modelling

    PubMed Central

    Chopard, B.; Borgdorff, Joris; Hoekstra, A. G.

    2014-01-01

    We review a methodology to design, implement and execute multi-scale and multi-science numerical simulations. We identify important ingredients of multi-scale modelling and give a precise definition of them. Our framework assumes that a multi-scale model can be formulated in terms of a collection of coupled single-scale submodels. With concepts such as the scale separation map, the generic submodel execution loop (SEL) and the coupling templates, one can define a multi-scale modelling language which is a bridge between the application design and the computer implementation. Our approach has been successfully applied to an increasing number of applications from different fields of science and technology. PMID:24982249

  14. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  15. A Smallholder Socio-hydrological Modelling Framework

    NASA Astrophysics Data System (ADS)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of total farmland and belong to the poorest quartile yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of six variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example, adjusting expenditures on food and fertilizers, or selling livestock) that smallholders employ when they face adverse socio-hydrological conditions, such as low annual rainfall, high intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve farmers' debt are enough, and what the value is of investing in local storage that can buffer intra-annual variability in rainfall and of strengthening safety nets, either by creating opportunities for alternative sources of income or by crop diversification.
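    The rule-based adaptation described above can be illustrated with a deliberately reduced annual update over two of the six coupled variables. All rules and numbers here are purely illustrative assumptions, not the model's actual equations or calibrated values.

```python
def simulate_smallholder(rainfall_series, capital=100.0, livestock=5,
                         subsistence_cost=40.0, price=1.0):
    """Rule-based annual update: yield scales with rainfall; when income
    cannot cover subsistence the household first draws down capital, then
    makes a distress sale of livestock (adaptation rules in the spirit of
    the abstract, with all coefficients purely illustrative)."""
    history = []
    for rain in rainfall_series:
        income = price * 60.0 * min(rain, 1.0)   # yield saturates at full rainfall
        capital += income - subsistence_cost
        if capital < 0 and livestock > 0:        # distress sale of one animal
            livestock -= 1
            capital += 20.0
        history.append((round(capital, 2), livestock))
    return history

# Two good years followed by a drought year.
trajectory = simulate_smallholder([1.0, 1.0, 0.2])
```

    Even this reduced form reproduces the qualitative trap: repeated dry years force asset sales, which lower future income and leave the household more exposed to the next shock.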

  16. The Aircraft Availability Model: Conceptual Framework and Mathematics

    DTIC Science & Technology

    1983-06-01

    The Aircraft Availability Model: Conceptual Framework and Mathematics. June 1983. T. J. O'Malley. Model documentation prepared pursuant to a Department of Defense contract.

  17. Improving the physics models in the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Toth, G.; Fang, F.; Frazin, R. A.; Gombosi, T. I.; Ilie, R.; Liemohn, M. W.; Manchester, W. B.; Meng, X.; Pawlowski, D. J.; Ridley, A. J.; Sokolov, I.; van der Holst, B.; Vichare, G.; Yigit, E.; Yu, Y.; Buzulukova, N.; Fok, M. H.; Glocer, A.; Jordanova, V. K.; Welling, D. T.; Zaharia, S. G.

    2010-12-01

    The success of physics-based space weather forecasting depends on several factors: we need a sufficient amount of timely, high-quality observational data; we have to understand the physics of the Sun-Earth system well enough; we need sophisticated computational models; and the models have to run faster than real time on the available computational resources. This presentation will focus on a single ingredient, the recent improvements of the mathematical and numerical models in the Space Weather Modeling Framework. We have developed a new physics-based CME initiation code that models flux emergence from the convection zone by solving the equations of radiative magnetohydrodynamics (MHD). Our new lower corona and solar corona models use electron heat conduction, Alfven wave heating, and boundary conditions based on solar tomography. We can obtain a physically consistent solar wind model from the surface of the Sun all the way to the L1 point without artificially changing the polytropic index. The global magnetosphere model can now solve the multi-ion MHD equations and take into account the oxygen outflow from the polar wind model. We have also added options for solving Hall MHD and anisotropic pressure. Several new inner magnetosphere models have been added to the framework: CRCM, HEIDI and RAM-SCB. These new models resolve the pitch angle distribution of the trapped particles. The upper atmosphere model GITM has been improved by including self-consistent equatorial electrodynamics and the effects of solar flares. This presentation will very briefly describe the developments and highlight some results obtained with the improved and new models.

  18. Critical Thinking: Frameworks and Models for Teaching

    ERIC Educational Resources Information Center

    Fahim, Mansoor; Eslamdoost, Samaneh

    2014-01-01

    Since the educational revolution, interest in developing critical thinking (CT henceforth) has given rise to flourishing movements toward embedding CT-stimulating classroom activities in educational settings. Nevertheless, the process has faced complications such as questions of teachability and a lack of practical frameworks concerning the actualization of…

  19. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  20. A framework for modeling uncertainty in regional climate change

    EPA Science Inventory

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  1. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  2. Mid-Career Counseling--A Model Framework.

    ERIC Educational Resources Information Center

    College Placement Council, Bethlehem, PA.

    This model framework consists of client-centered strategies that can help the mid-career changer explore options. The framework presented in this document integrates theoretical and practical applications and can be adapted for use by a variety of campuses to meet the needs of the campus' adult population through individual or group counseling.…

  3. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    NASA Astrophysics Data System (ADS)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions about some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and to evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines, increased spatial resolution, and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it integrates with a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.

  4. A clothing modeling framework for uniform and armor system design

    NASA Astrophysics Data System (ADS)

    Man, Xiaolin; Swan, Colby C.; Rahmatalla, Salam

    2006-05-01

    In the analysis and design of military uniforms and body armor systems it is helpful to quantify the effects of the clothing/armor system on a wearer's physical performance capabilities. Toward this end, a clothing modeling framework for quantifying the mechanical interactions between a given uniform or body armor system design and a specific wearer performing defined physical tasks is proposed. The modeling framework consists of three interacting modules: (1) a macroscale fabric mechanics/dynamics model; (2) a collision detection and contact correction module; and (3) a human motion module. In the proposed framework, the macroscopic fabric model is based on a rigorous large-deformation continuum-degenerated shell theory representation. The collision and contact module enforces non-penetration constraints between the fabric and the human body and computes the associated contact forces between the two. The human body is represented in the current framework as an assemblage of overlapping ellipsoids that undergo rigid body motions consistent with human motions while performing actions such as walking, running, or jumping. The transient rigid body motions of each ellipsoidal body segment in time are determined using motion capture technology. The integrated modeling framework is then exercised to quantify the resistance that the clothing exerts on the wearer during the specific activities under consideration. Current results from the framework are presented and its intended applications are discussed along with some of the key challenges remaining in clothing system modeling.

  5. Landscape development modeling based on statistical framework

    NASA Astrophysics Data System (ADS)

    Pohjola, Jari; Turunen, Jari; Lipping, Tarmo; Ikonen, Ari T. K.

    2014-01-01

    Future biosphere modeling has an essential role in assessing the safety of a proposed nuclear fuel repository. In Finland the basic inputs needed for future biosphere modeling are the digital elevation model and the land uplift model, because the surface of the ground is still rising in post-glacial rebound from the loading stress of the last ice age. The future site-scale land uplift is extrapolated by fitting mathematical expressions to known data on past shoreline positions. In this paper, the parameters of this fitting have been refined based on information about lake and mire basin isolation and archaeological findings. Also, an alternative eustatic model is used in parameter refinement. Both datasets involve uncertainties, so Monte Carlo simulation is used to acquire several realizations of the model parameters. The two statistical models, the digital elevation model and the refined land uplift model, were used as inputs to a GIS-based toolbox where the characteristics of lake projections for the future Olkiluoto nuclear fuel repository site were estimated. The focus of the study was on surface water bodies since they are the major transport channels for radionuclides in containment failure scenarios. The results of the study show that the different land uplift modeling schemes relying on alternative eustatic models, Moho map versions and function fitting techniques yield largely similar landscape development tracks. However, the results also point out some more improbable realizations, which deviate significantly from the main development tracks.
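    The Monte Carlo step described above, drawing many realizations of the fitted uplift-model parameters to bound landscape development tracks, can be sketched as follows. The exponential uplift form and the nominal values and uncertainties are illustrative placeholders, not the fitted Olkiluoto parameters.

```python
import math
import random

def uplift(t_years, a, b):
    """Hypothetical land-uplift curve (metres) t_years into the future, of
    the saturating-exponential form often fitted to shoreline-position data."""
    return a * (1.0 - math.exp(-b * t_years))

def monte_carlo_uplift(n=1000, seed=42):
    """Draw parameter realizations around nominal values (illustrative
    uncertainties) and collect the spread of predicted uplift at 10,000 years."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        a = rng.gauss(100.0, 10.0)   # asymptotic total uplift, m
        b = rng.gauss(1e-4, 1e-5)    # decay rate, 1/yr
        outcomes.append(uplift(10_000, a, b))
    return outcomes

outcomes = monte_carlo_uplift()
spread = max(outcomes) - min(outcomes)
```

    Feeding each realization through the elevation model then yields an ensemble of landscape development tracks, from which the dominant tracks and the improbable outliers mentioned in the abstract can be identified.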

  6. PGMC: a framework for probabilistic graphic model combination.

    PubMed

    Jiang, Chang An; Leong, Tze-Yun; Poh, Kim-Leng

    2005-01-01

    Decision making in biomedicine often involves incorporating new evidence into existing or working models that reflect the decision problems at hand. We propose a new framework that facilitates effective and incremental integration of multiple probabilistic graphical models. The proposed framework aims to minimize the time and effort required to customize and extend the original models by preserving the conditional independence relationships inherent in two types of probabilistic graphical models: Bayesian networks and influence diagrams. We present a four-step algorithm to systematically combine the qualitative and the quantitative parts of the different models; we also describe three heuristic methods for target variable generation to reduce the complexity of the integrated models. Preliminary results from a case study in heart disease diagnosis demonstrate the feasibility of and potential for applying the proposed framework in real applications.

  7. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  8. A Computational Framework for Realistic Retina Modeling.

    PubMed

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.

  9. GeoFramework: Coupling multiple models of mantle convection within a computational framework

    NASA Astrophysics Data System (ADS)

    Tan, E.; Choi, E.; Thoutireddy, P.; Gurnis, M.; Aivazis, M.

    2004-12-01

    Geological processes usually encompass a broad spectrum of length and time scales. Traditionally, a modeling code (solver) is developed for a problem of specific length and time scales, and the utility of the solver beyond its designated purpose is usually limited. As we have come to recognize that geological processes often result from the dynamic coupling of deformation across a wide range of time and spatial scales, more robust methods are needed. One means to address this need is the integration of complementary modeling codes, while reusing existing software as much as possible. The GeoFramework project addresses this by developing a suite of reusable and combinable tools for the Earth science community. GeoFramework is based on and extends Pyre, a Python-based modeling framework developed to link solid (Lagrangian) and fluid (Eulerian) solvers, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. Under the framework, each solver is aware of the presence of the other solvers and can interact with them by exchanging information across adjacent mesh boundaries. We will show an example of linking two instances of the CitcomS finite element solver within GeoFramework. A high-resolution regional mantle convection model is linked with a global mantle convection model. The global solver has a resolution of ˜180 km horizontally and 35-100 km (with mesh refinement) vertically. The fine mesh has a resolution of ˜40 km horizontally and vertically and is centered on the Hawaii hotspot. A vertical plume is used as an initial condition. Time-varying plate velocity models are imposed from 80 Ma onward, and we have investigated how the plume conduit is deflected by the global circulation patterns as a function of mantle viscosity, plume flux, and plate motion.

  10. Traffic modelling framework for electric vehicles

    NASA Astrophysics Data System (ADS)

    Schlote, Arieh; Crisostomi, Emanuele; Kirkland, Stephen; Shorten, Robert

    2012-07-01

    This article reviews and improves a recently proposed model of road network dynamics. The model is also adapted and generalised to represent the patterns of battery consumption of electric vehicles travelling in the road network. Simulations from the mobility simulator SUMO are given to support and to illustrate the efficacy of the proposed approach. Applications relevant in the field of electric vehicles, such as optimal routing and traffic load control, are provided to illustrate how the proposed model can be used to address typical problems arising in contemporary road network planning and electric vehicle mobility.

  11. Aero-thermal modeling framework for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos

    2011-09-01

    The Performance Error Budget of the Thirty Meter Telescope (TMT) suggests that nearly one third of the total image degradation is due to aero-thermal disturbances (mirror and dome seeing, dynamic wind loading and thermal deformations of the optics, telescope structure and enclosure). An update of the current status of aero-thermal modeling and Computational Fluid-Solid Dynamics (CFSD) simulations for TMT is presented. A fast three-dimensional transient conduction-convection-radiation bulk-air-volume model has also been developed for the enclosure and selected telescope components in order to track the temperature variations of the surfaces, structure and interstitial air over a period of three years using measured environmental conditions. It is used for Observatory Heat Budget analysis and also provides estimates of thermal boundary conditions required by the CFD/FEA models and guidance to the design. Detailed transient CFSD conjugate heat transfer simulations of the mirror support assemblies determine the direction of heat flow from important heat sources and provide guidance to the design. Finally, improved CFD modeling is used to calculate wind forces and temperature fields. Wind loading simulations are demonstrated through TMT aperture deflector forcing. Temperature fields are transformed into refractive index ones and the resulting Optical Path Differences (OPDs) are fed into an updated thermal seeing model to estimate seeing performance metrics. Keck II simulations are the demonstrator for the latter type of modeling.

  12. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts from a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  13. Evolutionary Framework for Lepidoptera Model Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    “Model systems” are specific organisms upon which detailed studies have been conducted examining a fundamental biological question. If the studies are robust, their results can be extrapolated among an array of organisms that possess features in common with the subject organism. The true power of...

  14. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.

  15. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.

  16. Theoretical Tinnitus Framework: A Neurofunctional Model

    PubMed Central

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C. B.; Sani, Siamak S.; Ekhtiari, Hamed; Sanchez, Tanit G.

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes the presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the midbrain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the “sourceless” sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  17. Theoretical Tinnitus Framework: A Neurofunctional Model.

    PubMed

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes the presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the midbrain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  18. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  19. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  20. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    SciTech Connect

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework aimed at addressing least-cost regional and global carbon-reduction strategies and at improving on the capabilities and limitations of existing models, by allowing trading across regions and countries as an alternative.

  1. A Philosophical Framework for Integrating Systems Pharmacology Models Into Pharmacometrics

    PubMed Central

    2016-01-01

    The framework for systems pharmacology style models does not naturally sit with the usual scientific dogma of parsimony and falsifiability based on deductive reasoning. This does not invalidate the importance or need for overarching models based on pharmacology to describe and understand complicated biological systems. However, it does require some consideration on how systems pharmacology fits into the overall scientific approach. PMID:27863137

  2. A Model Framework for Course Materials Construction. Third Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model framework for course materials construction is presented as an aid to Coast Guard course writers and coordinators, curriculum developers, and instructors who must modify a course or draft a new one. The model assumes that the instructor or other designated person has: (1) completed a task analysis which identifies the competencies, skills…

  3. Characteristics and Conceptual Framework of the Easy-Play Model

    ERIC Educational Resources Information Center

    Lu, Chunlei; Steele, Kyle

    2014-01-01

    The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…

  4. A National Modeling Framework for Water Management Decisions

    NASA Astrophysics Data System (ADS)

    Bales, J. D.; Cline, D. W.; Pietrowsky, R.

    2013-12-01

    The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU), "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management," to collaborate in activities supportive of their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporating new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out their respective missions. The systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned and would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on science advances; modern technology for coupling model components and visualizing results; and a coupled atmospheric-hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for the organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of the North American Water Program, to contribute to the goals of the Program, and to be informed by the science and approaches developed as a part of the Program.

  5. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light-quark mesons, in particular the pseudoscalar and vector ground states, their decays, and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective, in which the quark condensate is contained within hadrons and not the vacuum, is mentioned. The valence-quark parton distributions in the pion and kaon, as measured in the Drell-Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  6. A software engineering perspective on environmental modeling framework design: The object modeling system

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  7. 3-D HYDRODYNAMIC MODELING IN A GEOSPATIAL FRAMEWORK

    SciTech Connect

    Bollinger, J; Alfred Garrett, A; Larry Koffman, L; David Hayes, D

    2006-08-24

    3-D hydrodynamic models are used by the Savannah River National Laboratory (SRNL) to simulate the transport of thermal and radionuclide discharges in coastal estuary systems. Development of such models requires accurate bathymetry, coastline, and boundary condition data in conjunction with the ability to rapidly discretize model domains and interpolate the required geospatial data onto the domain. To facilitate rapid and accurate hydrodynamic model development, SRNL has developed a pre- and post-processor application in a geospatial framework to automate the creation of models using existing data. This automated capability allows development of very detailed models to maximize exploitation of available surface water radionuclide sample data and thermal imagery.

  8. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the analytical scheme introduced here to give a broader impression of its potential in practical use.

  9. A Liver-Centric Multiscale Modeling Framework for Xenobiotics

    PubMed Central

    Swat, Maciej; Cosmanescu, Alin; Clendenon, Sherry G.; Wambaugh, John F.; Glazier, James A.

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics. PMID:27636091

  10. A Liver-Centric Multiscale Modeling Framework for Xenobiotics.

    PubMed

    Sluka, James P; Fu, Xiao; Swat, Maciej; Belmonte, Julio M; Cosmanescu, Alin; Clendenon, Sherry G; Wambaugh, John F; Glazier, James A

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics.
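    As a toy illustration of the whole-body pharmacokinetic layer described in this record, the following one-compartment sketch uses first-order absorption and elimination integrated with explicit Euler. All parameter values and names are hypothetical; this is not the paper's calibrated PBPK model.

```python
# Minimal one-compartment pharmacokinetic sketch (illustrative only; not the
# paper's PBPK model). Drug moves gut -> plasma by first-order absorption
# and leaves plasma by first-order elimination.

def simulate_pk(dose_mg, ka, ke, vd_l, t_end_h, dt=0.01):
    """Return (times, concentrations) in mg/L for a single oral dose."""
    gut, plasma = dose_mg, 0.0
    times, conc = [], []
    t = 0.0
    while t <= t_end_h:
        times.append(t)
        conc.append(plasma / vd_l)      # plasma concentration, mg/L
        absorbed = ka * gut * dt        # gut -> plasma this step
        eliminated = ke * plasma * dt   # plasma -> cleared this step
        gut -= absorbed
        plasma += absorbed - eliminated
        t += dt
    return times, conc

# Hypothetical values: 1000 mg dose, ka = 1.5/h, ke = 0.3/h, Vd = 40 L
times, conc = simulate_pk(1000, 1.5, 0.3, 40.0, 12.0)
peak = max(conc)
```

    The peak concentration stays below dose/Vd (25 mg/L here) because elimination begins before absorption completes; a real PBPK model would replace this single compartment with coupled organ compartments.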

  11. New framework for standardized notation in wastewater treatment modelling.

    PubMed

    Corominas, L L; Rieger, L; Takács, I; Ekama, G; Hauduc, H; Vanrolleghem, P A; Oehmen, A; Gernaey, K V; van Loosdrecht, M C M; Comeau, Y

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for the documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. Only those levels that make the name unique within the model context are needed in creating the symbol. The paper describes specific problems encountered with the currently used notation, presents the proposed framework and provides additional practical examples. The overall result is a framework that can be used in whole-plant modelling, which spans different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus-building paper is to establish a consistent set of rules that can be applied to existing and, most importantly, future models. Applying the proposed notation should make it easier for everyone active in the wastewater treatment field to read, write and review documents describing modelling projects.
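    The naming scheme (one main letter plus only as many subscript levels as are needed for uniqueness) can be sketched as follows. The main letters and subscript levels below are hypothetical illustrations, not the paper's official nomenclature.

```python
# Illustrative sketch of hierarchical state-variable naming: a main letter
# plus ordered subscript levels, added only until the symbol is unique
# within the model context. (Letters and levels here are hypothetical.)

def make_symbol(main, *levels):
    """Build a symbol: main letter, then subscript levels joined by commas,
    e.g. make_symbol('S', 'B') -> 'S_B'."""
    if not levels:
        return main
    return main + "_" + ",".join(levels)

# 'S' for soluble material; deeper subscripts only when needed to
# disambiguate from other soluble state variables in the same model.
s_soluble = make_symbol("S")                  # 'S'
s_biodeg = make_symbol("S", "B")              # 'S_B'
s_biodeg_org = make_symbol("S", "B", "Org")   # 'S_B,Org'
```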

  12. A computational framework for a database of terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

    Most terrestrial biosphere models consist of a set of coupled first-order ordinary differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as the number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models, using the process of soil organic matter decomposition as an example. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models are fed into a database consisting of simple text files with a common structure. They are then read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, and internal and external fluxes. SymPy, a Python library for symbolic mathematics, also helps calculate the Jacobian matrix at possibly given steady states and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots if appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
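    The symbolic part of such a workflow can be sketched with SymPy. The two-pool soil-carbon model below is a hypothetical illustration (pool names and rate constants are not drawn from the database itself): the steady state is solved symbolically and the Jacobian's eigenvalues recover the pools' turnover rates.

```python
import sympy as sp

# Hypothetical two-pool soil-carbon model: fast pool C1, slow pool C2.
C1, C2 = sp.symbols('C1 C2', positive=True)
k1, k2, a21, I = sp.symbols('k1 k2 a21 I', positive=True)

# dC/dt: input I enters the fast pool; a fraction a21 of the fast-pool
# outflow is transferred to the slow pool, the rest is respired.
f1 = I - k1 * C1
f2 = a21 * k1 * C1 - k2 * C2

# Jacobian of the system with respect to the state variables
J = sp.Matrix([f1, f2]).jacobian([C1, C2])

# Symbolic steady state and eigenvalues of the Jacobian
steady = sp.solve([f1, f2], [C1, C2], dict=True)[0]
eigs = list(J.eigenvals().keys())   # -k1 and -k2: the turnover rates
```

    Both eigenvalues are negative for positive rate constants, so the steady state is stable, which is the kind of automated stability check the framework performs across the database.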

  13. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    USGS Publications Warehouse

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.

  14. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    ERIC Educational Resources Information Center

    Redish, Edward F.

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the…

  15. The BMW Model: A New Framework for Teaching Monetary Economics

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  16. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    EPA Science Inventory

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study foc...

  17. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 8.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Curriculum Frameworks, this sample curriculum model for grade eight language arts is divided into sections focusing on writing; reading; and listening, speaking, and viewing. The writing section's stated goals are to help students employ a wide range of strategies as they write; use different…

  18. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 5.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Curriculum Frameworks, this sample curriculum model for grade five language arts is divided into sections focusing on writing; reading; and listening, speaking, and viewing. The writing section's stated goals are to help students employ a wide range of strategies as they write; use different writing…

  19. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 7.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Curriculum Frameworks, this sample curriculum model for grade seven language arts is divided into sections focusing on writing; reading; and listening, speaking, and viewing. The writing section's stated goals are to help students employ a wide range of strategies as they write; use different…

  20. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 6.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas English Language Arts Curriculum Frameworks, this sample curriculum model for grade six language arts is divided into sections focusing on writing; reading; and listening, speaking, and viewing. The writing section's stated goals are to help students employ a wide range of strategies as they write; use different writing…

  1. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by

  2. A numerical framework for modelling floating wind turbines

    NASA Astrophysics Data System (ADS)

    Vire, Axelle; Xiang, Jiansheng; Piggott, Matthew; Latham, John-Paul; Pain, Christopher

    2012-11-01

This work couples a fluid/ocean-dynamics and a solid-dynamics model in order to numerically study fluid-structure interactions. The fully non-linear Navier-Stokes and solid-dynamics equations are solved on two distinct finite-element and unstructured grids. The interplay between fluid and solid is represented through a penalty force in the momentum balances of each material. The present algorithm is novel in that it spatially conserves the discrete penalty force, when exchanging it between both models, independently of the mesh resolution and of the shape-function orders in each model. This numerical framework targets the modelling of offshore floating wind turbines. Results will be shown for the flow past a moving pile and an actuator-disk representation of a turbine. This research is supported by the European Union Seventh Framework Programme (grant agreement PIEF-GA-2010-272437).
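The penalty coupling described above can be illustrated with a toy sketch. Assuming (our simplification, not the authors' implementation) two collocated 1-D grids and a force proportional to the local velocity mismatch, the conservation property amounts to the exchanged forces summing to zero:

```python
# Sketch of the penalty-force coupling idea: fluid and solid are advanced on
# separate grids, and their interaction is a penalty force proportional to the
# local velocity mismatch. The conservation property claimed in the abstract
# is enforced here by applying equal-and-opposite forces. The 1-D setup and
# all names are illustrative assumptions, not the authors' code.

def penalty_forces(u_fluid, u_solid, k=100.0):
    """Per-node penalty force on the fluid; the solid receives the negative."""
    f_on_fluid = [k * (us - uf) for uf, us in zip(u_fluid, u_solid)]
    f_on_solid = [-f for f in f_on_fluid]
    return f_on_fluid, f_on_solid

u_fluid = [0.0, 0.1, 0.2]
u_solid = [0.5, 0.5, 0.5]
f_f, f_s = penalty_forces(u_fluid, u_solid)

# Conservation check: the exchanged momentum sums to zero.
print(sum(f_f) + sum(f_s))   # 0.0
```

In the actual framework this exchange happens between non-matching meshes, which is where conserving the discrete force becomes non-trivial.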

  3. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  4. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    NASA Astrophysics Data System (ADS)

    Cao, G.

    2015-12-01

All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
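The neighbourhood structure of a GMRF can be made concrete with a small sketch. The construction below is our illustration (the abstract's models are fitted with INLA, not assembled by hand like this): a conditional autoregressive precision matrix Q = tau (D - rho W) for a 1-D chain of sites.

```python
import numpy as np

# Minimal sketch of a GMRF / conditional autoregressive prior on a 1-D chain
# of n sites. The precision matrix Q = tau * (D - rho * W) encodes the Markov
# property: each site interacts only with its neighbours, so Q is sparse,
# while the implied covariance (Q^-1) is dense.
def car_precision(n, rho=0.9, tau=1.0):
    W = np.zeros((n, n))
    for i in range(n - 1):          # adjacency of the chain graph
        W[i, i + 1] = W[i + 1, i] = 1.0
    D = np.diag(W.sum(axis=1))      # number of neighbours per site
    return tau * (D - rho * W)

Q = car_precision(5)
# Sparsity: non-neighbouring sites 0 and 2 have a zero precision entry.
print(Q[0, 2])                      # 0.0
# Yet dependence propagates through the chain: the covariance is dense.
cov = np.linalg.inv(Q)
print(cov[0, 4] > 0)                # True
```

This sparse-precision/dense-covariance contrast is exactly why GMRFs are cheaper to compute with than the equivalent Gaussian fields.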

  5. A Practical Ontology Framework for Static Model Analysis

    DTIC Science & Technology

    2011-04-26

throughout the model. We implement our analysis framework on top of Ptolemy II [3], an extensible open source model-based design tool written in Java... While Ptolemy II makes a good testbed for implementing and experimenting with new analyses, we also feel that the techniques we present here are... broadly useful. For this reason, we aim to make our analysis framework orthogonal to the execution semantics of Ptolemy II, allowing it to be

  6. Possibilities: A framework for modeling students' deductive reasoning in physics

    NASA Astrophysics Data System (ADS)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  7. An enhanced BSIM modeling framework for self-heating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.
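The thermal-node idea can be sketched outside Verilog-A. Assuming a simple Foster RC chain with invented stage values (not foundry data), the temperature rise seen by the wrapper's thermal node is:

```python
import math

# Hedged sketch of the thermal-equivalent RC network: the Verilog-A wrapper
# exposes a thermal node whose temperature rise is driven by the dissipated
# power P through a Foster chain of (Rth, Cth) stages. Stage values below are
# invented for illustration, not extracted foundry data.
def temp_rise(power_w, stages, t):
    """Temperature rise [K] at time t [s] for a Foster RC chain."""
    return sum(r * power_w * (1.0 - math.exp(-t / (r * c))) for r, c in stages)

stages = [(30.0, 1e-3), (10.0, 1e-2)]   # (Rth [K/W], Cth [J/K]) per stage
p_avg = 1.0                             # average dissipated power [W]

# Steady state: dT = P * (R1 + R2) = 40 K, comparable in magnitude to the
# ~40 C average rise reported for the PA example in the abstract.
print(temp_rise(p_avg, stages, t=100.0))   # 40.0
```

Coupling this to the electrical model closes the loop: the device temperature feeds back into the BSIM4 parameters at each simulation step.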

  8. A modeling framework for system restoration from cascading failures.

    PubMed

    Liu, Chaoran; Li, Daqing; Zio, Enrico; Kang, Rui

    2014-01-01

System restoration from cascading failures is an integral part of the overall defense against catastrophic breakdown in networked critical infrastructures. From the outbreak of cascading failures to the system's complete breakdown, actions can be taken to prevent failure propagation through the entire network. While most analysis efforts have been carried out before or after cascading failures, restoration during cascading failures has rarely been studied. In this paper, we present a modeling framework to investigate the effects of in-process restoration, which depends strongly on the timing and strength of the restoration actions. Furthermore, in the model we also consider additional disturbances to the system due to the restoration actions themselves. We demonstrate that the effect of restoration is also influenced by the combination of system loading level and restoration disturbance. Our modeling framework will help to provide insights into practical restoration from cascading failures and guide improvements of the reliability and resilience of actual network systems.
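A toy load-redistribution model illustrates the trade-off the abstract describes. Everything here (loads, capacities, the global redistribution rule, and the restoration action) is our own illustrative construction, not the paper's model:

```python
# Toy sketch of in-process restoration during a cascade. Restoration revives
# failed nodes and reinforces capacity, at the cost of a load disturbance.
def cascade(load, cap, restore_at=None, restore_strength=0.0,
            disturbance=0.0, steps=10):
    load, cap = load[:], cap[:]
    failed = set()
    for t in range(steps):
        if t == restore_at:
            failed.clear()                             # revive failed nodes
            cap = [c + restore_strength for c in cap]  # reinforce the system
            load = [l + disturbance for l in load]     # restoration disturbance
        newly = [i for i in range(len(load))
                 if i not in failed and load[i] > cap[i]]
        if not newly:
            continue
        failed.update(newly)
        shed = sum(load[i] for i in newly)
        alive = [i for i in range(len(load)) if i not in failed]
        for i in alive:                                # shed load spreads out
            load[i] += shed / len(alive)
    return len(load) - len(failed)                     # surviving nodes

cap = [1.0] * 6
load = [1.2, 0.8, 0.8, 0.8, 0.8, 0.8]   # one overloaded node starts the cascade

print(cascade(load, cap))                                                        # 0
print(cascade(load, cap, restore_at=1, restore_strength=0.6, disturbance=0.05))  # 6
print(cascade(load, cap, restore_at=1, restore_strength=0.6, disturbance=0.6))   # 0
```

Timely restoration with a small disturbance halts the cascade; the same action with a large disturbance makes things worse, mirroring the loading-level/disturbance interplay the abstract highlights.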

  10. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models capable of all these tasks already exists: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation and exchange data at runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
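As a concrete example of the kind of building block such a toolbox provides, here is a textbook ensemble Kalman filter analysis step for a scalar state. This is a generic sketch, not OpenDA's actual API, and the water-level setting is invented:

```python
import numpy as np

# Generic ensemble Kalman filter analysis step for a scalar, directly
# observed state: each ensemble member is nudged toward a perturbed
# observation by the Kalman gain.
rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_err):
    """Return the analysis ensemble given an observation and its error."""
    var_f = np.var(ensemble, ddof=1)
    gain = var_f / (var_f + obs_err ** 2)    # Kalman gain for direct observation
    perturbed = obs + rng.normal(0.0, obs_err, size=ensemble.size)
    return ensemble + gain * (perturbed - ensemble)

forecast = rng.normal(2.0, 0.5, size=50)   # biased model forecast of water level [m]
analysis = enkf_update(forecast, obs=1.0, obs_err=0.1)

# The analysis mean is pulled from the forecast toward the observation,
# and the ensemble spread shrinks.
print(abs(np.mean(analysis) - 1.0) < abs(np.mean(forecast) - 1.0))   # True
```

In an OpenDA/OpenMI setting, the "propagate the model" and "get/set variables" calls wrap the forecast step and the state exchange around exactly this kind of update.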

  11. A new framework for an electrophotographic printer model

    NASA Astrophysics Data System (ADS)

    Colon-Lopez, Fermin A.

    Digital halftoning is a printing technology that creates the illusion of continuous tone images for printing devices such as electrophotographic printers that can only produce a limited number of tone levels. Digital halftoning works because the human visual system has limited spatial resolution which blurs the printed dots of the halftone image, creating the gray sensation of a continuous tone image. Because the printing process is imperfect it introduces distortions to the halftone image. The quality of the printed image depends, among other factors, on the complex interactions between the halftone image, the printer characteristics, the colorant, and the printing substrate. Printer models are used to assist in the development of new types of halftone algorithms that are designed to withstand the effects of printer distortions. For example, model-based halftone algorithms optimize the halftone image through an iterative process that integrates a printer model within the algorithm. The two main goals of a printer model are to provide accurate estimates of the tone and of the spatial characteristics of the printed halftone pattern. Various classes of printer models, from simple tone calibrations to complex mechanistic models, have been reported in the literature. Existing models have one or more of the following limiting factors: they only predict tone reproduction, they depend on the halftone pattern, they require complex calibrations or complex calculations, they are printer specific, they reproduce unrealistic dot structures, and they are unable to adapt responses to new data. The two research objectives of this dissertation are (1) to introduce a new framework for printer modeling and (2) to demonstrate the feasibility of such a framework in building an electrophotographic printer model. The proposed framework introduces the concept of modeling a printer as a texture transformation machine. 
The basic premise is that modeling the texture differences between the

  12. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. A discussion of the advantages and disadvantages of this methodology with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented, along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  13. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a uniform data representation model across the different online social geographic data sources. We propose a mixed strategy that combines spatial-distance similarity and feature-name similarity as the matching measure to compare geographic features across VGI data sets. Our work focuses on applying Markov logic networks to interlink representations of the same entity across different VGI-based linked data sets; the automatic generation of a co-reference object-identification model from geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. Experiments built on the framework and an evaluation of the method show that the approach is reasonable and practicable.
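The mixed matching strategy can be sketched as a weighted score. The particular similarity formulas and the weight below are our assumptions for illustration, not the authors' calibrated measures:

```python
import math

# Illustrative weighted combination of spatial-distance similarity and
# feature-name similarity for matching features across two VGI data sets.
def spatial_sim(p, q, scale=100.0):
    """Similarity in [0, 1] that decays with Euclidean distance (metres)."""
    return math.exp(-math.dist(p, q) / scale)

def name_sim(a, b):
    """Token Jaccard similarity of the two feature names."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def match_score(f1, f2, w=0.5):
    return w * spatial_sim(f1["xy"], f2["xy"]) + (1 - w) * name_sim(f1["name"], f2["name"])

osm   = {"name": "Central Park Cafe", "xy": (0.0, 0.0)}
other = {"name": "central park cafe", "xy": (30.0, 40.0)}   # 50 m away
far   = {"name": "City Hall",         "xy": (900.0, 0.0)}

print(round(match_score(osm, other), 3))   # high: same name, nearby
print(round(match_score(osm, far), 3))     # low: different name, distant
```

In the paper's setting, scores like these would feed the Markov logic network that decides which candidate pairs are co-references.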

  14. An Intercomparison of 2-D Models Within a Common Framework

    NASA Technical Reports Server (NTRS)

    Weisenstein, Debra K.; Ko, Malcolm K. W.; Scott, Courtney J.; Jackman, Charles H.; Fleming, Eric L.; Considine, David B.; Kinnison, Douglas E.; Connell, Peter S.; Rotman, Douglas A.; Bhartia, P. K. (Technical Monitor)

    2002-01-01

A model intercomparison among the Atmospheric and Environmental Research (AER) 2-D model, the Goddard Space Flight Center (GSFC) 2-D model, and the Lawrence Livermore National Laboratory (LLNL) 2-D model allows us to separate differences due to model transport from those due to the model's chemical formulation. This is accomplished by constructing two hybrid models incorporating the transport parameters of the GSFC and LLNL models within the AER model framework. By comparing the results from the native models (AER and, e.g., GSFC) with those from the hybrid model (e.g., AER chemistry with GSFC transport), differences due to chemistry and transport can be identified. For the analysis, we examined an inert tracer whose emission pattern is based on emission from a High Speed Civil Transport (HSCT) fleet; distributions of trace species in the 2015 atmosphere; and the response of stratospheric ozone to an HSCT fleet. Differences in NO(y) in the upper stratosphere are found between models with identical transport, implying different model representations of atmospheric chemical processes. The response of O3 concentration to HSCT aircraft emissions differs among the models, due both to transport-dominated differences in the HSCT-induced perturbations of H2O and NO(y) and to differences in the model representations of O3 chemical processes. The model formulations of cold polar processes are found to be the most significant factor in creating large differences in the calculated ozone perturbations.

  15. A Structural Model Decomposition Framework for Systems Health Management

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults, and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
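The backward-reachability flavor of structural decomposition can be sketched on a three-tank-like example. The dependency graph below assumes a simplified cascade (downstream levels do not feed back upstream), an illustrative assumption rather than the paper's actual three-tank equations:

```python
# Sketch of structural decomposition as backward reachability over a variable
# dependency graph: the minimal submodel for one output is everything that
# output transitively depends on.
deps = {
    "h1": ["qin", "q12"],   # tank 1 level depends on inflow and outflow 1->2
    "q12": ["h1"],          # flow 1->2 driven by the upstream level only
    "h2": ["q12", "q23"],
    "q23": ["h2"],
    "h3": ["q23", "qout"],
}

def submodel(output, deps):
    """Minimal set of variables needed to compute the given output."""
    needed, stack = set(), [output]
    while stack:
        v = stack.pop()
        if v in needed:
            continue
        needed.add(v)
        stack.extend(deps.get(v, []))   # inputs like qin/qout have no deps
    return needed

# Estimating h1 needs only a small local submodel, not the whole system.
print(sorted(submodel("h1", deps)))   # ['h1', 'q12', 'qin']
print(sorted(submodel("h3", deps)))   # the full chain
```

An estimator built on the first submodel never touches tank 3's equations, which is the scalability gain the abstract describes.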

  16. An Integrated Modeling Framework for Probable Maximum Precipitation and Flood

    NASA Astrophysics Data System (ADS)

    Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.

    2015-12-01

With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the climate change effects on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, the Distributed Hydrology-Soil-Vegetation Model, implemented at 90m resolution and calibrated against U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF under projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects of PMP and PMF.

  17. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which is assessed as consistent and repeatable in application and does not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  18. A Framework for Modeling and Simulation of the Artificial

    DTIC Science & Technology

    2012-01-01

style of symphonic, folk, or jazz. A musical performance can also therefore have an ensemble of orchestra, small group, or soloist. With no...constraint m3 :musical-performance (==> (equale (e@ style) jazz ) (or (equale (e@ ensemble) small-group) (equale (e@ ensemble) orchestra)))) (orv (ifv...equale (e@ style) jazz ) (assert! (orv (equale (e@ ensemble) orchestra) (equale (e@ ensemble) small-group))))) A Framework for Modeling and Simulation of

  19. Flexible Modeling of Epidemics with an Empirical Bayes Framework.

    PubMed

    Brooks, Logan C; Farrow, David C; Hyun, Sangwon; Tibshirani, Ryan J; Rosenfeld, Roni

    2015-08-01

Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic's behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the "Predict the Influenza Season Challenge", with the task of predicting key epidemiological measures for the 2013-2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013-2014 U.S. influenza season, and compare the framework's cross-validated prediction error on historical data to that of a
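The core move, generating possibilities as modified versions of past seasons, can be sketched in a few lines. The Gaussian curve shape, the shift/scale family, and the least-squares scoring below are illustrative stand-ins for the paper's semiparametric machinery:

```python
import math

# Build possibilities for the current season by shifting and scaling a
# historical epidemic curve, then score each possibility against the weeks
# observed so far.
weeks = list(range(30))

def variant(shift, scale):
    """A modified version of a historical %ILI-like curve."""
    return [scale * math.exp(-((w - 15 - shift) ** 2) / 20.0) for w in weeks]

truth = variant(3, 1.4)        # this season peaks later and higher than history
observed = truth[:12]          # only 12 weeks have been observed so far

def misfit(curve):
    return sum((c - o) ** 2 for c, o in zip(curve, observed))

candidates = [(s, sc) for s in range(-4, 5) for sc in (0.8, 1.0, 1.2, 1.4)]
best = min(candidates, key=lambda p: misfit(variant(*p)))
print(best)   # (3, 1.4): the in-season data identify the right possibility
```

The actual framework keeps a posterior weight over every candidate curve rather than a single best fit, which is what lets it report distributions over onset, peak time and peak height.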

  20. A coupled multi-physics modeling framework for induced seismicity

    NASA Astrophysics Data System (ADS)

    Karra, S.; Dempsey, D. E.

    2015-12-01

There is compelling evidence that moderate-magnitude seismicity in the central and eastern US is on the rise. Many of these earthquakes are attributable to anthropogenic injection of fluids into deep formations, resulting in incidents where state regulators have even intervened. Earthquakes occur when a high-pressure fluid (water or CO2) enters a fault, reducing its resistance to shear failure and causing runaway sliding. However, induced seismicity does not manifest as a solitary event, but rather as a sequence of earthquakes evolving in time and space. Additionally, one needs to consider the changes in permeability due to slip within a fault and the subsequent effects on fluid transport and pressure build-up. A modeling framework that addresses the complex two-way coupling between seismicity and fluid flow is thus needed. In this work, a new parallel physics-based coupled framework for induced seismicity that couples slip on faults and fluid flow is presented. The framework couples the highly parallel subsurface flow code PFLOTRAN (www.pflotran.org) and a fast Fourier transform based earthquake simulator QK3. Stresses in the fault are evaluated using Biot's formulation in PFLOTRAN and are used to calculate slip in QK3. Permeability is updated based on the slip in the fault, which in turn influences flow. Application of the framework to synthetic examples and datasets from Colorado and Oklahoma will also be discussed.
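The two-way coupling can be caricatured with a single fault node. The numbers and update rules below are invented for illustration; the actual framework solves Biot poroelasticity in PFLOTRAN and slip in QK3:

```python
# Conceptual sketch of the coupling loop: injection raises pore pressure on a
# fault; when the Coulomb criterion tau > mu * (sigma_n - p) is met, the fault
# slips, shear stress drops, and slip enhances permeability, which feeds back
# into faster pressurisation.
mu, sigma_n = 0.6, 50.0      # friction coefficient, normal stress [MPa]
tau = 28.0                   # shear stress on the fault [MPa]
p, p_inj = 0.0, 20.0         # pore pressure, injection pressure [MPa]
k = 0.05                     # permeability-like rate constant [1/step]

events = []
for step in range(200):
    p += k * (p_inj - p)                 # flow model: pressurisation of the fault
    if tau > mu * (sigma_n - p):         # earthquake simulator: Coulomb failure
        events.append(step)
        tau -= 3.0                       # stress drop from the slip event
        k *= 1.5                         # slip-enhanced permeability
print(len(events) > 1)   # a sequence of events, not a solitary one
```

Even this caricature reproduces the qualitative point in the abstract: the feedback produces a sequence of earthquakes evolving in time, not a single event.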

  1. A framework for modeling contaminant impacts on reservoir water quality

    NASA Astrophysics Data System (ADS)

    Jeznach, Lillian C.; Jones, Christina; Matthews, Thomas; Tobiason, John E.; Ahlfeld, David P.

    2016-06-01

This study presents a framework for using hydrodynamic and water quality models to understand the fate and transport of potential contaminants in a reservoir and to develop appropriate emergency response and remedial actions. In the event of an emergency situation, prior detailed modeling efforts and scenario evaluations allow for an understanding of contaminant plume behavior, including maximum concentrations that could occur at the drinking water intake and contaminant travel time to the intake. A case study assessment of the Wachusett Reservoir, a major drinking water supply for metropolitan Boston, MA, provides an example of an application of the framework and how hydrodynamic and water quality models can be used to quantitatively and scientifically guide management in response to a variety of contaminant scenarios. The model CE-QUAL-W2 was used to investigate the water quality impacts of several hypothetical contaminant scenarios, including hypothetical fecal coliform input from a sewage overflow as well as an accidental railway spill of ammonium nitrate. Scenarios investigated the impacts of decay rates, season, and inter-reservoir transfers on contaminant arrival times and concentrations at the drinking water intake. The modeling study highlights the importance of a rapid operational response by managers to contain a contaminant spill in order to minimize the mass of contaminant that enters the water column, based on modeled reservoir hydrodynamics. The development and use of hydrodynamic and water quality models for surface drinking water sources subject to the potential for contaminant entry can provide valuable guidance for making decisions about emergency response and remediation actions.
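A zeroth-order version of one scenario question (how much of a contaminant peak survives the travel to the intake under first-order decay) is easy to sketch; the rates below are illustrative, not calibrated Wachusett values:

```python
import math

# Back-of-envelope first-order decay during transport: given a travel time
# from spill site to intake and a decay rate, what fraction of the initial
# concentration remains? The full CE-QUAL-W2 scenarios resolve hydrodynamics,
# mixing, and seasonality that this one-liner ignores.
def intake_fraction(decay_per_day, travel_days):
    """Fraction of the initial concentration remaining at the intake."""
    return math.exp(-decay_per_day * travel_days)

# A fast-decaying contaminant (coliform-like) versus a conservative solute
# (nitrate-like), over an assumed 14-day travel time.
print(round(intake_fraction(0.5, travel_days=14), 4))   # nearly all decayed
print(round(intake_fraction(0.0, travel_days=14), 4))   # 1.0, none decayed
```

The contrast explains why the decay rate was a key scenario variable: for conservative spills like ammonium nitrate, only dilution and rapid containment, not decay, limit the concentration at the intake.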

  2. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model are documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  3. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. Each element of this framework is modular, so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of an FRTM are available to simulate the brightness temperature of snow, two models are available to model the physical evolution of the snowpack and underlying soil, and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models (physical or statistical) can be accommodated, too. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.
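The modular, swappable-element design can be sketched as a simple registry; the module names and the toy models below are invented for illustration:

```python
# Sketch of the modular design the abstract describes: each element (FRTM,
# snowpack model, land-surface model) is a swappable module behind a common
# interface, so a study can mix and match elements. Registry and module names
# are invented; the toy FRTMs are placeholders, not real emission models.
class Framework:
    def __init__(self):
        self.slots = {}

    def register(self, slot, name, fn):
        self.slots.setdefault(slot, {})[name] = fn

    def build(self, **choices):
        """Assemble one configuration from named module choices."""
        return {slot: self.slots[slot][name] for slot, name in choices.items()}

fw = Framework()
fw.register("frtm", "memls_like", lambda snow: 250.0 - 5.0 * snow["depth_m"])
fw.register("frtm", "hut_like",   lambda snow: 240.0 - 4.0 * snow["depth_m"])
fw.register("snowpack", "simple", lambda forcing: {"depth_m": 0.5})

# Two self-consistent configurations differing only in the chosen FRTM.
for frtm in ("memls_like", "hut_like"):
    cfg = fw.build(frtm=frtm, snowpack="simple")
    snow = cfg["snowpack"](None)
    print(round(cfg["frtm"](snow), 1))   # brightness temperature [K]
```

Swapping one module while holding the rest fixed is exactly the kind of controlled comparison the integrated framework is meant to enable.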

  4. The ontology model of FrontCRM framework

    NASA Astrophysics Data System (ADS)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes are possible. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM was developed as a framework to guide the identification of CRM-related business processes, based on the strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  5. Modeling phenotypic plasticity in growth trajectories: a statistical framework.

    PubMed

    Wang, Zhong; Pang, Xiaoming; Wu, Weimiao; Wang, Jianxin; Wang, Zuoheng; Wu, Rongling

    2014-01-01

Phenotypic plasticity, that is, the production of multiple phenotypes by a single genotype in response to environmental change, has been thought to play an important role in evolution and speciation. Historically, knowledge about phenotypic plasticity has resulted from the analysis of static traits measured at a single time point. New insight into the adaptive nature of plasticity can be gained by an understanding of how organisms alter their developmental processes in a range of environments. Recent advances in statistical modeling of functional data and developmental genetics allow us to construct a dynamic framework of plastic response in developmental form and pattern. Under this framework, development, genetics, and evolution can be synthesized through statistical bridges to better address how evolution results from phenotypic variation in the process of development via genetic alterations.

  6. Concepts as Semantic Pointers: A Framework and Computational Model.

    PubMed

    Blouw, Peter; Solodkin, Eugene; Thagard, Paul; Eliasmith, Chris

    2016-07-01

    The reconciliation of theories of concepts based on prototypes, exemplars, and theory-like structures is a longstanding problem in cognitive science. In response to this problem, researchers have recently tended to adopt either hybrid theories that combine various kinds of representational structure, or eliminative theories that replace concepts with a more finely grained taxonomy of mental representations. In this paper, we describe an alternative approach involving a single class of mental representations called "semantic pointers." Semantic pointers are symbol-like representations that result from the compression and recursive binding of perceptual, lexical, and motor representations, effectively integrating traditional connectionist and symbolic approaches. We present a computational model using semantic pointers that replicates experimental data from categorization studies involving each prior paradigm. We argue that a framework involving semantic pointers can provide a unified account of conceptual phenomena, and we compare our framework to existing alternatives in accounting for the scope, content, recursive combination, and neural implementation of concepts.

  7. A General Framework for Multiphysics Modeling Based on Numerical Averaging

    NASA Astrophysics Data System (ADS)

    Lunati, I.; Tomin, P.

    2014-12-01

In recent years, multiphysics (hybrid) modeling has attracted increasing attention as a tool to bridge the gap between pore-scale processes and a continuum description at the meter scale (laboratory scale). This approach is particularly appealing for complex nonlinear processes, such as multiphase flow, reactive transport, density-driven instabilities, and geomechanical coupling. We present a general framework that can be applied to all these classes of problems. The method is based on ideas from the Multiscale Finite-Volume method (MsFV), which was originally developed for Darcy-scale applications. Recently, we reformulated MsFV starting from a local-global splitting, which allows us to retain the original degree of coupling for the local problems and to use spatiotemporal adaptive strategies. The new framework is based on the simple idea that different characteristic temporal scales are inherited from different spatial scales, and the global and the local problems are solved with different temporal resolutions. The global (coarse-scale) problem is constructed based on a numerical volume-averaging paradigm, and a continuum (Darcy-scale) description is obtained by introducing additional simplifications (e.g., by assuming that pressure is the only independent variable at the coarse scale, we recover an extended Darcy's law). We demonstrate that it is possible to adaptively and dynamically couple the Darcy-scale and the pore-scale descriptions of multiphase flow in a single conceptual and computational framework. Pore-scale problems are solved only in the active front region where fluid distribution changes with time. In the rest of the domain, only a coarse description is employed. This framework can be applied to other important problems such as reactive transport and crack propagation. As it is based on a numerical upscaling paradigm, our method can be used to explore the limits of validity of macroscopic models and to illuminate the meaning of

  8. PyCatch: catchment modelling in the PCRaster framework

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Lana-Renault, Noemí; Schmitz, Oliver

    2015-04-01

PCRaster is an open source software framework for the construction and execution of stochastic, spatio-temporal, forward models. It provides a large number of spatial operations on raster maps, with an emphasis on operations that are capable of transporting material (water, sediment) over a drainage network. These operations have been written in C++ and are provided to the model builder as Python functions. Models are constructed by combining these functions in a Python script. To ease implementation of models that use time steps and Monte Carlo iterations, the software comes with a Python framework providing control flow for temporal modelling and Monte Carlo simulation, including options for Bayesian data assimilation (Ensemble Kalman Filter, Particle Filter). A sophisticated visualization tool is provided, capable of visualizing, animating, and exploring stochastic, spatio-temporal input or model output data. PCRaster is used to construct, for instance, hydrological models (hillslope to global scale), land use change models, and geomorphological models. It is still being improved, for instance by adding under-the-hood functionality for executing models on multiple CPU cores, and by adding components for agent-based and network simulation. The software runs on MS Windows and Linux and is available at http://www.pcraster.eu. We provide an extensive set of online course materials (partly available free of charge). Using the PCRaster software framework, we recently developed the PyCatch model components for hydrological modelling and land degradation modelling at catchment scale. The PyCatch components run at time steps of seconds to weeks, and grid cell sizes of approximately 1-100 m, which can be selected depending on the case study for which PyCatch is used. Hydrological components currently implemented include classes for simulation of incoming solar radiation, evapotranspiration (Penman-Monteith), surface storage, infiltration (Green and Ampt
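The control-flow pattern the abstract describes (a model class whose initialization and per-time-step methods are driven by temporal and Monte Carlo framework loops) can be sketched in plain Python. The class and function names below are illustrative only, not the actual PCRaster API, and the toy bucket model stands in for real spatial operations:

```python
import random

class DynamicModel:
    """Base class: subclasses supply initial() and dynamic(); the framework
    supplies the time-step and Monte Carlo control flow (PCRaster-style)."""
    def initial(self):
        raise NotImplementedError
    def dynamic(self):
        raise NotImplementedError

class RunoffModel(DynamicModel):
    """Toy bucket model: rainfall fills a store that drains each step."""
    def initial(self):
        self.store = 0.0
        self.history = []
    def dynamic(self):
        rain = random.uniform(0.0, 5.0)    # mm of rain this step
        self.store += rain
        runoff = 0.2 * self.store          # linear-reservoir drainage
        self.store -= runoff
        self.history.append(runoff)

def run_dynamic(model, nr_time_steps):
    """Temporal framework driver: initial() once, then dynamic() per step."""
    model.initial()
    for _ in range(nr_time_steps):
        model.dynamic()
    return model

def run_monte_carlo(model_cls, nr_samples, nr_time_steps):
    """Monte Carlo framework driver: one fresh realisation per sample."""
    return [run_dynamic(model_cls(), nr_time_steps)
            for _ in range(nr_samples)]

random.seed(1)
runs = run_monte_carlo(RunoffModel, nr_samples=10, nr_time_steps=52)
```

Separating the model definition from the drivers is what lets the same model run deterministically, in Monte Carlo mode, or inside a data assimilation loop without modification.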

  9. A logic model framework for community nutrition education.

    PubMed

    Medeiros, Lydia C; Butkus, Sue Nicholson; Chipman, Helen; Cox, Ruby H; Jones, Larry; Little, Deborah

    2005-01-01

    Logic models are a practical method for systematically collecting impact data for community nutrition efforts, such as the Food Stamp Nutrition Education program. This report describes the process used to develop and test the Community Nutrition Education Logic Model and the results of a pilot study to determine whether national evaluation data could be captured without losing flexibility of programming and evaluation at the state level. The objectives were to develop an evaluation framework based on the Logic Model to include dietary quality, food safety, food security, and shopping behavior/food resource management and to develop a training mechanism for use. The portability feature of the model should allow application to a variety of community education programs.

  10. Modeling air pollution in the Tracking and Analysis Framework (TAF)

    SciTech Connect

    Shannon, J.D.

    1998-12-31

The Tracking and Analysis Framework (TAF) is a set of interactive computer models for integrated assessment of the Acid Rain Provisions (Title IV) of the 1990 Clean Air Act Amendments. TAF is designed to execute in minutes on a personal computer, thereby making it feasible for a researcher or policy analyst to examine quickly the effects of alternate modeling assumptions or policy scenarios. Because the development of TAF involves researchers in many different disciplines, TAF has been given a modular structure. In most cases, the modules contain reduced-form models that are based on more complete models exercised off-line. The structure of TAF as of December 1996 is shown. The Atmospheric Pathways Module produces estimates of regional air pollution variables.

  11. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.
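The residual-based diagnosis step described above can be illustrated with a toy sketch: a submodel is selected for the current mode, residuals compare its predictions to observations, and a threshold test flags faults. The mode names, gains, and threshold below are hypothetical, not taken from the paper's electrical circuit case study:

```python
# Per-mode submodels for a switched circuit (hypothetical values):
# each maps an input voltage to a predicted output voltage.
SUBMODELS = {
    "charging":    lambda v_in: 0.8 * v_in,
    "discharging": lambda v_in: 0.1 * v_in,
}

def residuals(mode, v_in_seq, observed):
    """Generate residuals (observed minus predicted) for the active mode.
    On a mode change, simply selecting a different submodel reconfigures
    the residual generator."""
    model = SUBMODELS[mode]
    return [obs - model(v) for v, obs in zip(v_in_seq, observed)]

def diagnose(res, threshold=0.5):
    """Flag each sample whose residual magnitude exceeds the threshold."""
    return [abs(r) > threshold for r in res]

r = residuals("charging", [5.0, 5.0, 5.0], [4.0, 4.1, 1.0])
flags = diagnose(r)   # third observation deviates, so a fault is flagged
```

In the paper's framework the submodels are derived automatically from component models via causality analysis; here they are hand-written to keep the residual-generation idea visible.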

  12. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    PubMed Central

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a method for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
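The forecasting idea (generate candidate epidemic curves by perturbing the timing and intensity of past seasons, then weight each candidate by its fit to the weeks observed so far) can be sketched as follows. The seasonal data, perturbation ranges, and noise model are made up for illustration and are not the authors' implementation:

```python
import math
import random

def candidate_curves(past_seasons, n=100, rng=None):
    """Build a prior over this season's curve by reshaping past seasons:
    random shift in timing, scaling of intensity, and observation noise."""
    rng = rng or random.Random(0)
    out = []
    for _ in range(n):
        base = rng.choice(past_seasons)
        shift = rng.randint(-2, 2)        # weeks earlier/later
        scale = rng.uniform(0.7, 1.3)     # peak intensity multiplier
        curve = [scale * base[max(0, min(len(base) - 1, t - shift))]
                 for t in range(len(base))]
        out.append([max(0.0, v + rng.gauss(0, 0.05)) for v in curve])
    return out

def reweight(curves, observed, sigma=0.5):
    """Empirical-Bayes-style update: weight each candidate by a Gaussian
    likelihood of the weeks observed so far, then normalize."""
    weights = []
    for c in curves:
        sq = sum((c[t] - y) ** 2 for t, y in enumerate(observed))
        weights.append(math.exp(-sq / (2 * sigma ** 2)))
    z = sum(weights)
    return [w / z for w in weights]

# Two toy historical seasons of weekly illness percentages.
season_a = [0.5, 1.0, 2.5, 4.0, 2.0, 1.0, 0.5]
season_b = [0.4, 0.8, 1.5, 3.0, 3.5, 1.5, 0.6]
prior = candidate_curves([season_a, season_b], n=200)
posterior_weights = reweight(prior, observed=[0.5, 1.0, 2.5])
```

The weighted ensemble of curves is a crude stand-in for the full posterior distribution the paper reports; forecast targets (peak height, peak week) would be read off as weighted statistics of the ensemble.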

  13. LQCD workflow execution framework: Models, provenance and fault-tolerance

    NASA Astrophysics Data System (ADS)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
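The reflex-engine concept (rules checked pre-execution, during execution, and post-execution, with a state change and a mitigation action on violation) can be sketched as a minimal state machine. The class name, rule set, and health parameters below are hypothetical illustrations, not the paper's implementation:

```python
class ReflexEngine:
    """Minimal fault-management state machine: checks a phase-specific
    rule against reported health parameters and records a mitigation
    action when a rule is violated."""
    def __init__(self, rules):
        self.rules = rules        # phase -> predicate over a health dict
        self.state = "NOMINAL"
        self.actions = []
    def observe(self, phase, health):
        if not self.rules[phase](health):
            self.state = "FAULT"
            self.actions.append(f"mitigate:{phase}")
        return self.state

# Hypothetical monitoring rules for one participant's node.
rules = {
    "pre":  lambda h: h["disk_free_gb"] > 5,
    "run":  lambda h: h["load"] < 8.0,
    "post": lambda h: h["exit_code"] == 0,
}
engine = ReflexEngine(rules)
engine.observe("pre", {"disk_free_gb": 42})   # rule holds: stays NOMINAL
engine.observe("run", {"load": 11.2})         # rule violated: FAULT
```

In the paper these engines form a hierarchy, with each engine propagating state upward; the sketch shows only a single engine at the leaf level.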

  14. A modular Human Exposure Model (HEM) framework to ...

    EPA Pesticide Factsheets

Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historically focused on far-field sources (environmentally mediated exposures), while research has shown that use-related exposures (near-field exposures) typically dominate population exposure. Characterizing the human health impacts of chemicals in consumer products over the life cycle of these products requires an evaluation of near-field as well as far-field sources. Assessing the impacts of the near-field exposures requires bridging the scientific and technical gaps that currently prevent the harmonious use of the best available methods and tools from the fields of LCIA and human health exposure and risk assessment. The U.S. EPA’s Chemical Safety and Sustainability LC-HEM project is developing the Human Exposure Model (HEM) to assess near-field exposures to chemicals that occur to various populations over the life cycle of a commercial product. The HEM will be a publicly available, web-based, modular system which will allow for the evaluation of chemical/product impacts in a LCIA framework to support CAA. We present here an overview of the framework for the modular HEM system. The framework includes a data flow diagram of in-progress and future planned modules, the definition of each mod

  15. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  16. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  17. A Framework and Model for Evaluating Clinical Decision Support Architectures

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

In this paper, we develop a four-phase model for evaluating architectures for clinical decision support that focuses on: (1) defining a set of desirable features for a decision support architecture; (2) building a proof-of-concept prototype; (3) demonstrating that the architecture is useful by showing that it can be integrated with existing decision support systems; and (4) comparing its coverage to that of other architectures. We apply this framework to several well-known decision support architectures, including Arden Syntax, GLIF, SEBASTIAN, and SAGE. PMID:18462999

  18. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulations (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
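The Monte Carlo lattice update that the paper parallelizes with OpenMP can be illustrated with a serial sketch of a simplified CPM that uses only a boundary (contact) energy term. This is a sketch of the update rule under stated simplifications, not the paper's implementation, and real CPM codes compute the energy change locally instead of recomputing the whole lattice:

```python
import math
import random

def neighbors(i, j, n):
    """4-neighbourhood with periodic boundaries."""
    return [((i + di) % n, (j + dj) % n)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def boundary_energy(lattice, n, J=1.0):
    """Contact energy: J per pair of unlike-labelled neighbouring sites."""
    e = 0.0
    for i in range(n):
        for j in range(n):
            for ni, nj in neighbors(i, j, n):
                if lattice[i][j] != lattice[ni][nj]:
                    e += J / 2.0       # each unlike pair is visited twice
    return e

def mc_sweep(lattice, n, temperature=1.0, rng=None):
    """One sweep: n*n attempted label copies with Metropolis acceptance.
    (Recomputing the full energy per attempt keeps the sketch short.)"""
    rng = rng or random.Random(0)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        ni, nj = rng.choice(neighbors(i, j, n))
        if lattice[i][j] == lattice[ni][nj]:
            continue
        old = lattice[i][j]
        e_before = boundary_energy(lattice, n)
        lattice[i][j] = lattice[ni][nj]    # trial: copy neighbour's label
        d_e = boundary_energy(lattice, n) - e_before
        if d_e > 0 and rng.random() >= math.exp(-d_e / temperature):
            lattice[i][j] = old            # reject the unfavourable flip
    return lattice

n = 6
lattice = [[1 if j < n // 2 else 2 for j in range(n)] for i in range(n)]
mc_sweep(lattice, n, rng=random.Random(42))
```

Because each attempted flip touches only one site and its neighbours, independent lattice regions can be updated concurrently, which is what makes the OpenMP parallelization of this step effective.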

  19. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  20. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  1. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    SciTech Connect

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  2. A unifying modeling framework for highly multivariate disease mapping.

    PubMed

    Botella-Rocamora, P; Martinez-Beneito, M A; Banerjee, S

    2015-04-30

    Multivariate disease mapping refers to the joint mapping of multiple diseases from regionally aggregated data and continues to be the subject of considerable attention for biostatisticians and spatial epidemiologists. The key issue is to map multiple diseases accounting for any correlations among themselves. Recently, Martinez-Beneito (2013) provided a unifying framework for multivariate disease mapping. While attractive in that it colligates a variety of existing statistical models for mapping multiple diseases, this and other existing approaches are computationally burdensome and preclude the multivariate analysis of moderate to large numbers of diseases. Here, we propose an alternative reformulation that accrues substantial computational benefits enabling the joint mapping of tens of diseases. Furthermore, the approach subsumes almost all existing classes of multivariate disease mapping models and offers substantial insight into the properties of statistical disease mapping models.

  3. A framework for modeling emerging diseases to inform management

    USGS Publications Warehouse

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.

  4. The Application of Architecture Frameworks to Modelling Exploration Operations Costs

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning for or apply to a DoDAF-inspired operations cost model, but this paper describes how DoDAF concepts such as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific implementation for the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.

  5. A Framework for Modeling Emerging Diseases to Inform Management

    PubMed Central

    Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan H.C.

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge. PMID:27983501

  6. A Liver-centric Multiscale Modeling Framework for Xenobiotics ...

    EPA Pesticide Factsheets

We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study focuses on developing a multi-scale computational model to characterize both phase I and phase II metabolism of acetaminophen, by bridging Physiologically Based Pharmacokinetic (PBPK) modeling at the whole-body level, cell movement and blood flow at the tissue level, and cell signaling and drug metabolism at the sub-cellular level. To validate the model, we estimated our model parameters by fitting serum concentrations of acetaminophen and its glucuronide and sulfate metabolites to experimental data, and carried out sensitivity analysis on 35 parameters selected from three modules. This multiscale model bridges the CompuCell3D tool used by the Virtual Tissue project with the httk tool developed by the Rapid Exposure and Dosimetry project.

  7. A Systems Perspective on Situation Awareness I: Conceptual Framework, Modeling, and Quantitative Measurement

    DTIC Science & Technology

    2003-05-01

Alex Kirlik, Institute of Aviation.

  8. Applying Human Capital Management to Model Manpower Readiness: A Conceptual Framework

    DTIC Science & Technology

    2005-12-01

Pert Chin Ngin, December 2005.

  9. Sol-Terra - AN Operational Space Weather Forecasting Model Framework

    NASA Astrophysics Data System (ADS)

    Bisi, M. M.; Lawrence, G.; Pidgeon, A.; Reid, S.; Hapgood, M. A.; Bogdanova, Y.; Byrne, J.; Marsh, M. S.; Jackson, D.; Gibbs, M.

    2015-12-01

The SOL-TERRA project is a collaboration between RHEA Tech, the Met Office, and RAL Space, funded by the UK Space Agency. The goal of the SOL-TERRA project is to produce a Roadmap for a future coupled Sun-to-Earth operational space weather forecasting system covering domains from the Sun down to the magnetosphere-ionosphere-thermosphere and the neutral atmosphere. The first stage of SOL-TERRA is underway and involves reviewing current models that could potentially contribute to such a system. Within a given domain, the various space weather models will be assessed for how they could contribute to such a coupled system. This will be done both by reviewing peer-reviewed papers and via direct input from the model developers, to provide further insight. Once the models have been reviewed, the optimal set of models for use in support of forecast-based SWE modelling will be selected, and a Roadmap for the implementation of an operational forecast-based SWE modelling framework will be prepared. The Roadmap will address the current modelling capability, knowledge gaps and further work required, and also the implementation and maintenance of the overall architecture and environment within which the models will operate. The SOL-TERRA project will engage with external stakeholders in order to ensure independently that the project remains on track to meet its original objectives. A group of key external stakeholders has been invited to provide domain-specific expertise in reviewing the SOL-TERRA project at critical stages of Roadmap preparation, namely at the Mid-Term Review and prior to submission of the Final Report. This stakeholder input will ensure that the SOL-TERRA Roadmap is enhanced directly through the input of modellers and end-users. The overall goal of the SOL-TERRA project is to develop a Roadmap for an operational forecast-based SWE modelling framework which can be implemented within a larger subsequent activity.
The SOL-TERRA project is supported within

  10. An integrated modelling framework for neural circuits with multiple neuromodulators

    PubMed Central

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of the systemic effects of popular antidepressant drugs (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we have developed user-friendly graphical user interface software for model simulation and visualization for both fundamental science and pharmacological studies. PMID:28100828

  11. A python framework for environmental model uncertainty analysis

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment, or FOSM) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example Jupyter notebooks available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of the parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
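    The FOSM analyses described above rest on a linearization of the model about a parameter set. The sketch below is plain NumPy, not the pyEMU API; the two-parameter Jacobian and covariances are invented for illustration of the core posterior-covariance calculation:

```python
import numpy as np

def fosm_posterior_cov(J, prior_cov, noise_cov):
    """First-order, second-moment (FOSM) posterior parameter covariance.

    The model is linearized, so the Jacobian J maps parameter
    perturbations to observation perturbations.  The posterior
    parameter covariance is then
        C_post = (J^T R^-1 J + C_prior^-1)^-1
    where R is the observation-noise covariance.
    """
    term = J.T @ np.linalg.inv(noise_cov) @ J + np.linalg.inv(prior_cov)
    return np.linalg.inv(term)

# Toy problem: 2 parameters observed through 3 measurements.
J = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.0, 2.0]])
prior = np.eye(2)            # prior parameter covariance
noise = 0.1 * np.eye(3)      # observation noise covariance
post = fosm_posterior_cov(J, prior, noise)

# Conditioning on data cannot increase the linearized parameter variance.
assert np.all(np.diag(post) <= np.diag(prior))
```

    Because everything here is linear, this kind of analysis can be run before any parameter estimation, which is how FOSM results can inform parameterization and objective-function choices.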

  12. Modeling of active transmembrane transport in a mixture theory framework.

    PubMed

    Ateshian, Gerard A; Morrison, Barclay; Hung, Clark T

    2010-05-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature.

  13. Modeling the spectral solar irradiance in the SOTERIA Project Framework

    NASA Astrophysics Data System (ADS)

    Vieira, Luis Eduardo; Dudok de Wit, Thierry; Kretzschmar, Matthieu; Cessateur, Gaël

    The evolution of the radiative energy input is a key element in understanding the variability of the Earth's neutral and ionized atmospheric components. However, reliable observations are limited to the last decades, when measurements above the Earth's atmosphere became possible. These observations have provided insights into the variability of the spectral solar irradiance on time scales from days to years, but there are still large uncertainties in the evolution on time scales from decades to centuries. Here we discuss the physics-based modeling of the ultraviolet solar irradiance under development in the Solar-Terrestrial Investigations and Archives (SOTERIA) project framework. In addition, we compare the modeled solar emission with the variability observed by the LYRA instrument on board the Proba2 spacecraft.

  14. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks.

  15. A Categorical Framework for Model Classification in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation have been among the first and most successful examples of harnessing computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology have led to a rather mixed reputation for models in other areas. The most successful models in the geosciences are applications of dynamical systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study, and we focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as the heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seem highly suited to a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check the consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  16. Proposed framework for thermomechanical life modeling of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. 
Some experimental data exist for assessing the plausibility of the proposed

  17. Quasi-3D Multi-scale Modeling Framework Development

    NASA Astrophysics Data System (ADS)

    Arakawa, A.; Jung, J.

    2008-12-01

    When models are truncated in or near an energetically active range of the spectrum, model physics must change as the resolution changes. The model physics of GCMs and that of CRMs are, however, quite different from each other, and at present there is no unified formulation of model physics that automatically provides a transition between them. The Quasi-3D (Q3D) Multi-scale Modeling Framework (MMF) is an attempt to bridge this gap. Like the recently proposed Heterogeneous Multiscale Method (HMM) (E and Engquist 2003), MMF combines a macroscopic model, a GCM, and a microscopic model, a CRM. Unlike traditional multiscale methods such as multi-grid and adaptive mesh refinement techniques, HMM and MMF are for solving multi-physics problems. They share the common objective "to design combined macroscopic-microscopic computational methods that are much more efficient than solving the full microscopic model and at the same time give the information we need" (E et al. 2008). The question is then how to meet this objective in practice, which can be highly problem dependent. In HMM, the efficiency is gained typically by localization of the microscale problem. Following the pioneering work by Grabowski and Smolarkiewicz (1999) and Grabowski (2001), MMF takes advantage of the fact that 2D CRMs are reasonably successful in simulating deep clouds. In this approach, the efficiency is gained by sacrificing the three-dimensionality of cloud-scale motion. It also "localizes" the algorithm by embedding a CRM in each GCM grid box using cyclic boundary conditions. The Q3D MMF is an attempt to reduce the expense due to these constraints by partially including cloud-scale 3D effects and extending the CRM beyond individual GCM grid boxes. As currently formulated, the Q3D MMF is a 4D estimation/prediction framework that combines a GCM with a 3D anelastic cloud-resolving vector vorticity equation model (VVM) applied to a network of horizontal grids. The network

  18. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  19. Assessment of solution uncertainties in single-column modeling frameworks

    SciTech Connect

    Hack, J.J.; Pedretti, J.A.

    2000-01-15

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  20. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithms for fertilization and P leaching, contributed the largest output uncertainties. In comparison, the initialization of inorganic P in the soil layer and the transformation algorithm between P pools are less influential for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to the NPS-P prediction uncertainty caused by the model inputs and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.

  1. Modelling grain growth in the framework of Rational Extended Thermodynamics

    NASA Astrophysics Data System (ADS)

    Kertsch, Lukas; Helm, Dirk

    2016-05-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena.

  2. Extreme Precipitation in a Multi-Scale Modeling Framework

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Denning, S.; Arabi, M.

    2015-12-01

    Extreme precipitation events are characterized by infrequent but large-magnitude accumulations that generally occur on scales below those resolved by a typical global climate model (GCM). The Multi-scale Modeling Framework (MMF) allows precipitation on these scales to be simulated for long periods without the large computational resources required by a full cloud-permitting model. The Community Earth System Model was run for 30 years in both its MMF and GCM modes, and the annual maximum series of 24-hour precipitation accumulations from the model output were fit to a generalized extreme value (GEV) distribution and evaluated against observations. The results indicate that the MMF produces extreme precipitation with a statistical distribution that closely resembles that of observations, an improvement over the traditional GCM that motivates the continued use of the MMF for the analysis of extreme precipitation. The improvement in the statistical distributions of annual maxima is greatest in regions dominated by convective precipitation, where the small-scale information provided by the MMF heavily influences precipitation processes.
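    The block-maxima procedure in this record can be sketched with SciPy. The data below are synthetic stand-ins for the 30 annual maxima (in the study these come from the MMF run, the GCM run, and observations); note that SciPy's shape parameter `c` is the negative of the conventional GEV shape ξ:

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for 30 years of annual-maximum 24 h precipitation (mm).
annual_max = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=10.0,
                                  size=30, random_state=0)

# Fit a generalized extreme value (GEV) distribution to the block maxima.
c, loc, scale = stats.genextreme.fit(annual_max)

# Return levels summarize the fitted tail, e.g. the 10-year event is the
# quantile with annual exceedance probability 1/10.
rl_10yr = stats.genextreme.ppf(1.0 - 1.0 / 10.0, c, loc=loc, scale=scale)
assert scale > 0 and np.isfinite(rl_10yr)
```

    Comparing fitted GEV parameters (or the return levels they imply) between MMF output, GCM output, and observations is the kind of evaluation the abstract reports.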

  3. A Data Driven Framework for Integrating Regional Climate Models

    NASA Astrophysics Data System (ADS)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate-sensitive issues of concern to decision-makers and policy planners at a regional level. Decisions about allocating scarce water across competing municipal, agricultural, and ecosystem demands are just one of the challenges ahead, along with decisions regarding competing land-use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analysis of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption, and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high

  4. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  5. LAMMPS framework for dynamic bonding and an application modeling DNA

    NASA Astrophysics Data System (ADS)

    Svaneborg, Carsten

    2012-08-01

    We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework.
    Catalogue identifier: AEME_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEME_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 2 243 491
    No. of bytes in distributed program, including test data, etc.: 771
    Distribution format: tar.gz
    Programming language: C++
    Computer: Single and multiple core servers
    Operating system: Linux/Unix/Windows
    Has the code been vectorized or parallelized?: Yes, parallelized using MPI directives.
    RAM: 1 Gb
    Classification: 16.11, 16.12
    Nature of problem: Simulating coarse-grained models capable of chemistry, e.g. DNA hybridization dynamics.
    Solution method: Extending LAMMPS to handle dynamic bonding and directional bonds.
    Unusual features: Allows bonds to be created and broken while angular and dihedral interactions are kept consistent.
    Additional comments: The distribution file for this program is approximately 36 Mbytes and therefore is not delivered directly
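    The bookkeeping the framework performs inside LAMMPS (stochastic bond breakage and formation subject to a per-bead functionality cap) can be illustrated with a toy Monte Carlo sweep. This is not LAMMPS code; the 1D coordinates, rates, and cutoff are all invented for illustration:

```python
import itertools
import random

def bond_sweep(positions, bonds, r_cut=1.0, p_form=0.5, p_break=0.1,
               max_bonds=2, rng=None):
    """One stochastic bonding sweep over a set of 1D bead coordinates.

    Existing bonds break with probability p_break; pairs closer than
    r_cut form a bond with probability p_form, unless either bead has
    already reached its maximal functionality max_bonds.
    """
    rng = rng or random.Random()
    # Stochastic breakage of existing bonds.
    bonds = {b for b in bonds if rng.random() >= p_break}
    # Current functionality (bond count) of each bead.
    degree = {}
    for i, j in bonds:
        degree[i] = degree.get(i, 0) + 1
        degree[j] = degree.get(j, 0) + 1
    # Attempt formation between close pairs, honoring the functionality cap.
    for i, j in itertools.combinations(range(len(positions)), 2):
        if (i, j) in bonds:
            continue
        close = abs(positions[i] - positions[j]) <= r_cut
        room = degree.get(i, 0) < max_bonds and degree.get(j, 0) < max_bonds
        if close and room and rng.random() < p_form:
            bonds.add((i, j))
            degree[i] = degree.get(i, 0) + 1
            degree[j] = degree.get(j, 0) + 1
    return bonds

beads = [0.0, 0.5, 0.9, 3.0]                 # toy 1D coordinates
bonds = bond_sweep(beads, set(), rng=random.Random(42))
# Starting from no bonds, any bond formed must join beads within r_cut.
assert all(abs(beads[i] - beads[j]) <= 1.0 for i, j in bonds)
```

    In the actual LAMMPS extension, each such topology change additionally inserts or removes the angular and dihedral interactions implied by the local bead and bond types, which this sketch omits.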

  6. A Multiple Reaction Modelling Framework for Microbial Electrochemical Technologies

    PubMed Central

    Oyetunde, Tolutola; Sarma, Priyangshu M.; Ahmad, Farrukh; Rodríguez, Jorge

    2017-01-01

    A mathematical model for the theoretical evaluation of microbial electrochemical technologies (METs) is presented that incorporates a detailed physico-chemical framework, includes multiple reactions (both at the electrodes and in the bulk phase) and involves a variety of microbial functional groups. The model is applied to two theoretical case studies: (i) A microbial electrolysis cell (MEC) for continuous anodic volatile fatty acids (VFA) oxidation and cathodic VFA reduction to alcohols, for which the theoretical system response to changes in applied voltage and VFA feed ratio (anode-to-cathode) as well as membrane type are investigated. This case involves multiple parallel electrode reactions in both anode and cathode compartments; (ii) A microbial fuel cell (MFC) for cathodic perchlorate reduction, in which the theoretical impact of feed flow rates and concentrations on the overall system performance are investigated. This case involves multiple electrode reactions in series in the cathode compartment. The model structure captures interactions between important system variables based on first principles and provides a platform for the dynamic description of METs involving electrode reactions both in parallel and in series and in both MFC and MEC configurations. Such a theoretical modelling approach, largely based on first principles, appears promising in the development and testing of MET control and optimization strategies. PMID:28054959

  7. A Multiple Reaction Modelling Framework for Microbial Electrochemical Technologies.

    PubMed

    Oyetunde, Tolutola; Sarma, Priyangshu M; Ahmad, Farrukh; Rodríguez, Jorge

    2017-01-04

    A mathematical model for the theoretical evaluation of microbial electrochemical technologies (METs) is presented that incorporates a detailed physico-chemical framework, includes multiple reactions (both at the electrodes and in the bulk phase) and involves a variety of microbial functional groups. The model is applied to two theoretical case studies: (i) A microbial electrolysis cell (MEC) for continuous anodic volatile fatty acids (VFA) oxidation and cathodic VFA reduction to alcohols, for which the theoretical system response to changes in applied voltage and VFA feed ratio (anode-to-cathode) as well as membrane type are investigated. This case involves multiple parallel electrode reactions in both anode and cathode compartments; (ii) A microbial fuel cell (MFC) for cathodic perchlorate reduction, in which the theoretical impact of feed flow rates and concentrations on the overall system performance are investigated. This case involves multiple electrode reactions in series in the cathode compartment. The model structure captures interactions between important system variables based on first principles and provides a platform for the dynamic description of METs involving electrode reactions both in parallel and in series and in both MFC and MEC configurations. Such a theoretical modelling approach, largely based on first principles, appears promising in the development and testing of MET control and optimization strategies.

  8. Factors of collaborative working: a framework for a collaboration model.

    PubMed

    Patel, Harshada; Pettitt, Michael; Wilson, John R

    2012-01-01

    The ability of organisations to support collaborative working environments is of increasing importance as they move towards more distributed ways of working. Despite the attention collaboration has received from a number of disparate fields, there is a lack of a unified understanding of the component factors of collaboration. As part of our work on a European Integrated Project, CoSpaces, collaboration and collaborative working, and the factors that define them, were examined through the literature and new empirical work with a number of partner user companies in the aerospace, automotive and construction sectors. This was to support development of a descriptive human factors model of collaboration - the CoSpaces Collaborative Working Model (CCWM). We identified seven main categories of factors involved in collaboration: Context, Support, Tasks, Interaction Processes, Teams, Individuals, and Overarching Factors, and summarised these in a framework which forms a basis for the model. We discuss supporting evidence for the factors which emerged from our fieldwork with user partners, and use of the model in activities such as collaboration readiness profiling.

  9. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.

  10. Young diabetics' compliance in the framework of the MIMIC model.

    PubMed

    Kyngäs, H; Hentinen, M; Koivukangas, P; Ohinmaa, A

    1996-11-01

    The compliance of 346 young diabetics aged 13-17 years with health regimens is analysed in the framework of a MIMIC (multiple indicators, multiple causes) model. The data were compiled by means of a questionnaire on compliance, conditions for compliance, the meaning attached to treatment and the impact of the disease, and the model was constructed using the LISREL VII programme, treating compliance as an unobserved variable formulated in terms of observed causes (x-variables) and observed indicators (y-variables). The resulting solutions are entirely satisfactory: the goodness-of-fit index is 0.983, the root mean square residual 0.058 and the chi-squared statistic 43.35 (P < 0.001). The values for the individual parameters in the model are also shown to be reliable and valid. The model shows compliance to be indicated by self-care behaviour, responsibility for treatment, intention to pursue the treatment and collaboration with the physician, and to be largely determined by motivation, experience of the results of treatment and having the energy and will-power to pursue the treatment and, to a lesser extent, by a sense of normality and fear.
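
    The MIMIC structure described above (observed causes driving a latent variable, which in turn drives observed indicators) can be illustrated with a small simulation. This is a hedged sketch with invented weights and noise levels, not the LISREL specification or estimates from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 346  # sample size of the study

# Hypothetical observed causes (x): e.g. motivation, treatment results, energy
x = rng.normal(size=(n, 3))
gamma = np.array([0.6, 0.4, 0.3])                 # structural weights (invented)
eta = x @ gamma + rng.normal(scale=0.5, size=n)   # latent "compliance"

# Observed indicators (y): e.g. self-care, responsibility, intention, collaboration
lam = np.array([0.8, 0.7, 0.6, 0.5])              # loadings (invented)
y = np.outer(eta, lam) + rng.normal(scale=0.5, size=(n, 4))

# Every indicator should correlate positively with every cause, via eta alone
r = np.corrcoef(np.hstack([x, y]).T)[:3, 3:]
print(r.round(2))
```

    The cause-indicator correlation matrix is what a MIMIC fit decomposes into the structural weights and the loadings.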

  11. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be kept in their own local environments, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models exposed as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid) and (3) a model integration framework. We present an architecture for coupling web service models that are self-describing, by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, exposing their metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, and provides a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and resolve the dependencies of BMI-enabled web service models. Using the revised EMELI, an example is presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
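
    The BMI pattern described above can be sketched as a minimal Python class. The toy model, its variable name and its numbers are illustrative; only the method names follow the CSDMS BMI convention:

```python
class HeatBmi:
    """Minimal BMI-style wrapper around a toy cooling model (illustrative,
    not the CSDMS reference implementation)."""

    def __init__(self):
        self._time = 0.0
        self._temperature = 0.0

    # --- control functions ---
    def initialize(self, config=None):
        self._time = 0.0
        self._temperature = 20.0

    def update(self):
        self._temperature *= 0.99   # toy cooling step
        self._time += 1.0

    def finalize(self):
        pass

    # --- model-description functions: how a framework discovers metadata ---
    def get_component_name(self):
        return "toy_heat_model"

    def get_output_var_names(self):
        return ("surface_temperature",)

    def get_var_units(self, name):
        return {"surface_temperature": "degC"}[name]

    # --- variable getters ---
    def get_current_time(self):
        return self._time

    def get_value(self, name):
        return self._temperature

# A coupling framework (or a web-service client) only needs the interface:
model = HeatBmi()
model.initialize()
for _ in range(10):
    model.update()
print(model.get_component_name(), model.get_value("surface_temperature"),
      model.get_var_units("surface_temperature"))
model.finalize()
```

    Because the framework drives the model entirely through these calls, the same loop works whether the methods execute locally or are proxied over the web.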

  12. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have an increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors such as Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge, i.e., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  13. A framework of modeling detector systems for computed tomography simulations

    NASA Astrophysics Data System (ADS)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measurements from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise dominates the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
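
    The cascaded linear-systems idea, propagating the mean signal and its variance stage by stage and then adding electronic noise at readout, can be sketched as follows. The gains and noise levels are illustrative, not parameters from this study:

```python
# Mean and variance propagation through stochastic-gain stages, followed by
# additive electronic noise, in the spirit of cascaded linear-systems theory.

def gain_stage(mean_q, var_q, g, var_g):
    """Propagate mean and variance through one stochastic gain stage."""
    return g * mean_q, g**2 * var_q + var_g * mean_q

q0 = 1.0e4  # incident x-ray quanta per pixel (Poisson: variance = mean)
mean_q, var_q = gain_stage(q0, q0, g=0.7, var_g=0.7 * 0.3)      # e.g. absorption
mean_e, var_e = gain_stage(mean_q, var_q, g=25.0, var_g=25.0)   # e.g. conversion

var_electronic = 50.0**2            # additive readout noise (electrons^2)
var_total = var_e + var_electronic
print(f"quantum share of total noise: {var_e / var_total:.3f}")
```

    At this illustrative exposure the quantum term overwhelms the electronic term, mirroring the abstract's conclusion that quantum noise dominates.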

  14. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research instead extracts, from a group of n customers, the historical data for the sum insured si of the i-th customer together with the amount paid yij and the amount aij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (yij+1, aij+1) as dependent on the present-year value (yij, aij) and the sum insured si via a conditional distribution derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claim liabilities well.
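
    The final step, turning a fitted conditional model into a prediction interval for the aggregate claims liability, can be sketched by Monte Carlo simulation. A lognormal factor stands in for the paper's multivariate power-normal mixture, and all parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_customers, n_sims = 100, 5000

s = rng.uniform(1.0, 10.0, size=n_customers)                  # sum insured
y_now = 0.1 * s * rng.lognormal(0.0, 0.3, size=n_customers)   # paid this year

# Stand-in conditional model: next year's payment depends on this year's value
# and the sum insured (the paper derives this dependence from a multivariate
# power-normal mixture; a lognormal shock is used here purely for illustration)
shocks = rng.lognormal(mean=-0.2, sigma=0.4, size=(n_sims, n_customers))
y_next = (0.5 * y_now + 0.05 * s) * shocks

aggregate = y_next.sum(axis=1)                   # aggregate claims liability
lo, hi = np.percentile(aggregate, [2.5, 97.5])   # 95% prediction interval
print(f"95% prediction interval: [{lo:.1f}, {hi:.1f}]")
```

    Coverage of the kind reported in the abstract would be checked by testing how often the realised aggregate liability falls inside such an interval.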

  15. Investigating GPDs in the framework of the double distribution model

    NASA Astrophysics Data System (ADS)

    Nazari, F.; Mirjalili, A.

    2016-06-01

    In this paper, we construct generalized parton distributions (GPDs) in terms of the kinematical variables x, ξ, t, using the double distribution model. By employing these functions, we extract quantities that make it possible to gain a three-dimensional insight into the nucleon structure at the parton level. The main objective of GPDs is to combine and generalize the concepts of ordinary parton distributions and form factors. They also provide an exclusive framework to describe the nucleon in terms of quarks and gluons. Here, we first calculate, in the double distribution model, the GPDs based on the usual parton distributions arising from the GRV and CTEQ phenomenological models. Obtaining the quark and gluon angular momenta from the GPDs, we are able to calculate scattering observables related to spin asymmetries of the produced quarkonium. These quantities are represented by AN and ALS. We also calculate the Pauli and Dirac form factors in deeply virtual Compton scattering. Finally, in order to compare our results with the existing experimental data, we use the difference of the polarized cross-sections for an initial longitudinal leptonic beam and unpolarized target particles (ΔσLU). In all cases, our results are in good agreement with the available experimental data.

  16. Quasi-3D Algorithm in Multi-scale Modeling Framework

    NASA Astrophysics Data System (ADS)

    Jung, J.; Arakawa, A.

    2008-12-01

    As discussed in the companion paper by Arakawa and Jung, the Quasi-3D (Q3D) Multi-scale Modeling Framework (MMF) is a 4D estimation/prediction framework that combines a GCM with a 3D anelastic vector vorticity equation model (VVM) applied to a Q3D network of horizontal grid points. This paper presents an outline of the recently revised Q3D algorithm and a highlight of the results obtained by application of the algorithm to an idealized model setting. The Q3D network of grid points consists of two sets of grid-point arrays perpendicular to each other. For a scalar variable, for example, each set consists of three parallel rows of grid points. Principal and supplementary predictions are made on the central and the two adjacent rows, respectively. The supplementary prediction is to allow the principal prediction to be three-dimensional at least to second-order accuracy. To accommodate higher-order accuracy and to make the supplementary predictions formally three-dimensional, a few rows of ghost points are added at each side of the array. Values at these ghost points are diagnostically determined by a combination of statistical estimation and extrapolation. The basic structure of the estimation algorithm is determined in view of the global stability of Q3D advection. The algorithm is calibrated using the statistics of past data at and near the intersections of the two sets of grid-point arrays. Since the CRM in the Q3D MMF extends beyond individual GCM boxes, the CRM can be a GCM by itself. However, it is better to couple the CRM with the GCM because (1) the CRM is a Q3D CRM based on a highly anisotropic network of grid points and (2) coupling with a GCM makes it more straightforward to inherit our experience with the conventional GCMs. In the coupled system we have selected, prediction of thermodynamic variables is almost entirely done by the Q3D CRM with no direct forcing by the GCM. The coupling of the dynamics between the two components is through mutual

  17. Evolution of Climate Science Modelling Language within international standards frameworks

    NASA Astrophysics Data System (ADS)

    Lowe, Dominic; Woolf, Andrew

    2010-05-01

    The Climate Science Modelling Language (CSML) was originally developed as part of the NERC Data Grid (NDG) project in the UK. It was one of the first Geography Markup Language (GML) application schemas describing complex feature types for the metocean domain. CSML feature types can be used to describe typical climate products such as model runs or atmospheric profiles. CSML has been successfully used within NDG to provide harmonised access to a number of different data sources. For example, meteorological observations held in heterogeneous databases by the British Atmospheric Data Centre (BADC) and Centre for Ecology and Hydrology (CEH) were served uniformly as CSML features via Web Feature Service. CSML has now been substantially revised to harmonise it with the latest developments in OGC and ISO conceptual modelling for geographic information. In particular, CSML is now aligned with the near-final ISO 19156 Observations & Measurements (O&M) standard. CSML combines the O&M concept of 'sampling features' together with an observation result based on the coverage model (ISO 19123). This general pattern is specialised for particular data types of interest, classified on the basis of sampling geometry and topology. In parallel work, the OGC Met Ocean Domain Working Group has established a conceptual modelling activity. This is a cross-organisational effort aimed at reaching consensus on a common core data model that could be re-used in a number of met-related application areas: operational meteorology, aviation meteorology, climate studies, and the research community. It is significant to note that this group has also identified sampling geometry and topology as a key classification axis for data types. Using the Model Driven Architecture (MDA) approach as adopted by INSPIRE we demonstrate how the CSML application schema is derived from a formal UML conceptual model based on the ISO TC211 framework. 
By employing MDA tools which map consistently between UML and GML we

  18. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    SciTech Connect

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO₂ capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO₂ separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO₂ capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO₂ capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO₂ capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO₂ capture systems have been integrated into the IECM-cs, along with models to estimate CO₂ transport and storage costs. The CO₂ control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO₂ control. The integrated model is applied to
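
    A standard summary metric produced by frameworks of this kind is the cost of CO₂ avoided, which compares the cost of electricity and the emission rate of a capture plant against a reference plant. A sketch with illustrative numbers (not values from the IECM-cs report):

```python
def cost_of_co2_avoided(coe_ref, coe_capture, em_ref, em_capture):
    """
    coe_*: cost of electricity ($/MWh) without / with capture
    em_*:  CO2 emission rate (tonnes CO2/MWh) without / with capture
    Returns the cost per tonne of CO2 avoided ($/t).
    """
    return (coe_capture - coe_ref) / (em_ref - em_capture)

# Illustrative figures for a pulverized coal plant with amine scrubbing
cost = cost_of_co2_avoided(coe_ref=50.0, coe_capture=82.0,
                           em_ref=0.80, em_capture=0.11)
print(f"cost of CO2 avoided: ${cost:.1f}/t")
```

    The denominator uses avoided rather than captured tonnes, since the capture plant's energy penalty raises emissions per MWh before capture.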

  19. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  20. Models of Recognition, Repetition Priming, and Fluency : Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  1. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  2. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
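
    A "hands off flow" plus "rising block" rule of the kind evaluated in the paper can be sketched as a simple lookup: abstraction is forbidden below the hands-off flow, and the permitted rate steps up in blocks as river flow rises. The thresholds and rates below are invented for illustration, not licence conditions from the Itchen study:

```python
def permitted_abstraction(flow, hof, blocks):
    """
    flow:   current river flow (Ml/day)
    hof:    hands-off flow below which no abstraction is allowed
    blocks: list of (flow_threshold, rate) pairs, ascending; the allowed
            abstraction rate "rises" in blocks as flow increases
    """
    if flow <= hof:
        return 0.0
    allowed = 0.0
    for threshold, rate in blocks:
        if flow > threshold:
            allowed = rate
    return allowed

blocks = [(200, 10.0), (400, 25.0), (800, 50.0)]  # Ml/day thresholds, rates
for q in (150, 300, 900):
    print(q, permitted_abstraction(q, hof=200, blocks=blocks))
```

    Coupling such a rule to a rainfall-runoff model lets the trade-off between e-flow compliance and annual yield be quantified under different climate scenarios.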

  3. Evaluation of Model Coupling Frameworks for Use by the Community Surface Dynamics Modeling System (CSDMS)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Syvitski, J. P.

    2007-12-01

    The Community Surface Dynamics Modeling System (CSDMS) is a recently NSF-funded project that represents an effort to bring together a diverse community of surface dynamics modelers and model users. Key goals of the CSDMS project are to (1) promote open-source code sharing and re-use, (2) develop a review process for code contributions, (3) promote recognition of contributors, (4) develop a "library" of low-level software tools and higher-level models that can be linked as easily as possible into new applications, and (5) provide resources to simplify the efforts of surface dynamics modelers. The architectural framework of CSDMS is being designed to allow code contributions in any of several programming languages (language independence), to support a migration towards parallel computation, and to support multiple operating systems (platform independence). In addition, the architecture should permit structured, unstructured and adaptive grids. A variety of different "coupling frameworks" are currently in use or under development in support of similar projects in other communities. One of these, ESMF (Earth System Modeling Framework), is primarily centered on Fortran90, structured grids and Unix-based platforms. ESMF has significant buy-in from the climate modeling community in the U.S.; a closely-related framework called OASIS4 has been adopted by many climate modelers in Europe. OpenMI has emerged from the hydrologic community in Europe and is likely to be adopted for the NSF-funded CUAHSI project. OpenMI is primarily centered on the Windows platform and a programming language called "C-sharp", and is not oriented toward parallel computing. A third, DOE-funded framework called CCA (Common Component Architecture) achieves language interoperability using a tool called Babel. It fully supports parallel computation and virtually any operating system. CCA has also been shown to be interoperable with ESMF and MCT (Model Coupling Toolkit) and would appear

  4. A modeling framework for potential induced degradation in PV modules

    NASA Astrophysics Data System (ADS)

    Bermel, Peter; Asadpour, Reza; Zhou, Chao; Alam, Muhammad A.

    2015-09-01

    Major sources of performance degradation and failure in glass-encapsulated PV modules include moisture-induced gridline corrosion, potential-induced degradation (PID) of the cell, and stress-induced busbar delamination. Recent studies have shown that PV modules operating in damp heat at -600 V are vulnerable to large amounts of degradation, potentially losing up to 90% of the original power output within 200 hours. To improve module reliability and restore power production in the presence of PID and other failure mechanisms, a fundamental rethinking of accelerated testing is needed. This in turn will require an improved understanding of technology choices made early in development that impact failures later. In this work, we present an integrated approach of modeling, characterization, and validation to address these problems. A hierarchical modeling framework will allow us to clarify the mechanisms of corrosion, PID, and delamination. We will employ a physics-based compact model of the cell, the topology of the electrode interconnection, the geometry of the packaging stack, and environmental operating conditions to predict the current, voltage, temperature, and stress distributions in PV modules, correlated with the acceleration of specific degradation modes. A self-consistent solution will capture the essential complexity of the technology-specific acceleration of PID and other degradation mechanisms as a function of illumination, ambient temperature, and relative humidity. Initial results from our model include specific lifetime predictions suitable for direct comparison with indoor and outdoor experiments, which are qualitatively validated by prior work. This approach could play a significant role in developing novel accelerated lifetime tests.

  5. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and the sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved simulation speed compared to previously published results of EN treatment simulation, while obtaining similar porosity reduction results. We focused mainly on readily available commercial particle sizes of 2 nm and 20 nm, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected have a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05, for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected, as MIP has been documented to be an inaccurate measure of the pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles
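
    A purely geometric version of the packing-density calculation at the core of such a simulation, the fraction of a cylindrical pore volume occupied by spheres, can be sketched as follows. The dimensions and counts are illustrative, not from the dissertation:

```python
import math

def packing_density(cyl_diameter_nm, cyl_height_nm, particles):
    """
    Fraction of a cylindrical pore filled by spherical particles.
    particles: list of (radius_nm, count) pairs. Pure geometry; it ignores
    overlap and boundary effects, so it bounds the achievable packing.
    """
    pore_volume = math.pi * (cyl_diameter_nm / 2) ** 2 * cyl_height_nm
    filled = sum(count * (4 / 3) * math.pi * r**3 for r, count in particles)
    return filled / pore_volume

# Illustrative: a 100 nm pore packed with 20 nm and 2 nm diameter particles
density = packing_density(100.0, 100.0, [(10.0, 80), (1.0, 5000)])
print(f"packing density: {density:.2%}")
```

    The illustrative result lands inside the 34-65% range of packing densities reported in the abstract; the actual simulation additionally enforces non-overlap via the COI and sectoring methods.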

  6. Digital Moon: A three-dimensional framework for lunar modeling

    NASA Astrophysics Data System (ADS)

    Paige, D. A.; Elphic, R. C.; Foote, E. J.; Meeker, S. R.; Siegler, M. A.; Vasavada, A. R.

    2009-12-01

    The Moon has a complex three-dimensional shape with significant large-scale and small-scale topographic relief. The Moon’s topography largely controls the distribution of incident solar radiation, as well as the scattered solar and infrared radiation fields. Topography also affects the Moon’s interaction with the space environment, its magnetic field, and the propagation of seismic waves. As more extensive and detailed lunar datasets become available, there is an increasing need to interpret and compare them with the results of physical models in a fully three-dimensional context. We have developed a three-dimensional framework for lunar modeling we call the Digital Moon. The goal of this work is to enable high fidelity physical modeling and visualization of the Moon in a parallel computing environment. The surface of the Moon is described by a continuous triangular mesh of arbitrary shape and spatial scale. For regions of limited geographic extent, it is convenient to employ meshes on a rectilinear grid. However for global-scale modeling, we employ a continuous geodesic gridding scheme (Teanby, 2008). Each element in the mesh surface is allowed to have a unique set of physical properties. Photon and particle interactions between mesh elements are modeled using efficient ray tracing algorithms. Heat, mass, photon and particle transfer within each mesh element are modeled in one dimension. Each compute node is assigned a portion of the mesh and collective interactions between elements are handled through network interfaces. We have used the model to calculate lunar surface and subsurface temperatures that can be compared directly with radiometric temperatures measured by the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter. The model includes realistic surface photometric functions based on goniometric measurements of lunar soil samples (Foote and Paige, 2009), and one-dimensional thermal models based on lunar remote sensing and Apollo

  7. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    USGS Publications Warehouse

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution models (SDMs) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  8. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    - NASA's Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates. - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates. - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights. Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  9. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E; Cavusoglu, M

    2010-01-01

    Modeling and simulation of physiological processes face the challenges of multiscale models in which coupling is very high within and among scales. Information technology approaches, together with related analytical and computational tools, help to deal with these challenges. The Physiological Model Simulation, Integration and Modeling Framework, Phy-SIM, provides a modeling environment that helps cultivate various approaches to the inherent problems of multiscale modeling of physiological systems. In this paper, we present the modular design of Phy-SIM. The proposed layered design of Phy-SIM separates structure from function in physiological processes, advocating modular thinking in developing and integrating physiological models. Moreover, the ontology-based architecture improves the modeling process through mechanisms that attach anatomical and physiological ontological information to the models. The ultimate aim of the proposed approaches is to enhance the physiological model development and integration processes by providing the tools and mechanisms in Phy-SIM.

  10. 3D Geological Framework Models as a Teaching Aid for Geoscience

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Ward, E.; Geological ModelsTeaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts, as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three-dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of Geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions, which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or add to an experienced student's body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey or students and tutors can select certain areas of the model

  11. Subsurface and Surface Characterization using an Information Framework Model

    NASA Astrophysics Data System (ADS)

    Samuel-Ojo, Olusola

    Groundwater plays a critical dual role as a reservoir of fresh water for human consumption and as a cause of the most severe problems in construction works below the water table. It is therefore critical to monitor groundwater recharge, distribution, and discharge on a continuous basis. The conventional method of monitoring groundwater employs a network of sparsely distributed monitoring wells and is laborious, expensive, and intrusive. Sparse data and undersampling reduce the accuracy of sampled survey data, giving rise to poor interpretation. This dissertation addresses this problem by investigating the groundwater-deformation response in order to augment the conventional method. A blend of three research methods was employed, namely design science research, geological methods, and geophysical methods, to examine whether persistent scatterer interferometry, a remote sensing technique, might augment conventional groundwater monitoring. Observation data (including phase information for displacement deformation from permanent scatterer interferometric synthetic aperture radar and depth-to-groundwater data) was obtained from the Water District, Santa Clara Valley, California. An information framework model was built, applied, and then evaluated. Data was preprocessed and decomposed into five components: trend, seasonality, low frequency, high frequency, and octave bandwidth. Digital elevation models of observed and predicted hydraulic head were produced, illustrating the piezometric or potentiometric surface. The potentiometric surface characterizes the regional aquifer of the valley, showing areal variation of rate of percolation, velocity, and permeability, and completely defines flow direction, advising characteristics and design levels. The findings show a geologic forcing phenomenon which explains in part the long-term deformation behavior of the valley, characterized by poroelastic, viscoelastic, elastoplastic and
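A decomposition of the kind described above (trend, seasonality, and low/high-frequency parts) can be sketched in a simplified form. This is an illustrative reconstruction, not the dissertation's actual pipeline: the function name `decompose`, the seasonal period, and the FFT cutoff are assumptions, and the octave-bandwidth component is omitted for brevity.

```python
import numpy as np

def decompose(series, period=12, cutoff=0.1):
    """Split a 1-D series into trend, seasonal, and residual
    low/high-frequency parts (centered moving average + FFT filter)."""
    n = len(series)
    # Trend: centered moving average over one seasonal period.
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")
    detrended = series - trend
    # Seasonality: average the detrended values at each phase of the period.
    phase_means = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(phase_means, n // period + 1)[:n]
    residual = detrended - seasonal
    # Frequency split of the residual with a hard FFT cutoff.
    spectrum = np.fft.rfft(residual)
    freqs = np.fft.rfftfreq(n)
    low = spectrum.copy()
    low[freqs > cutoff] = 0
    high = spectrum - low
    return trend, seasonal, np.fft.irfft(low, n), np.fft.irfft(high, n)

# Synthetic series: linear trend + annual cycle + noise (illustrative only).
t = np.arange(240, dtype=float)
series = (0.05 * t + np.sin(2 * np.pi * t / 12)
          + 0.1 * np.random.default_rng(0).standard_normal(240))
trend, seasonal, low, high = decompose(series)
```

By construction the four parts sum back to the original series, so the decomposition loses no information; only its attribution between components depends on the chosen period and cutoff.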

  12. A modeling framework for the evolution and spread of antibiotic resistance: literature review and model categorization.

    PubMed

    Spicknall, Ian H; Foxman, Betsy; Marrs, Carl F; Eisenberg, Joseph N S

    2013-08-15

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection.

  13. Modeling sports highlights using a time-series clustering framework and model interpretation

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2005-01-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time-series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events against a background "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time-series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
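The MDL-based mixture parameterization described above can be approximated with standard tools. The sketch below selects the number of Gaussian mixture components with the Bayesian information criterion, a common stand-in for minimum description length; the one-dimensional synthetic data and the candidate range of components are illustrative, not the paper's audio features.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D "feature" data from two well-separated sources,
# standing in for, e.g., cheering-like and speech-like energies.
data = np.concatenate([rng.normal(-2.0, 0.5, 400),
                       rng.normal(1.5, 0.8, 600)]).reshape(-1, 1)

# Fit mixtures with 1..5 components and keep the one that minimizes
# the description length, approximated here by BIC.
models = [GaussianMixture(k, random_state=0).fit(data) for k in range(1, 6)]
best = min(models, key=lambda m: m.bic(data))
print(best.n_components)
```

After selection, the individual components (`best.means_`, `best.covariances_`, `best.weights_`) can be inspected to interpret what each mixture component represents, as the paper does for the "highlight" class.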

  15. Linking Tectonics and Surface Processes through SNAC-CHILD Coupling: Preliminary Results Towards Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Choi, E.; Kelbert, A.; Peckham, S. D.

    2014-12-01

    We demonstrate that code coupling can be an efficient and flexible method for modeling complicated two-way interactions between tectonic and surface processes, with SNAC-CHILD coupling as an example. SNAC is a deep earth process model (a geodynamic/tectonics model), built upon a scientific software framework called StGermain and also compatible with a model coupling framework called Pyre. CHILD is a popular surface process model (a landscape evolution model), interfaced to the CSDMS (Community Surface Dynamics Modeling System) modeling framework. We first present proof-of-concept but non-trivial results from a simplistic coupling scheme. We then report progress towards augmenting SNAC with a Basic Model Interface (BMI), a framework-agnostic standard interface developed by CSDMS that uses the CSDMS Standard Names as controlled vocabulary for model communication across domains. Newly interfaced to BMI, SNAC will be easily coupled with CHILD as well as other BMI-compatible models. In a broader context, this work will test BMI as a general and easy-to-implement mechanism for sharing models between modeling frameworks and is a part of the NSF-funded EarthCube Building Blocks project, "Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks."

  16. Evolution of 3-D geologic framework modeling and its application to groundwater flow studies

    USGS Publications Warehouse

    Blome, Charles D.; Smith, David V.

    2012-01-01

    In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.

  17. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    ERIC Educational Resources Information Center

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  18. A unified framework for modeling landscape evolution by discrete flows

    NASA Astrophysics Data System (ADS)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts them in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.

  19. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
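The separation of control policies and dispatching routines from model components that the framework emphasizes can be illustrated with a minimal discrete event simulation. The single-server model, the class names, and the shortest-processing-time policy below are hypothetical illustrations, not part of the published framework.

```python
import heapq

class Simulation:
    """Minimal DES loop in which the dispatching rule (control policy)
    is injected from outside the entities, in the spirit of keeping
    system control separate from model components."""
    def __init__(self, dispatch_policy):
        self.clock = 0.0
        self.events = []          # priority queue of (time, seq, action)
        self.queue = []           # jobs waiting for the single server
        self.busy = False
        self.completed = []
        self.dispatch = dispatch_policy
        self._seq = 0             # tie-breaker for simultaneous events

    def schedule(self, delay, action):
        heapq.heappush(self.events, (self.clock + delay, self._seq, action))
        self._seq += 1

    def arrive(self, job, service_time):
        self.queue.append((job, service_time))
        self.try_start()

    def try_start(self):
        if not self.busy and self.queue:
            job, st = self.dispatch(self.queue)   # policy picks the next job
            self.queue.remove((job, st))
            self.busy = True
            self.schedule(st, lambda j=job: self.finish(j))

    def finish(self, job):
        self.busy = False
        self.completed.append((self.clock, job))
        self.try_start()

    def run(self):
        while self.events:
            self.clock, _, action = heapq.heappop(self.events)
            action()

# Shortest-processing-time dispatching rule as the control policy.
spt = lambda queue: min(queue, key=lambda pair: pair[1])

sim = Simulation(spt)
sim.schedule(0.0, lambda: sim.arrive("A", 5.0))
sim.schedule(1.0, lambda: sim.arrive("B", 2.0))
sim.schedule(1.5, lambda: sim.arrive("C", 1.0))
sim.run()
print(sim.completed)
```

Swapping `spt` for a first-come-first-served or priority rule changes the dispatching behavior without touching the simulation entities, which is the structural point the framework makes.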

  20. A framework for modeling the cathode fall illustrated with a single beam model

    NASA Astrophysics Data System (ADS)

    Sommerer, T. J.; Lawler, J. E.; Hitchon, W. N. G.

    1988-08-01

    A framework for a model of the cathode fall region of a dc glow discharge is presented, and a simple model is solved as an illustration. An extremum condition independent of the model is placed on the electric field behavior to produce a unique solution that agrees with experiment. The zeroth and second moments of the Boltzmann equation are solved for the electrons with a self-consistent electric field. A single-beam model with only two parameters (number density and beam velocity) is assumed for the electron distribution function. Ion motion is modeled with a parametric fit to known ion mobilities. The model is solved for conditions corresponding to the experimental results and to Monte Carlo simulations of Doughty, Den Hartog, and Lawler [Phys. Rev. Lett. 58, 2668 (1987)]. The results are in good qualitative and "factor-of-two" quantitative agreement with the published results.

  1. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    PubMed

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to develop high quality software systems. We have proposed a method of model-driven requirements analysis using Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  2. Parameter estimation and model comparison for stochastic epidemiological processes in a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Mateus, Luis; Stollenwerk, Nico; Zambrini, Jean Claude

    2012-09-01

    We compare two stochastic epidemiological models in a Bayesian framework, with both models evaluated on the same simulated data set. In some cases, for data generated under one model with specific parameter values, the model comparison favours the model that did not underlie the simulated data.
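The kind of Bayesian model comparison described above can be illustrated with an analytically tractable toy example; the two coin-toss models below are assumptions for illustration, not the paper's epidemiological models. The marginal likelihood (evidence) of a fixed-probability binomial model is compared with that of a model placing a uniform prior on the probability, and data generated under the fair-coin model can nonetheless favour the other model.

```python
from math import comb

def marginal_likelihood_fixed(k, n, p=0.5):
    """Evidence for a model with fixed success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def marginal_likelihood_uniform(k, n):
    """Evidence for a model with p ~ Uniform(0,1): the binomial
    likelihood integrates to 1/(n+1) for every k."""
    return 1.0 / (n + 1)

n = 20

# Typical data under the fair coin: 10 heads in 20 tosses.
k = 10
bayes_factor = marginal_likelihood_fixed(k, n) / marginal_likelihood_uniform(k, n)

# Less typical data, still possible under the fair coin: 14 heads in
# 20 tosses. Here the comparison favours the more flexible model even
# if the fair coin generated the data, mirroring the effect the
# abstract describes.
k2 = 14
bf2 = marginal_likelihood_fixed(k2, n) / marginal_likelihood_uniform(k2, n)
print(bayes_factor, bf2)
```

A Bayes factor above 1 favours the fixed-p model; below 1, the uniform-prior model. The first case gives roughly 3.7, the second roughly 0.78.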

  4. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
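The standardized control and self-description functions described above can be sketched as a BMI-style wrapper. The function names (initialize/update/finalize, get_value, and so on) follow the published BMI convention, but the toy heat-diffusion model, its parameters, and the variable name below are illustrative assumptions.

```python
class BmiHeat:
    """Sketch of a Basic Model Interface (BMI) wrapper around a toy
    1-D heat-diffusion model with fixed (zero) boundary temperatures."""

    # --- Model control functions: the caller drives the model. ---
    def initialize(self, config=None):
        self.dt = 0.1
        self.alpha = 0.25          # dimensionless diffusion number (stable if <= 0.5)
        self.time = 0.0
        self.temperature = [0.0, 0.0, 100.0, 0.0, 0.0]

    def update(self):
        T = self.temperature
        self.temperature = [T[0]] + [
            T[i] + self.alpha * (T[i - 1] - 2 * T[i] + T[i + 1])
            for i in range(1, len(T) - 1)] + [T[-1]]
        self.time += self.dt

    def finalize(self):
        self.temperature = None

    # --- Self-description functions: a framework can query the model. ---
    def get_component_name(self):
        return "ToyHeatModel"

    def get_output_var_names(self):
        return ["plate_surface__temperature"]   # CSDMS-style standard name

    def get_current_time(self):
        return self.time

    def get_value(self, name):
        assert name == "plate_surface__temperature"
        return list(self.temperature)

model = BmiHeat()
model.initialize()
for _ in range(10):
    model.update()
print(model.get_current_time(), model.get_value("plate_surface__temperature"))
```

Because the caller only ever touches this interface, a framework can couple the model to others without knowing anything about its internals, which is the design point the abstract makes.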

  5. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    SciTech Connect

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  6. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    ERIC Educational Resources Information Center

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  7. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    ERIC Educational Resources Information Center

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  8. Assessing Students' Understandings of Biological Models and their Use in Science to Evaluate a Theoretical Framework

    NASA Astrophysics Data System (ADS)

    Grünkorn, Juliane; Belzen, Annette Upmeier zu; Krüger, Dirk

    2014-07-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation). Therefore, the purpose of this article is to present the results of an empirical evaluation of a conjoint theoretical framework. The theoretical framework integrates relevant research findings and comprises five aspects which are subdivided into three levels each: nature of models, multiple models, purpose of models, testing, and changing models. The study was conducted with a sample of 1,177 seventh to tenth graders (aged 11-19 years) using open-ended items. The data were analysed by identifying students' understandings of models (nature of models and multiple models) and their use in science (purpose of models, testing, and changing models), and comparing as well as assigning them to the content of the theoretical framework. A comprehensive category system of students' understandings was thus developed. Regarding the empirical evaluation, the students' understandings of the nature and the purpose of models were sufficiently described by the theoretical framework. Concerning the understandings of multiple, testing, and changing models, additional initial understandings (only one model possible, no testing of models, and no change of models) need to be considered. This conjoint and now empirically tested framework for students' understandings can provide a common basis for future science education research. Furthermore, evidence-based indications can be provided for teachers and their instructional practice.

  9. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    SciTech Connect

    David Muth

    2012-07-01

    Agricultural residues are the largest near-term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention, and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However, the current integrated model does not quantitatively assess the soil carbon and long-term crop yield impacts of residue removal. Furthermore, the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily nitrogen-gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with

  10. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  11. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

    A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation, or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios from the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  12. Temporo-spatial model construction using the MML and software framework.

    PubMed

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

    Development of complex temporo-spatial biological computational models can be a time-consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries, and considerable time may be required to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework, a modular XML/HDF5-based specification and set of toolkits, aims to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au.
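The XML metadata side of such a specification can be sketched with the standard library. The element and attribute names below are illustrative assumptions, not the actual MML schema, and the CellML reference is a placeholder.

```python
import xml.etree.ElementTree as ET

def build_model_description(name, cellml_ref, variables):
    """Serialize a minimal, hypothetical model description: a model
    name, a pointer to a CellML source, and a list of (name, units)
    variable declarations."""
    root = ET.Element("model", attrib={"name": name})
    ET.SubElement(root, "source", attrib={"cellml": cellml_ref})
    vars_el = ET.SubElement(root, "variables")
    for var, units in variables:
        ET.SubElement(vars_el, "variable",
                      attrib={"name": var, "units": units})
    return ET.tostring(root, encoding="unicode")

xml_doc = build_model_description(
    "hodgkin_huxley_1952",
    "cellml-repository-reference-placeholder",
    [("V", "millivolt"), ("I_Na", "microA_per_cm2")])
print(xml_doc)
```

In a full MML-style system the XML would carry the structural metadata while bulky numerical data (geometries, solution fields) would live in a companion HDF5 file; only the metadata half is sketched here.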

  13. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  14. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  15. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    SciTech Connect

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p. 2).
For the site-scale SZ flow model, the HFM

  16. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  17. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  18. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
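
    The central idea above — one reusable process library driving any model through a common interface — can be sketched as follows. This is a hypothetical illustration, not ROSE's actual API; the class and function names are invented.

```python
# Hypothetical sketch of the ROSE idea: the execution process (here a full-factorial
# Design of Experiments) is written once against a minimal model API and can then
# drive any model implementing that API. Names are illustrative, not ROSE's own.
from itertools import product
from typing import Callable, Dict, List


class Model:
    """Minimal model API: named inputs in, named outputs out."""
    def __init__(self, run_fn: Callable[[Dict[str, float]], Dict[str, float]]):
        self.run_fn = run_fn

    def execute(self, inputs: Dict[str, float]) -> Dict[str, float]:
        return self.run_fn(inputs)


def full_factorial(model: Model, levels: Dict[str, List[float]]) -> List[Dict[str, float]]:
    """A reusable 'process': run the model at every combination of input levels."""
    names = list(levels)
    results = []
    for combo in product(*(levels[n] for n in names)):
        inputs = dict(zip(names, combo))
        results.append({**inputs, **model.execute(inputs)})
    return results


# Any model can be plugged in without changing the process code.
area_model = Model(lambda p: {"area": p["w"] * p["h"]})
table = full_factorial(area_model, {"w": [1.0, 2.0], "h": [3.0, 4.0]})
```

    Because the process only sees the `execute` contract, swapping in a different model (or a different process, e.g. an optimizer) requires no changes on the other side.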

  19. A Framework for Multifaceted Evaluation of Student Models

    ERIC Educational Resources Information Center

    Huang, Yun; González-Brenes, José P.; Kumar, Rohit; Brusilovsky, Peter

    2015-01-01

    Latent variable models, such as the popular Knowledge Tracing method, are often used to enable adaptive tutoring systems to personalize education. However, finding optimal model parameters is usually a difficult non-convex optimization problem when considering latent variable models. Prior work has reported that latent variable models obtained…

  20. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  1. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and a semantic-supported matching method is introduced. Under this framework, model sharing and integration are applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
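
    The "black box" publishing step described above can be sketched as wrapping an existing model function, unchanged, behind a small HTTP endpoint. The endpoint path, JSON contract, and the NDVI example model are assumptions for illustration, not the paper's actual interface.

```python
# Illustrative "black box" publishing: an existing RS model function is wrapped,
# without modification, behind a tiny HTTP service so other systems can invoke it.
# The JSON field names and the model itself are invented for this sketch.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def ndvi_model(red: float, nir: float) -> float:
    """Pre-existing RS model treated as a black box (standard NDVI formula)."""
    return (nir - red) / (nir + red)


class ModelHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        result = {"ndvi": ndvi_model(body["red"], body["nir"])}
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)


# To publish: HTTPServer(("0.0.0.0", 8080), ModelHandler).serve_forever()
```

    Once published this way, the model becomes one node in a geospatial workflow; integration then amounts to chaining such service calls.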

  2. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and a semantic-supported matching method is introduced. Under this framework, model sharing and integration are applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  3. A flexible and efficient multi-model framework in support of water management

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Tran Quoc, Quan; Willems, Patrick

    2016-05-01

    Flexible, fast and accurate water quantity models are essential tools in support of water management. Adjustable levels of model detail and the ability to handle varying spatial and temporal resolutions are requisite model characteristics to ensure that such models can be employed efficiently in various applications. This paper uses a newly developed flexible modelling framework that aims to generate such models. The framework incorporates several approaches to model catchment hydrology, rivers and floodplains, and the urban drainage system by lumping processes on different levels. To illustrate this framework, a case study of integrated hydrological-hydraulic modelling is elaborated for the Grote Nete catchment in Belgium. Three conceptual rainfall-runoff models (NAM, PDM and VHM) were implemented in a generalized model structure, allowing flexibility in the spatial resolution by means of an innovative disaggregation/aggregation procedure. They were linked to conceptual hydraulic models of the rivers in the catchment, which were developed by means of an advanced model structure identification and calibration procedure. The conceptual models manage to emulate the simulation results of a detailed full hydrodynamic model accurately. The models configured using the approaches of this framework are well-suited for many applications in water management due to their very short calculation time, interfacing possibilities and adjustable level of detail.
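
    The speed advantage of the conceptual models described above comes from lumping: a handful of storage states stand in for a full hydrodynamic solve. As a hedged illustration (a generic single linear reservoir, not NAM, PDM, VHM, or a calibrated Grote Nete model):

```python
# A minimal linear-reservoir sketch of the kind of lumped conceptual model the
# framework builds: storage S drains as Q = S / k, giving an emulator that runs in
# microseconds rather than the much longer time a full hydrodynamic solve takes.
# The recession constant k and the rainfall series are illustrative values only.
def linear_reservoir(rainfall, k=5.0, dt=1.0, s0=0.0):
    """Return the simulated discharge series for a single linear reservoir."""
    s, flows = s0, []
    for p in rainfall:
        q = s / k                 # outflow proportional to storage
        s += (p - q) * dt         # water balance update
        flows.append(q)
    return flows


# One rainfall pulse followed by dry steps produces a rising limb and recession.
hydrograph = linear_reservoir([10.0, 0.0, 0.0, 0.0, 0.0])
```

    Calibrating `k` (and, in practice, several coupled reservoirs) against a detailed hydrodynamic model is what lets such emulators reproduce its simulation results at a fraction of the cost.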

  4. Landscape - Soilscape Modelling: Proposed framework for a model comparison benchmarking exercise, who wants to join?

    NASA Astrophysics Data System (ADS)

    Schoorl, Jeroen M.; Jetten, Victor G.; Coulthard, Thomas J.; Hancock, Greg R.; Renschler, Chris S.; Irvine, Brian J.; Cerdan, Olivier; Kirkby, Mike J.; (A) Veldkamp, Tom

    2014-05-01

    Current landscape-soilscape modelling frameworks are developed under a wide range of spatial and temporal resolutions and extents, from so-called event-based models and soil erosion models to landscape evolution models. In addition, these models are based on different assumptions, include varying process descriptions and produce different outcomes. Consequently, the models often need specific input data, and their development and calibration are best linked to a specific area and local conditions. Model validation is often limited and restricted to shorter time scales and single events. A first workshop on catchment-based modelling (6 event-based models were challenged then) was organised in the late 1990s, and the results led to some excellent discussions on predictive modelling and equifinality, and to a special issue in Catena. It is time for a similar exercise: new models have been made, older models have been updated, and judging from the literature there is much more experience in calibration/validation and reflection on processes observed in the field and how these should be simulated. In addition, there are new data sources, such as high-resolution remote sensing (including DEMs), new pattern analysis and comparison techniques, and continuous developments and results in dating sediment archives and erosion rates. The main goal of this renewed exercise will be to come up with a benchmarking methodology for comparing and judging model behaviour, including the issues of upscaling and downscaling of results. Model comparison may lead to the development of new research questions and to a firmer understanding of different models' performance under different circumstances.

  5. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned with the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
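
    The quantitative stage above — simulated annealing tuning kinetic rates until the model reproduces target behaviour — can be sketched on a toy system. The one-rate decay model, the cooling schedule, and all constants are illustrative assumptions; the paper's models are full biochemical networks.

```python
# Hedged sketch of the quantitative stage: simulated annealing fits one kinetic
# rate so a toy first-order decay model matches a target time series.
import math
import random


def simulate(rate, t=(0.0, 1.0, 2.0, 3.0)):
    """Toy model: first-order decay x(t) = exp(-rate * t) sampled at times t."""
    return [math.exp(-rate * ti) for ti in t]


def cost(rate, target):
    """Sum-of-squares mismatch between model output and target behaviour."""
    return sum((a - b) ** 2 for a, b in zip(simulate(rate), target))


def anneal(target, rate=2.0, temp=1.0, cooling=0.95, steps=500, seed=1):
    """Annealing loop: perturb the rate, accept downhill moves always and
    uphill moves with probability exp(-delta / temp); cool geometrically."""
    rng = random.Random(seed)
    best, best_c = rate, cost(rate, target)
    cur, cur_c = best, best_c
    for _ in range(steps):
        cand = cur + rng.gauss(0.0, 0.1)
        cand_c = cost(cand, target)
        if cand_c < cur_c or rng.random() < math.exp((cur_c - cand_c) / temp):
            cur, cur_c = cand, cand_c
            if cur_c < best_c:
                best, best_c = cur, cur_c
        temp *= cooling
    return best


true_rate = 0.7
fitted = anneal(simulate(true_rate))   # should move toward the true rate
```

    In the paper's framework, the qualitative stage would first fix which reactions exist; annealing then only has to search the (much smaller) space of rate constants.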

  6. A scalable delivery framework and a pricing model for streaming media with advertisements

    NASA Astrophysics Data System (ADS)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The pricing is determined based on the total ads' viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and various scheduling policies through extensive simulation in terms of numerous metrics, including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.
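
    The pricing rule sketched here follows the spirit of the abstract: ad revenue subsidizes the content price in proportion to total ad viewing time. The linear form and the constants are assumptions for the illustration, not the paper's actual model.

```python
# Illustrative subsidized-pricing rule: each second of ads viewed reduces the
# content price by that second's revenue, floored at zero (fully ad-supported).
# The linear relationship and the numbers are sketch assumptions.
def subsidized_price(base_price, ad_seconds, revenue_per_second):
    """Price charged to the client after the ad subsidy is applied."""
    return max(0.0, base_price - ad_seconds * revenue_per_second)


partly_subsidized = subsidized_price(3.00, 60, 0.02)    # some ads viewed
fully_subsidized = subsidized_price(3.00, 300, 0.02)    # enough ads to cover cost
```

    A scheduler can then trade off the metrics the paper measures: more ads per client lowers price but raises defection probability.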

  7. BioASF: a framework for automatically generating executable pathway models specified in BioPAX

    PubMed Central

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K. Anton; Abeln, Sanne; Heringa, Jaap

    2016-01-01

    Motivation: Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. Results: To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. Availability and Implementation: The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl PMID:27307645

  8. A framework for regional modeling of past climates

    NASA Astrophysics Data System (ADS)

    Sloan, L. C.

    2006-09-01

    The methods of reconstructing ancient climate information from the rock record are summarized, and the climate forcing factors that have been active at global and regional scales through Earth history are reviewed. In this context, the challenges and approaches to modeling past climates by using a regional climate model are discussed. A significant challenge to such modeling efforts arises if the time period of interest occurred prior to the past ~3-5 million years, at which point land-sea distributions and topography markedly different from present must be specified at the spatial resolution required by regional climate models. Creating these boundary conditions requires a high degree of geologic knowledge, and also depends greatly upon the global climate model driving conditions. Despite this and other challenges, regional climate models represent an important and unique tool for paleoclimate investigations. Application of regional climate models to paleoclimate studies may provide another way to assess the overall performance of regional climate models.

  9. A general computational framework for modeling cellular structure and function.

    PubMed Central

    Schaff, J; Fink, C C; Slepchenko, B; Carson, J H; Loew, L M

    1997-01-01

    The "Virtual Cell" provides a general system for testing cell biological mechanisms and creates a framework for encapsulating the burgeoning knowledge base comprising the distribution and dynamics of intracellular biochemical processes. It approaches the problem by associating biochemical and electrophysiological data describing individual reactions with experimental microscopic image data describing their subcellular localizations. Individual processes are collected within a physical and computational infrastructure that accommodates any molecular mechanism expressible as rate equations or membrane fluxes. An illustration of the method is provided by a dynamic simulation of IP3-mediated Ca2+ release from endoplasmic reticulum in a neuronal cell. The results can be directly compared to experimental observations and provide insight into the role of experimentally inaccessible components of the overall mechanism. PMID:9284281
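
    "Any molecular mechanism expressible as rate equations" can be made concrete with a minimal example: one rate equation for cytosolic Ca2+, integrated with forward Euler. The fluxes and constants are invented for the sketch, not Virtual Cell parameters or measured values.

```python
# Minimal rate-equation sketch in the Virtual Cell spirit: cytosolic Ca2+ driven
# by a constant release flux from the ER minus a linear pump term,
#   d[Ca]/dt = J_release - k_pump * [Ca],
# integrated with forward Euler. All values are illustrative, not measured.
def simulate_calcium(j_release=2.0, k_pump=0.5, ca0=0.1, dt=0.01, steps=2000):
    """Integrate the rate equation and return the final concentration."""
    ca = ca0
    for _ in range(steps):
        ca += dt * (j_release - k_pump * ca)
    return ca


steady_state = simulate_calcium()   # analytically approaches J_release / k_pump = 4.0
```

    A framework like the one described couples many such equations and localizes each on experimental image geometry; the numerics per mechanism remain this simple in principle.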

  10. An Integration and Evaluation Framework for ESPC Coupled Models

    DTIC Science & Technology

    2014-09-30

    application a version of the Community Earth System Model (CESM) running an optimized version of the HYbrid Coordinate Ocean Model (HYCOM...architectures and program needs, and to initiate scientific committees. APPROACH A one-year seed project entitled Optimized Infrastructure for...a component architecture like ESMF for accelerators; and coupling the HYbrid Coordinate Ocean Model (HYCOM) to the Community Earth System Model (CESM

  11. Multiple-species analysis of point count data: A more parsimonious modelling framework

    USGS Publications Warehouse

    Alldredge, M.W.; Pollock, K.H.; Simons, T.R.; Shriner, S.A.

    2007-01-01

    1. Although population surveys often provide information on multiple species, these data are rarely analysed within a multiple-species framework despite the potential for more efficient estimation of population parameters. 2. We have developed a multiple-species modelling framework that uses similarities in capture/detection processes among species to model multiple species data more parsimoniously. We present examples of this approach applied to distance, time of detection and multiple observer sampling for avian point count data. 3. Models that included species as a covariate and individual species effects were generally selected as the best models for distance sampling, but group models without species effects performed best for the time of detection and multiple observer methods. Population estimates were more precise for no-species-effect models than for species-effect models, demonstrating the benefits of exploiting species' similarities when modelling multiple species data. Partial species-effect models and additive models were also useful because they modelled similarities among species while allowing for species differences. 4. Synthesis and applications. We recommend the adoption of multiple-species modelling because of its potential for improved population estimates. This framework will be particularly beneficial for modelling count data from rare species because information on the detection process can be 'borrowed' from more common species. The multiple-species modelling framework presented here is applicable to a wide range of sampling techniques and taxa. © 2007 The Authors.
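
    The 'borrowing' idea can be sketched numerically: a rare species with too few detections to estimate its own detection probability uses a probability pooled mostly from common species. The two-observer counts and the crude pooled estimator below are invented for illustration and are much simpler than the paper's models.

```python
# Sketch of borrowing detection information across species. For each species we
# have (seen_by_both_observers, seen_by_either_observer); pooling across species
# lets abundant species dominate the shared detection probability, which a rare
# species then borrows. Estimator and counts are illustrative assumptions.
def shared_detection_probability(double_counts):
    """Pooled detection probability from two-observer counts across species."""
    both = sum(b for b, e in double_counts)
    either = sum(e for b, e in double_counts)
    return both / either


def abundance_estimate(raw_count, p_detect):
    """Correct a raw count for imperfect detection: N_hat = C / p."""
    return raw_count / p_detect


# Two common species contribute most of the data; the rare species (2, 3) borrows.
p = shared_detection_probability([(80, 100), (40, 50), (2, 3)])
rare_n = abundance_estimate(3, p)
```

    The group models the paper favours do this formally inside one likelihood, with partial species effects available when species genuinely differ.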

  12. A MODELLING FRAMEWORK FOR MERCURY CYCLING IN LAKE MICHIGAN

    EPA Science Inventory

    A time-dependent mercury model was developed to describe mercury cycling in Lake Michigan. The model addresses dynamic relationships between net mercury loadings and the resulting concentrations of mercury species in the water and sediment. The simplified predictive modeling fram...

  13. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

    Recently, a smallholder socio-hydrological modelling framework was proposed and deployed to understand the underlying dynamics of the agrarian crisis in the Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further expands the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as paddy, jowar and soyabean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork, 50 smallholders will be interviewed, and the socio-hydrological assumptions on the hydrology and capital equations and the corresponding closure relationships incorporated in the current model will be put to the test. Besides testing the assumptions, the questionnaires will be used to better understand the hydrological reality of the smallholders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step guiding subsequent research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.

  14. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models: parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (e.g., high explosive, rocket propellant,...) is then presented using a directed graph model.
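
    Of the three approaches named, the response-curve one is the easiest to sketch: evaluate the expensive deterministic model once on a parameter grid, then let the probabilistic layer sample the cheap interpolated curve. The stand-in physics model and the uniform input distribution are assumptions for the illustration.

```python
# Sketch of the response-curve approach: precompute the deterministic model on a
# grid, then propagate an uncertain input through linear interpolation of that
# curve instead of re-running the expensive code inside the Monte Carlo loop.
import bisect
import random


def expensive_deterministic_model(x):
    """Stand-in for a slow physics code: response as a function of one parameter."""
    return x ** 2 + 1.0


# Build the response curve once, offline.
grid = [i * 0.1 for i in range(0, 101)]                 # parameter range 0..10
curve = [expensive_deterministic_model(x) for x in grid]


def response(x):
    """Cheap surrogate: linear interpolation on the precomputed curve."""
    i = min(max(bisect.bisect_left(grid, x), 1), len(grid) - 1)
    x0, x1 = grid[i - 1], grid[i]
    y0, y1 = curve[i - 1], curve[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)


# Probabilistic layer: sample an uncertain input and read the curve.
rng = random.Random(0)
samples = [response(rng.uniform(0.0, 10.0)) for _ in range(1000)]
mean_response = sum(samples) / len(samples)
```

    Parameter range binning coarsens this to one representative value per bin, while an integral deterministic model would call the physics code directly inside the probabilistic model; the curve is the usual middle ground between fidelity and cost.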

  15. Computational Morphodynamics: A modeling framework to understand plant growth

    PubMed Central

    Chickarmane, Vijay; Roeder, Adrienne H.K.; Tarr, Paul T.; Cunha, Alexandre; Tobin, Cory; Meyerowitz, Elliot M.

    2014-01-01

    Computational morphodynamics utilizes computer modeling to understand the development of living organisms over space and time. Results from biological experiments are used to construct accurate and predictive models of growth. These models are then used to make novel predictions providing further insight into the processes in question, which can be tested experimentally to either confirm or rule out the validity of the computational models. This review highlights two fundamental issues: (1) models should span and integrate single-cell behavior with tissue development, and (2) the necessity to understand the feedback between mechanics of growth and chemical or molecular signaling. We review different approaches to model plant growth and discuss a variety of model types that can be implemented, with the aim of demonstrating how this methodology can be used to explore the morphodynamics of plant development. PMID:20192756

  16. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
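
    The first model type in the framework — Chen and Aihara's CSA with decaying self-coupling — can be sketched for a single neuron: a large self-feedback term z(t) produces transiently chaotic search and decays by a factor (1 - beta) each step, annealing toward conventional convergent dynamics. The parameter values follow common choices in the CSA literature but are illustrative, and a real solver couples many such neurons through the problem's energy function.

```python
# Single-neuron sketch of chaotic simulated annealing with decaying self-coupling:
# the self-feedback z(t) starts large (chaotic regime) and decays geometrically,
# so the dynamics anneal toward Hopfield-style convergence. Values illustrative.
import math


def csa_neuron(steps=300, k=0.9, alpha=0.015, i0=0.65, beta=0.01,
               z0=0.08, y0=0.283, eps=0.004, inp=0.5):
    """Iterate one chaotic neuron; return the output trajectory and final z."""
    y, z, xs = y0, z0, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))      # sigmoid neuron output
        y = k * y + alpha * inp - z * (x - i0)    # internal state update
        z = (1.0 - beta) * z                       # decaying self-coupling
        xs.append(x)
    return xs, z


trajectory, z_final = csa_neuron()
```

    In the unifying framework, the other two model types correspond to decaying the timestep instead of z, or to adding decaying chaotic noise to a standard Hopfield update.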

  17. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    NASA Astrophysics Data System (ADS)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
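
    The simplified access pattern described above can be illustrated as a generated Lucene query. The field names (`species`, `location`) are hypothetical stand-ins; the actual schema of a MERGANSER-like SOLR index is not specified in the abstract.

```python
# Illustrative front-end step: turn the user's two inputs (a species and a place)
# into a Lucene query string for a SOLR index. Field names are invented stand-ins
# for whatever schema the real system uses.
def species_niche_query(species: str, place: str) -> str:
    """Compose a Lucene query selecting occurrence records for one species/place."""
    return f'species:"{species}" AND location:"{place}"'


q = species_niche_query("Anas platyrhynchos", "Lake Ontario")
```

    Hiding query construction behind a two-field form is what reduces "run a niche model" to entering a place and a species.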

  18. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGES

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  19. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    SciTech Connect

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  20. Integration of the Radiation Belt Environment Model Into the Space Weather Modeling Framework

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Toth, G.; Fok, M.; Gombosi, T.; Liemohn, M.

    2009-01-01

    We have integrated the Fok radiation belt environment (RBE) model into the space weather modeling framework (SWMF). RBE is coupled to the global magnetohydrodynamics component (represented by the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme, BATS-R-US, code) and the Ionosphere Electrodynamics component of the SWMF, following initial results using the Weimer empirical model for the ionospheric potential. The radiation belt (RB) model solves the convection-diffusion equation of the plasma in the energy range of 10 keV to a few MeV. In stand-alone mode RBE uses Tsyganenko's empirical models for the magnetic field, and Weimer's empirical model for the ionospheric potential. In the SWMF the BATS-R-US model provides the time dependent magnetic field by efficiently tracing the closed magnetic field-lines and passing the geometrical and field strength information to RBE at a regular cadence. The ionosphere electrodynamics component uses a two-dimensional vertical potential solver to provide new potential maps to the RBE model at regular intervals. We discuss the coupling algorithm and show some preliminary results with the coupled code. We run our newly coupled model for periods of steady solar wind conditions and compare our results to the RB model using an empirical magnetic field and potential model. We also simulate the RB for an active time period and find that there are substantial differences in the RB model results when changing either the magnetic field or the electric field, including the creation of an outer belt enhancement via rapid inward transport on the time scale of tens of minutes.

  1. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...

  2. Modeling in Chemistry as Cultural Practice: A Theoretical Framework with Implications for Chemistry Education. Draft.

    ERIC Educational Resources Information Center

    Erduran, Sibel

    This paper reports on an interdisciplinary theoretical framework for the characterization of models and modeling that can be useful in application to chemistry education. The underlying argument marks a departure from an emphasis on concepts that are the outcomes of chemical inquiry about how knowledge growth occurs through modeling in chemistry.…

  3. Multi-Fidelity Framework for Modeling Combustion Instability

    DTIC Science & Technology

    2016-07-27

    widely used in premixed combustion for gas turbine engine application [2-5]. You et al. [6] developed an analytical model based on a level-set...Model of Acoustic Response of Turbulent Premixed Flame and Its Application to Gas-Turbine Combustion Instability Analysis," Combustion Science and

  4. Physical Models of Galaxy Formation in a Cosmological Framework

    NASA Astrophysics Data System (ADS)

    Somerville, Rachel S.; Davé, Romeel

    2015-08-01

    Modeling galaxy formation in a cosmological context presents one of the greatest challenges in astrophysics today due to the vast range of scales and numerous physical processes involved. Here we review the current status of models that employ two leading techniques to simulate the physics of galaxy formation: semianalytic models and numerical hydrodynamic simulations. We focus on a set of observational targets that describe the evolution of the global and structural properties of galaxies from roughly cosmic high noon (z ∼ 2-3) to the present. Although minor discrepancies remain, overall, models show remarkable convergence among different methods and make predictions that are in qualitative agreement with observations. Modelers have converged on a core set of physical processes that are critical for shaping galaxy properties. This core set includes cosmological accretion, strong stellar-driven winds that are more efficient at low masses, black hole feedback that preferentially suppresses star formation at high masses, and structural and morphological evolution through merging and environmental processes. However, all cosmological models currently adopt phenomenological implementations of many of these core processes, which must be tuned to observations. Many details of how these diverse processes interact within a hierarchical structure formation setting remain poorly understood. Emerging multiscale simulations are helping to bridge the gap between stellar and cosmological scales, placing models on a firmer, more physically grounded footing. Concurrently, upcoming telescope facilities will provide new challenges and constraints for models, particularly by directly constraining inflows and outflows through observations of gas in and around galaxies.

  5. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, allowing easy re-creation of the process without complex programming.
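
    The rand()-style uncertain durations and first-in-first-served routing described above can be sketched outside a spreadsheet as well. This is a minimal sketch under assumed inputs: the activity names and duration ranges below are illustrative placeholders, not the paper's 28 modeling elements.

```python
import random

# Hypothetical activities with (name, min, max) duration bounds in minutes.
ACTIVITIES = [("admission", 10, 30), ("surgery", 60, 180), ("recovery", 120, 360)]

def simulate_patients(n_patients, seed=0):
    """First-in-first-served flow: each activity serves one patient at a time,
    with rand()-style uniform random durations per activity."""
    rng = random.Random(seed)
    free_at = {name: 0.0 for name, _, _ in ACTIVITIES}  # when each activity is next free
    finish_times = []
    for _ in range(n_patients):
        t = 0.0  # all patients arrive at time 0 (simplification)
        for name, lo, hi in ACTIVITIES:
            start = max(t, free_at[name])      # wait until the activity is free (FIFO)
            duration = rng.uniform(lo, hi)     # uncertain duration, as with rand()
            t = start + duration
            free_at[name] = t
        finish_times.append(t)
    return finish_times

times = simulate_patients(5)
```

    Because every activity serves patients in arrival order, the finish times come out nondecreasing, mirroring the spreadsheet's row-by-row patient tracking.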

  6. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  7. An integrated hydrologic modeling framework for coupling SWAT with MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT), MODFLOW, and Energy Balance based Evapotranspiration (EB_ET) models are extensively used to estimate different components of the hydrological cycle. Surface and subsurface hydrological processes are modeled in SWAT but limited to the extent of shallow aquif...

  8. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    SciTech Connect

    JIANG, YI

    2007-01-16

    Cancer remains one of the leading causes of death from disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulation, cellular-level dynamics and intercellular interactions, and extracellular-level chemical dynamics. The intracellular protein regulations and signaling pathways are described by Boolean networks. The cellular-level growth and division dynamics, cellular adhesion, and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data for both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
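
    As a minimal illustration of the Boolean-network layer used for intracellular regulation, here is a three-node synchronous network. The wiring (growth signal, inhibitor, division) is invented for illustration and is not the paper's actual signaling pathway.

```python
# Synchronous Boolean network: every node updates from the previous state.
def step(state):
    growth, inhibitor, divide = state
    return (
        not inhibitor,             # growth signal is active unless inhibited
        divide,                    # inhibitor switches on after division
        growth and not inhibitor,  # division requires growth and no inhibitor
    )

state = (True, False, False)
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)
```

    Even this toy network shows the qualitative behavior such layers contribute: the state visits a closed cycle (here of length five), giving sustained oscillation rather than a fixed point.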

  9. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  10. A Model Framework for Course Materials Construction (Second Edition).

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    Designed for use by Coast Guard course writers, curriculum developers, course coordinators, and instructors as a decision-support system, this publication presents a model that translates the Intraservices Procedures for Instructional Systems Development curriculum design model into materials usable by classroom teachers and students. Although…

  11. A Framework for Non-Gaussian Signal Modeling and Estimation

    DTIC Science & Technology

    1999-06-01

    1993. [38] B. P. Carlin, N. G. Polson, and D. S. Stoffer, "A Monte Carlo approach to nonnormal and nonlinear state-space modeling," Journal of the...NJ: Prentice-Hall, 1992. [198] J. R. Thompson, Empirical Model Building. New York: John Wiley & Sons, 1989. [199] J. R. Thompson and R. A. Tapia

  12. Developing an Interdisciplinary Curriculum Framework for Aquatic-Ecosystem Modeling

    ERIC Educational Resources Information Center

    Saito, Laurel; Segale, Heather M.; DeAngelis, Donald L.; Jenkins, Stephen H.

    2007-01-01

    This paper presents results from a July 2005 workshop and course aimed at developing an interdisciplinary course on modeling aquatic ecosystems that will provide the next generation of practitioners with critical skills for which formal training is presently lacking. Five different course models were evaluated: (1) fundamentals/general principles…

  13. A Framework for Modeling Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  14. Deep Modeling: Circuit Characterization Using Theory Based Models in a Data Driven Framework

    SciTech Connect

    Bolme, David S; Mikkilineni, Aravind K; Rose, Derek C; Yoginath, Srikanth B; Holleman, Jeremy; Judy, Mohsen

    2017-01-01

    Analog computational circuits have been demonstrated to provide substantial improvements in power and speed relative to digital circuits, especially for applications requiring extreme parallelism but only modest precision. Deep machine learning is one such area and stands to benefit greatly from analog and mixed-signal implementations. However, even at modest precisions, offsets and non-linearity can degrade system performance. Furthermore, in all but the simplest systems, it is impossible to directly measure the intermediate outputs of all sub-circuits. The result is that circuit designers are unable to accurately evaluate the non-idealities of computational circuits in-situ and are therefore unable to fully utilize measurement results to improve future designs. In this paper we present a technique to use deep learning frameworks to model physical systems. Recently developed libraries like TensorFlow make it possible to use back propagation to learn parameters in the context of modeling circuit behavior. Offsets and scaling errors can be discovered even for sub-circuits that are deeply embedded in a computational system and not directly observable. The learned parameters can be used to refine simulation methods or to identify appropriate compensation strategies. We demonstrate the framework using a mixed-signal convolution operator as an example circuit.
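
    The idea of back-propagating through a circuit model to recover a hidden sub-circuit's non-idealities can be sketched even without TensorFlow, using hand-written gradients. The squarer-after-multiplier circuit and the "true" gain/offset values below are invented for illustration, not taken from the paper.

```python
# Circuit model: an inner multiplier with unknown gain/offset feeds a squarer,
# so the multiplier's output is not directly observable.
def simulate_circuit(x, gain, offset):
    return (gain * x + offset) ** 2

# "Measured" data from hypothetical hardware with true gain 0.9, offset 0.05.
xs = [i / 10 for i in range(-10, 11)]
ys = [simulate_circuit(x, 0.9, 0.05) for x in xs]

gain, offset = 1.0, 0.0  # nominal design values as the starting point
lr = 0.05
for _ in range(2000):
    dg = do = 0.0
    for x, y in zip(xs, ys):
        inner = gain * x + offset
        err = inner ** 2 - y
        dg += 2 * err * 2 * inner * x / len(xs)  # d(err^2)/d(gain)
        do += 2 * err * 2 * inner / len(xs)      # d(err^2)/d(offset)
    gain -= lr * dg
    offset -= lr * do
```

    Gradient descent recovers the embedded parameters from end-to-end measurements alone, which is the same mechanism a deep learning framework automates via back propagation.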

  15. The Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP): project framework.

    PubMed

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-03-04

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up.

  16. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  17. A big-microsite framework for soil carbon modeling.

    PubMed

    Davidson, Eric A; Savage, Kathleen E; Finzi, Adrien C

    2014-12-01

    Soil carbon cycling processes potentially play a large role in biotic feedbacks to climate change, but little agreement exists at present on what the core of numerical soil C cycling models should look like. In contrast, most canopy models of photosynthesis and leaf gas exchange share a common 'Farquhar-model' core structure. Here, we explore why a similar core model structure for heterotrophic soil respiration remains elusive and how a pathway to that goal might be envisioned. The spatial and temporal variation in soil microsite conditions greatly complicates modeling efforts, but we believe it is possible to develop a tractable number of parameterizable equations that are organized into a coherent, modular, numerical model structure. First, we show parallels in insights gleaned from linking Arrhenius and Michaelis-Menten kinetics for both photosynthesis and soil respiration. Additional equations and layers of complexity are then added to simulate substrate supply. For soils, model modules that simulate carbon stabilization processes will be key to estimating the fraction of soil C that is accessible to enzymes. Potential modules for dynamic photosynthate input, wetting-event inputs, freeze-thaw impacts on substrate diffusion, aggregate turnover, soluble-C sorption, gas transport, methane respiration, and microbial dynamics are described for conceptually and numerically linking our understanding of fast-response processes of soil gas exchange with longer-term dynamics of soil carbon and nitrogen stocks.
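
    The linked Arrhenius and Michaelis-Menten kinetics mentioned above can be written down compactly as a product of a temperature response and a substrate limitation. The parameter values below are placeholders for illustration, not fitted constants from the paper.

```python
import math

R_GAS = 8.314  # universal gas constant, J mol^-1 K^-1

def respiration(temp_k, substrate, alpha=5e8, ea=6.5e4, km=1e-4):
    """Heterotrophic respiration rate as Arrhenius temperature sensitivity
    times Michaelis-Menten substrate limitation. alpha (pre-exponential),
    ea (activation energy), and km (half-saturation) are illustrative."""
    arrhenius = alpha * math.exp(-ea / (R_GAS * temp_k))
    michaelis_menten = substrate / (km + substrate)
    return arrhenius * michaelis_menten

# Warming raises the rate; substrate depletion caps it.
warm = respiration(293.15, 1e-3)
cool = respiration(283.15, 1e-3)
starved = respiration(293.15, 1e-5)
```

    The product form captures the key interaction: temperature sensitivity is only expressed when enzyme-accessible substrate is available, which is why the stabilization modules described above matter for the overall model.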

  18. The Community Earth System Model: A Framework for Collaborative Research

    SciTech Connect

    Hurrell, Jim; Holland, Marika M.; Gent, Peter R.; Ghan, Steven J.; Kay, Jennifer; Kushner, P.; Lamarque, J.-F.; Large, William G.; Lawrence, David M.; Lindsay, Keith; Lipscomb, William; Long , Matthew; Mahowald, N.; Marsh, D.; Neale, Richard; Rasch, Philip J.; Vavrus, Steven J.; Vertenstein, Mariana; Bader, David C.; Collins, William D.; Hack, James; Kiehl, J. T.; Marshall, Shawn

    2013-09-30

    The Community Earth System Model (CESM) is a flexible and extensible community tool used to investigate a diverse set of earth system interactions across multiple time and space scales. This global coupled model is a natural evolution from its predecessor, the Community Climate System Model, following the incorporation of new earth system capabilities. These include the ability to simulate biogeochemical cycles, atmospheric chemistry, ice sheets, and a high-top atmosphere. These and other new model capabilities are enabling investigations into a wide range of pressing scientific questions, providing new predictive capabilities and increasing our collective knowledge about the behavior and interactions of the earth system. Simulations with numerous configurations of the CESM have been provided to the Coupled Model Intercomparison Project Phase 5 (CMIP5) and are being analyzed by the broader community of scientists. Additionally, the model source code and associated documentation are freely available to the scientific community to use for earth system studies, making it a true community tool. Here we describe this earth modeling system, its various possible configurations, and illustrate its capabilities with a few science highlights.

  19. Brokering as a framework for hydrological model repeatability

    NASA Astrophysics Data System (ADS)

    Fuka, Daniel; Collick, Amy; MacAlister, Charlotte; Braeckel, Aaron; Wright, Dawn; Jodha Khalsa, Siri; Boldrini, Enrico; Easton, Zachary

    2015-04-01

    Data brokering aims to provide those in the sciences with quick and repeatable access to data representing physical, biological, and chemical characteristics, specifically to accelerate scientific discovery. Environmental models are useful tools to understand the behavior of hydrological systems. Unfortunately, parameterization of these hydrological models requires many different data, from different sources and different disciplines (e.g., atmospheric science, geoscience, ecology). In basin-scale hydrological modeling, the traditional procedure for model initialization starts with obtaining elevation models, land-use characterizations, soils maps, and weather data. It is often the researcher's past experience with these datasets that determines which datasets will be used in a study, even though newer or more suitable data products often exist. An added complexity is that various science communities have differing data formats, storage protocols, and manipulation methods, which makes use by a non-native user exceedingly difficult and time consuming. We demonstrate data brokering as a means to address several of these challenges. We present two test-case scenarios in which researchers attempt to reproduce hydrological model results using 1) general internet-based data gathering techniques, and 2) a scientific data brokering interface. We show that data brokering can increase the efficiency with which data are obtained, models are initialized, and results are analyzed. As an added benefit, brokering appears to significantly increase the repeatability of a given study.

  20. A framework to establish credibility of computational models in biology.

    PubMed

    Patterson, Eann A; Whelan, Maurice P

    2016-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed and the creation of 'digital twins' proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance.

  1. Integrated Modeling, Mapping, and Simulation (IMMS) framework for planning exercises.

    SciTech Connect

    Friedman-Hill, Ernest J.; Plantenga, Todd D.

    2010-06-01

    The Integrated Modeling, Mapping, and Simulation (IMMS) program is designing and prototyping a simulation and collaboration environment for linking together existing and future modeling and simulation tools to enable analysts, emergency planners, and incident managers to more effectively, economically, and rapidly prepare, analyze, train, and respond to real or potential incidents. When complete, the IMMS program will demonstrate an integrated modeling and simulation capability that supports emergency managers and responders with (1) conducting 'what-if' analyses and exercises to address preparedness, analysis, training, operations, and lessons learned, and (2) effectively, economically, and rapidly verifying response tactics, plans and procedures.

  2. Next Generation Framework for Aquatic Modeling of the Earth System (NextFrAMES)

    NASA Astrophysics Data System (ADS)

    Fekete, B. M.; Wollheim, W. M.; Lakhankar, T.; Vorosmarty, C. J.

    2008-12-01

    Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the surrounding IT infrastructure needed to carry out these detailed model computations is growing increasingly complex as well. To be accurate and useful, Earth System models must manage a vast amount of data in heterogeneous computing environments ranging from single-CPU systems to Beowulf-type computer clusters. Scientists developing Earth System models increasingly confront obstacles associated with IT infrastructure. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. Over the course of the last fifteen years, the University of New Hampshire developed several modeling frameworks independently from the above-mentioned efforts (Data Assembler, Frameworks for Aquatic Modeling of the Earth System, and NextFrAMES, which is continued at CCNY). While the UNH modeling frameworks have numerous similarities to those developed by other teams, these frameworks, in particular the latest NextFrAMES, represent a novel model development paradigm. While other modeling frameworks focus on providing services to modelers to perform various tasks, NextFrAMES strives to hide all of those services and provide a new approach for modelers to express their scientific thoughts.
From a scientific perspective, most models have two core elements: the overall model structure (defining the linkages between the simulated processes

  3. Model Components of the Certification Framework for Geologic Carbon Sequestration Risk Assessment

    SciTech Connect

    Oldenburg, Curtis M.; Bryant, Steven L.; Nicot, Jean-Philippe; Kumar, Navanit; Zhang, Yingqi; Jordan, Preston; Pan, Lehua; Granvold, Patrick; Chow, Fotini K.

    2009-06-01

    We have developed a framework for assessing the leakage risk of geologic carbon sequestration sites. This framework, known as the Certification Framework (CF), emphasizes wells and faults as the primary potential leakage conduits. Vulnerable resources are grouped into compartments, and impacts due to leakage are quantified by the leakage flux or concentrations that could potentially occur in compartments under various scenarios. The CF utilizes several model components to simulate leakage scenarios. One model component is a catalog of results of reservoir simulations that can be queried to estimate plume travel distances and times, rather than requiring CF users to run new reservoir simulations for each case. Other model components developed for the CF and described here include fault characterization using fault-population statistics; fault connection probability using fuzzy rules; well-flow modeling with a drift-flux model implemented in TOUGH2; and atmospheric dense-gas dispersion using a mesoscale weather prediction code.

  4. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    SciTech Connect

    Morrison, M.L.; Pollock, K.H.

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  5. A model integration framework for linking SWAT and MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hydrological response and transport phenomena are driven by atmospheric, surface and subsurface processes. These complex processes occur at different spatiotemporal scales requiring comprehensive modeling to assess the impact of anthropogenic activity on hydrology and fate and transport of chemical ...

  6. Effective Thermal Conductivity Modeling of Sandstones: SVM Framework Analysis

    NASA Astrophysics Data System (ADS)

    Rostami, Alireza; Masoudi, Mohammad; Ghaderi-Ardakani, Alireza; Arabloo, Milad; Amani, Mahmood

    2016-06-01

    Among the most significant physical characteristics of porous media, the effective thermal conductivity (ETC) is used for estimating thermal enhanced oil recovery process efficiency, hydrocarbon reservoir thermal design, and numerical simulation. This paper reports the implementation of an innovative least-squares support vector machine (LS-SVM) algorithm for the development of an enhanced model capable of predicting the ETCs of dry sandstones. The validity of the presented model was evaluated by means of several statistical parameters. The model's predictions of the ETCs of dry sandstones were in excellent agreement with the reported data, with a coefficient of determination (R²) of 0.983 and an average absolute relative deviation of 0.35%. Results from the present research show that the proposed LS-SVM model is robust, reliable, and efficient in calculating the ETCs of sandstones.
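
    The core of an LS-SVM regressor is that training reduces to a single linear solve rather than a quadratic program, which is its main practical difference from a standard SVM. A schematic re-implementation with an RBF kernel and 1-D toy data (not the paper's sandstone dataset or hyperparameters) might look like:

```python
import numpy as np

def lssvm_fit(X, y, gamma=1e4, sigma=0.2):
    """LS-SVM regression: solve [0 1^T; 1 K + I/gamma][b; a] = [0; y]."""
    n = len(y)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))            # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma             # regularized kernel block
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        sq_q = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq_q / (2 * sigma ** 2)) @ alpha + b

    return predict

# Fit a noiseless 1-D toy function; the model should reproduce it closely.
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X[:, 0])
predict = lssvm_fit(X, y)
err = float(np.max(np.abs(predict(X) - y)))
```

    In an application like the ETC study, X would hold rock properties and y measured conductivities, with gamma and sigma tuned by cross-validation.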

  7. A Model Framework for Science and Other Course Materials Construction.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model is presented to provide guidance for Coast Guard writers, curriculum developers, course coordinators, and instructors who intend to update, or draft course materials. Detailed instructions are provided for developing instructor's guides and student's guides. (CS)

  8. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  9. C-HiLasso: A Collaborative Hierarchical Sparse Modeling Framework

    DTIC Science & Technology

    2010-06-01

    structural constraints to this active set has value both at the level of representation robustness and at the level of signal interpretation (in particular...structure (and robustness ) to the problem is to consider the simultaneous encoding of multiple signals, requesting that they all share the same...in the set, which translates into robustness in the model (class) selection. As with models such as Lasso and Group Lasso, the optimal parameters λ1

  10. Multiscale Model of Colorectal Cancer Using the Cellular Potts Framework

    PubMed Central

    Osborne, James M

    2015-01-01

    Colorectal cancer (CRC) is one of the major causes of death in the developed world and forms a canonical example of tumorigenesis. CRC arises from a string of mutations of individual cells in the colorectal crypt, making it particularly suited for multiscale multicellular modeling, where mutations of individual cells can be clearly represented and their effects readily tracked. In this paper, we present a multicellular model of the onset of colorectal cancer, utilizing the cellular Potts model (CPM). We use the model to investigate how, through the modification of their mechanical properties, mutant cells colonize the crypt. Moreover, we study the influence of mutations on the shape of cells in the crypt, suggesting possible cell- and tissue-level indicators for identifying early-stage cancerous crypts. Crucially, we discuss the effect that the motility parameters of the model (key factors in the behavior of the CPM) have on the distribution of cells within a homeostatic crypt, resulting in an optimal parameter regime that accurately reflects biological assumptions. In summary, the key results of this paper are 1) how to couple the CPM with processes occurring on other spatial scales, using the example of the crypt to motivate suitable motility parameters; 2) modeling mutant cells with the CPM; and 3) investigating how mutations influence the shape of cells in the crypt. PMID:26461973
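
    The elementary move of the CPM is a Metropolis spin-copy attempt: a lattice site tries to adopt a neighbor's label, and the change is accepted with a Boltzmann probability in the contact-energy difference. The sketch below uses invented contact energies and temperature, and omits the CPM's volume constraint for brevity.

```python
import math
import random

# Contact energies between "medium" (0) and "cell" (1) sites; illustrative values.
J = {(0, 0): 0.0, (0, 1): 16.0, (1, 1): 2.0}
T = 10.0  # effective temperature controlling membrane fluctuations

def contact_energy(grid):
    """Sum contact energies over unlike nearest neighbors (periodic lattice)."""
    n = len(grid)
    e = 0.0
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):
                a, b = grid[i][j], grid[(i + di) % n][(j + dj) % n]
                if a != b:
                    e += J[tuple(sorted((a, b)))]
    return e

def metropolis_step(grid, rng):
    """Try copying a random neighbor's label into a site; accept per Metropolis."""
    n = len(grid)
    i, j = rng.randrange(n), rng.randrange(n)
    di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    src = grid[(i + di) % n][(j + dj) % n]
    if src == grid[i][j]:
        return grid
    trial = [row[:] for row in grid]
    trial[i][j] = src
    d_e = contact_energy(trial) - contact_energy(grid)
    if d_e <= 0 or rng.random() < math.exp(-d_e / T):
        return trial
    return grid

# A single square "cell" relaxing on a 10x10 periodic lattice.
rng = random.Random(1)
grid = [[1 if 3 <= i <= 6 and 3 <= j <= 6 else 0 for j in range(10)] for i in range(10)]
for _ in range(200):
    grid = metropolis_step(grid, rng)
```

    Mutant cells in a model like the paper's are represented by giving some labels different contact energies or motility parameters, which changes how readily their boundaries fluctuate.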

  11. Model Adaptation for Prognostics in a Particle Filtering Framework

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works largely because they are not subject to the "curse of dimensionality", i.e. the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
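    The state-augmentation idea (folding a model parameter into the particle state) can be sketched as a bootstrap filter for a hypothetical capacity-fade model; the decay equation, noise levels, and variable names below are illustrative assumptions, not the authors' battery model.

```python
import math
import random

def particle_filter_step(particles, measurement, meas_std=0.05):
    """One bootstrap-filter update with the fade rate carried in the state."""
    proposed = []
    for cap, rate in particles:
        # The decay parameter receives a small random walk, so the model
        # adapts online while the capacity state is tracked.
        rate = rate + random.gauss(0.0, 1e-4)
        cap = cap * (1.0 - rate) + random.gauss(0.0, 0.01)
        proposed.append((cap, rate))
    # Weight each particle by a Gaussian measurement likelihood.
    weights = [math.exp(-0.5 * ((cap - measurement) / meas_std) ** 2)
               for cap, _ in proposed]
    total = sum(weights)
    if total == 0.0:  # degenerate case: fall back to uniform weights
        weights = [1.0] * len(proposed)
    # Resample proportionally to the weights.
    return random.choices(proposed, weights=weights, k=len(proposed))
```

    After a few updates the surviving particles concentrate around fade rates consistent with the data, which is the "tuned model" then used for long-term prediction.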

  12. Design theoretic analysis of three system modeling frameworks.

    SciTech Connect

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  13. A full annual cycle modeling framework for American black ducks

    USGS Publications Warehouse

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.; Brook, Rodney W.; Huang, Min; Jones, Malcom; McAuley, Daniel G.; Zimmerman, Guthrie S.

    2016-01-01

    American black ducks (Anas rubripes) are a harvested, international migratory waterfowl species in eastern North America. Despite an extended period of restrictive harvest regulations, the black duck population is still below the population goal identified in the North American Waterfowl Management Plan (NAWMP). It has been hypothesized that density-dependent factors restrict population growth in the black duck population and that habitat management (increases, improvements, etc.) may be a key component of growing black duck populations and reaching the prescribed NAWMP population goal. Using banding data from 1951 to 2011 and breeding population survey data from 1990 to 2014, we developed a full annual cycle population model for the American black duck. This model uses the seven management units as set by the Black Duck Joint Venture, allows movement into and out of each unit during each season, and models survival and fecundity for each region separately. We compare model population trajectories with observed population data and abundance estimates from the breeding season counts to show the accuracy of this full annual cycle model. With this model, we then show how to simulate the effects of habitat management on the continental black duck population.

  14. Towards uncertainty quantification and parameter estimation for Earth system models in a component-based modeling framework

    NASA Astrophysics Data System (ADS)

    Peckham, Scott D.; Kelbert, Anna; Hill, Mary C.; Hutton, Eric W. H.

    2016-05-01

    Component-based modeling frameworks make it easier for users to access, configure, couple, run and test numerical models. However, they do not typically provide tools for uncertainty quantification or data-based model verification and calibration. To better address these important issues, modeling frameworks should be integrated with existing, general-purpose toolkits for optimization, parameter estimation and uncertainty quantification. This paper identifies and then examines the key issues that must be addressed in order to make a component-based modeling framework interoperable with general-purpose packages for model analysis. As a motivating example, one of these packages, DAKOTA, is applied to a representative but nontrivial surface process problem of comparing two models for the longitudinal elevation profile of a river to observational data. Results from a new mathematical analysis of the resulting nonlinear least squares problem are given and then compared to results from several different optimization algorithms in DAKOTA.
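    The kind of model-to-data comparison described above can be illustrated with a toy nonlinear least-squares setup; the two elevation-profile models and the brute-force grid search below are hypothetical stand-ins for the paper's models and for DAKOTA's optimizers.

```python
import math

def exp_profile(x, z0, k):
    """Hypothetical model 1: exponentially declining elevation profile."""
    return z0 * math.exp(-k * x)

def linear_profile(x, z0, s):
    """Hypothetical model 2: linearly declining elevation profile."""
    return z0 - s * x

def sse(model, params, xs, obs):
    """Sum of squared residuals between a profile model and observations."""
    return sum((model(x, *params) - z) ** 2 for x, z in zip(xs, obs))

def grid_search(model, grid, xs, obs):
    """Brute-force least squares over a parameter grid (optimizer stand-in)."""
    return min(grid, key=lambda p: sse(model, p, xs, obs))
```

    Ranking the two models by their best attainable sum of squared errors on the same observations is the essence of the comparison, whatever optimizer produces the fits.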

  15. Microplane constitutive model and computational framework for blood vessel tissue.

    PubMed

    Caner, Ferhun C; Carol, Ignacio

    2006-06-01

    This paper presents a nonlinearly elastic anisotropic microplane formulation in 3D for computational constitutive modeling of arterial soft tissue in the passive regime. The constitutive modeling of arterial (and other biological) soft tissue is crucial for accurate finite element calculations, which in turn are essential for design of implants, surgical procedures, bioartificial tissue, as well as determination of effect of progressive diseases on tissues and implants. The model presented is defined at a lower scale (mesoscale) than the conventional macroscale and it incorporates the effect of all the (collagen) fibers which are anisotropic structural components distributed in all directions within the tissue material in addition to that of isotropic bulk tissue. It is shown that the proposed model not only reproduces Holzapfel's recent model but also improves on it by accounting for the actual three-dimensional distribution of fiber orientation in the arterial wall, which endows the model with advanced capabilities in simulation of remodeling of soft tissue. The formulation is flexible so that its parameters could be adjusted to represent the arterial wall either as a single material or a material composed of several layers in finite element analyses of arteries. Explicit algorithms for both the material subroutine and the explicit integration with dynamic relaxation of equations of motion using finite element method are given. To circumvent the slow convergence of the standard dynamic relaxation and small time steps dictated by the stability of the explicit integrator, an adaptive dynamic relaxation technique that ensures stability and fastest possible convergence rates is developed. Incompressibility is enforced using penalty method with an updated penalty parameter. The model is used to simulate experimental data from the literature demonstrating that the model response is in excellent agreement with the data. An experimental procedure to determine the

  16. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system, from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we started developing a new open source modeling environment, DeltaShell, a few years ago. It integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models, including generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular design based on open standards that allows for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  17. Introducing a boreal wetland model within the Earth System model framework

    NASA Astrophysics Data System (ADS)

    Getzieh, R. J.; Brovkin, V.; Reick, C.; Kleinen, T.; Raddatz, T.; Raivonen, M.; Sevanto, S.

    2009-04-01

    Wetlands of the northern high latitudes, with their low temperatures and waterlogged conditions, provide the conditions required for peat accumulation. They store at least 25% of the global soil organic carbon and currently constitute the largest natural source of methane. These boreal and subarctic peat carbon pools are sensitive to climate change, since the ratio of carbon sequestration to emission depends closely on hydrology and temperature. Global biogeochemistry models used for simulations of CO2 dynamics in past and future climates usually ignore changes in the peat storages. Our approach aims to evaluate the boreal wetland feedback to climate through the CO2 and CH4 fluxes on decadal to millennial time scales. A generic model of organic matter accumulation and decay in boreal wetlands is under development at the MPI for Meteorology in cooperation with the University of Helsinki. Our approach is to develop a wetland model which is consistent with the physical and biogeochemical components of the land surface module JSBACH as a part of the Earth System model framework ECHAM5-MPIOM-JSBACH. As prototypes, we use the modelling approach of Frolking et al. (2001) for the peat dynamics and the wetland model of Wania (2007) for vegetation cover and plant productivity. An initial distribution of wetlands follows the GLWD-3 map of Lehner and Döll (2004). First results of the modelling approach will be presented. References: Frolking, S. E., N. T. Roulet, T. R. Moore, P. J. H. Richard, M. Lavoie and S. D. Muller (2001): Modeling Northern Peatland Decomposition and Peat Accumulation, Ecosystems, 4, 479-498. Lehner, B., Döll, P. (2004): Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology 296 (1-4), 1-22. Wania, R. (2007): Modelling northern peatland land surface processes, vegetation dynamics and methane emissions. PhD thesis, University of Bristol, 122 pp.

  18. Implementation of a PETN failure model using ARIA's general chemistry framework

    SciTech Connect

    Hobbs, Michael L.

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.
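    As a rough illustration of what a mole-based reaction step in a general chemistry framework looks like, the sketch below advances a first-order Arrhenius decomposition by one explicit Euler step; the prefactor and activation energy are placeholder values, not PETN kinetics.

```python
import math

def arrhenius_rate(temperature, prefactor, activation_energy, R=8.314):
    """First-order rate constant k = A * exp(-Ea / (R * T))."""
    return prefactor * math.exp(-activation_energy / (R * temperature))

def decompose_step(moles, temperature, dt, prefactor=1.0e12,
                   activation_energy=1.8e5):
    """One explicit Euler step of mole-based first-order decomposition.

    Tracking moles rather than mass is the simplification the memo
    describes; the clamp keeps the species count non-negative.
    """
    k = arrhenius_rate(temperature, prefactor, activation_energy)
    return moles * max(0.0, 1.0 - k * dt)
```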

  19. Modeling Agriculture and Land Use in an Integrated Assessment Framework

    SciTech Connect

    Sands, Ronald D.; Leimbach, Marian

    2003-01-01

    The Agriculture and Land Use (AgLU) model is a top-down economic model with just enough structure to simulate global land use change and the resulting carbon emissions over one century. These simulations are done with and without a carbon policy represented by a positive carbon price. Increases in the carbon price create incentives for production of commercial biomass that affect the distribution of other land types and, therefore, carbon emissions from land use change. Commercial biomass provides a link between the agricultural and energy systems. The ICLIPS core model uses AgLU to provide estimates of carbon emissions from land use change as one component of total greenhouse gas emissions.

  20. A Flexible Atmospheric Modeling Framework for the CESM

    SciTech Connect

    Randall, David; Heikes, Ross; Konor, Celal

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs in only hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize the land-surface processes.

  1. A modelling and reasoning framework for social networks policies

    NASA Astrophysics Data System (ADS)

    Governatori, Guido; Iannella, Renato

    2011-02-01

    Policy languages (such as privacy and rights) have had little impact on the wider community. Now that social networks have taken off, the need to revisit policy languages and realign them towards social network requirements has become more apparent. We explore the applicability of one such language to mainstream social network use. We also argue that policy languages alone are not sufficient, and thus that they should be paired with reasoning mechanisms to provide precise and unambiguous execution models of the policies. To this end, we propose a computationally oriented model to represent, reason with and execute policies for social networks.

  2. COMPASS: A Framework for Automated Performance Modeling and Prediction

    SciTech Connect

    Lee, Seyong; Meredith, Jeremy S; Vetter, Jeffrey S

    2015-01-01

    Flexible, accurate performance predictions offer numerous benefits such as gaining insight into and optimizing applications and architectures. However, the development and evaluation of such performance predictions has been a major research challenge, due to the architectural complexities. To address this challenge, we have designed and implemented a prototype system, named COMPASS, for automated performance model generation and prediction. COMPASS generates a structured performance model from the target application's source code using automated static analysis, and then, it evaluates this model using various performance prediction techniques. As we demonstrate on several applications, the results of these predictions can be used for a variety of purposes, such as design space exploration, identifying performance tradeoffs for applications, and understanding sensitivities of important parameters. COMPASS can generate these predictions across several types of applications from traditional, sequential CPU applications to GPU-based, heterogeneous, parallel applications. Our empirical evaluation demonstrates a maximum overhead of 4%, flexibility to generate models for 9 applications, speed, ease of creation, and very low relative errors across a diverse set of architectures.

  3. Formulating the Brogden Classification Framework as a Discrete Choice Model

    DTIC Science & Technology

    2012-11-01

    that satisfy the job quota constraints using an MMNL parameter estimation algorithm. Biogeme (Bierle, 2003) model files for carrying out the...demand. Cambridge, MA: MIT Press. Bierle, M. (2003). An introduction to BIOGEME (Version 1.3) http://roso.epfl.ch/biogeme. Brogden, H. E. (1954). A

  4. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  5. Iberian (South American) Model of Judicial Review: Toward Conceptual Framework

    ERIC Educational Resources Information Center

    Klishas, Andrey A.

    2016-01-01

    The paper explores Latin American countries legislation with the view to identify specific features of South American model of judicial review. The research methodology rests on comparative approach to analyzing national constitutions' provisions and experts' interpretations thereof. The constitutional provisions of Brazil, Peru, Mexico, and…

  6. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  7. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. The framework was developed with a design-pattern-based methodology that improves the experience of developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
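    A minimal version of cellular automata model identification can be sketched for a radius-1 binary automaton: tabulate the local rule from consecutive configurations, then check that it reproduces the trajectory. This exhaustive tabulation is only a stand-in for the paper's metaheuristic search, and the contact-map representation is abstracted away.

```python
def mine_rule(trajectory):
    """Tabulate a radius-1 rule from consecutive configuration pairs."""
    rule = {}
    for state, nxt in zip(trajectory, trajectory[1:]):
        n = len(state)
        for i in range(n):
            neigh = (state[(i - 1) % n], state[i], state[(i + 1) % n])
            rule[neigh] = nxt[i]
    return rule

def step(state, rule):
    """Apply a mined rule to one configuration (periodic boundary)."""
    n = len(state)
    return tuple(rule[(state[(i - 1) % n], state[i], state[(i + 1) % n])]
                 for i in range(n))
```

    If the same neighbourhood ever maps to two different outputs, the trajectory is not representable by a deterministic radius-1 rule, which is where metaheuristics and richer representations come in.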

  8. Development of a framework for reporting health service models for managing rheumatoid arthritis.

    PubMed

    O'Donnell, Siobhan; Li, Linda C; King, Judy; Lauzon, Chantal; Finn, Heather; Vliet Vlieland, Theodora P M

    2010-02-01

    The purpose of this study was to develop a framework for reporting health service models for managing rheumatoid arthritis (RA). We conducted a search of the health sciences literature for primary studies that described interventions which aimed to improve the implementation of health services in adults with RA. Thereafter, a nominal group consensus process was used to synthesize the evidence for the development of the reporting framework. Of the 2,033 citations screened, 68 primary studies were included which described 93 health service models for RA. The origin and meaning of the labels given to these health service delivery models varied widely and, in general, the reporting of their components lacked detail or was absent. The six dimensions underlying the framework for reporting RA health service delivery models are: (1) Why was it founded? (2) Who was involved? (3) What were the roles of those participating? (4) When were the services provided? (5) Where were the services provided/received? (6) How were the services/interventions accessed and implemented, how long was the intervention, how did individuals involved communicate, and how was the model supported/sustained? The proposed framework has the potential to facilitate knowledge exchange among clinicians, researchers, and decision makers in the area of health service delivery. Future work includes the validation of the framework with national and international stakeholders such as clinicians, health care administrators, and health services researchers.

  9. A model for phenotype change in a stochastic framework.

    PubMed

    Wake, Graeme; Pleasants, Anthony; Beedle, Alan; Gluckman, Peter

    2010-07-01

    In some species, an inducible secondary phenotype will develop some time after the environmental change that evokes it. Nishimura (2006) [4] showed how an individual organism should optimize the time it takes to respond to an environmental change ("waiting time''). If the optimal waiting time is considered to act over the population, there are implications for the expected value of the mean fitness in that population. A stochastic predator-prey model is proposed in which the prey have a fixed initial energy budget. Fitness is the product of survival probability and the energy remaining for non-defensive purposes. The model is placed in the stochastic domain by assuming that the waiting time in the population is a normally distributed random variable because of biological variance inherent in mounting the response. It is found that the value of the mean waiting time that maximises fitness depends linearly on the variance of the waiting time.
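    The optimization the abstract describes can be sketched with a hypothetical unimodal fitness function (defence effectiveness times remaining energy) and a Monte Carlo average over a normal waiting-time distribution; the functional form and all parameters are illustrative, not the authors' model.

```python
import math
import random

def fitness(t):
    """Hypothetical fitness: defence effectiveness times remaining energy."""
    t = max(0.0, t)  # negative sampled waiting times are clamped to zero
    return (1.0 - math.exp(-2.0 * t)) * math.exp(-0.5 * t)

def expected_fitness(mean_wait, sd_wait, n=4000, seed=1):
    """Monte Carlo mean of fitness over a normal waiting-time distribution."""
    rng = random.Random(seed)
    return sum(fitness(rng.gauss(mean_wait, sd_wait)) for _ in range(n)) / n

def best_mean_wait(sd_wait, grid=None):
    """Scan candidate mean waiting times for the fitness-maximising one."""
    grid = grid if grid is not None else [i * 0.05 for i in range(61)]
    return max(grid, key=lambda m: expected_fitness(m, sd_wait))
```

    For this toy fitness the deterministic optimum sits at t = ln(5)/2, and spreading the waiting time around it lowers the expected fitness, mirroring the population-level effect discussed in the abstract.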

  10. A Distributed Multi-User Role-Based Model Integration Framework

    SciTech Connect

    Dorow, Kevin E.; Gorton, Ian; Thurman, David A.

    2004-06-14

    Integrated computational modeling can be very useful in making quick, yet informed decisions related to environmental issues including Brownfield assessments. Unfortunately, the process of creating meaningful information using this methodology is fraught with difficulties, particularly when multiple computational models are required. Common problems include the inability to seamlessly transfer information between models, the difficulty of incorporating new models and integrating heterogeneous data sources, executing large numbers of model runs in a reasonable time frame, and adequately capturing pedigree information that describes the specific computational steps and data required to reproduce results. While current model integration frameworks have successfully addressed some of these problems, none have addressed all of them. Building on existing work at Pacific Northwest National Laboratory (PNNL), we have created an extensible software architecture for the next generation of model integration frameworks that addresses these issues. This paper describes this architecture that is being developed to support integrated water resource modeling in a metropolitan area.

  11. An Access Control Model for the Uniframe Framework

    DTIC Science & Technology

    2005-05-01

    Because the success or failure of writing of a student record depends on the success or failure of multiple components, the system uses...report the model failure. There are several other temporal properties that can be easily verified. • No student should be able to read any...Systems System Result 1 Failure: These conditions are false. • No student should be able to read any portion of another student's record. • No

  12. A Computational Framework for Phase-field Modeling

    DTIC Science & Technology

    2011-01-01

    twin embryo within an otherwise perfect single crystal (12). Analytical models based on free energy variations in the context of phase...transformations have been applied to describe twin nucleation (12, 13). Such approaches consider nucleation of a twin embryo of idealized geometry—an elliptical...Crystals, in preparation, 2011. 12. Christian, J. W.; Mahajan, S. Deformation Twinning. Prog. Mater. Sci. 1995, 39, 1–157. 13. Lee, J. K.; Yoo, M

  13. An efficient framework for modeling clouds from Landsat8 images

    NASA Astrophysics Data System (ADS)

    Yuan, Chunqiang; Guo, Jing

    2015-03-01

    Clouds play an important role in creating realistic outdoor scenes for video game and flight simulation applications. Classic methods have been proposed for cumulus cloud modeling. However, these methods are not flexible for modeling large cloud scenes with hundreds of clouds, because the user must repeatedly model each cloud and adjust its various properties. This paper presents a meteorologically based method to reconstruct cumulus clouds from high resolution Landsat8 satellite images. From these input satellite images, the clouds are first segmented from the background. Then, the cloud top surface is estimated from the temperature of the infrared image. After that, under a mild assumption of a flat base for each cumulus cloud, the base height of each cloud is computed by averaging the top height for pixels on the cloud edge. Then, the extinction is generated from the visible image. Finally, we enrich the initial shapes of the clouds using a fractal method and represent the recovered clouds as a particle system. The experimental results demonstrate that our method can yield realistic cloud scenes resembling those in the satellite images.
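    The flat-base step described above reduces to averaging the retrieved top height along the cloud boundary. The sketch below assumes a binary cloud mask and a top-height grid, with 4-connectivity as an illustrative choice; it is not the paper's implementation.

```python
def cloud_base_height(top_height, mask):
    """Estimate a flat cloud base as the mean top height along the edge.

    A pixel is an edge pixel if any 4-neighbour lies outside the grid or
    outside the cloud mask. Returns None for an empty mask.
    """
    rows, cols = len(mask), len(mask[0])
    edge = []
    for i in range(rows):
        for j in range(cols):
            if not mask[i][j]:
                continue
            neighbours = ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if any(not (0 <= a < rows and 0 <= b < cols and mask[a][b])
                   for a, b in neighbours):
                edge.append(top_height[i][j])
    return sum(edge) / len(edge) if edge else None
```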

  14. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" between Physical Experiments and Virtual Models in Biology

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-01-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…

  15. The Dimensions of Social Justice Model: Transforming Traditional Group Work into a Socially Just Framework

    ERIC Educational Resources Information Center

    Ratts, Manivong J.; Anthony, Loni; Santos, KristiAnna Nicole T.

    2010-01-01

    Social justice is a complex and abstract concept that can be difficult to discuss and integrate within group work. To address this concern, this article introduces readers to the Dimensions of Social Justice Model. The model provides group leaders with a conceptual framework for understanding the degree to which social justice is integrated within…

  16. A modeling framework for characterizing near-road air pollutant concentration at community scales

    EPA Science Inventory

    In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  17. Exploring Students' Visual Conception of Matter: Towards Developing a Teaching Framework Using Models

    ERIC Educational Resources Information Center

    Espinosa, Allen A.; Marasigan, Arlyne C.; Datukan, Janir T.

    2016-01-01

    This study explored how students visualise the states and classifications of matter with the use of scientific models. Misconceptions of students in using scientific models were also identified to formulate a teaching framework. To elicit data in the study, a Visual Conception Questionnaire was administered to thirty-four (34) first-year, general…

  18. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…

  19. Evaluation of Hydrometeor Occurrence Profiles in the Multiscale Modeling Framework Climate Model using Atmospheric Classification

    SciTech Connect

    Marchand, Roger T.; Beagley, Nathaniel; Ackerman, Thomas P.

    2009-09-01

    Vertical profiles of hydrometeor occurrence from the Multiscale Modeling Framework (MMF) climate model are compared with profiles observed by a vertically pointing millimeter wavelength cloud radar (located in the U.S. Southern Great Plains) as a function of the large-scale atmospheric state. The atmospheric state is determined by classifying (or clustering) the large-scale (synoptic) fields produced by the MMF and a numerical weather prediction model using a neural network approach. The comparison shows that for cold frontal and post-cold frontal conditions the MMF produces profiles of hydrometeor occurrence that compare favorably with radar observations, while for warm frontal conditions the model tends to produce hydrometeor fractions that are too large, with too much cloud (non-precipitating hydrometeors) above 7 km and too much precipitating hydrometeor coverage below 7 km. We also find that the MMF has difficulty capturing the formation of low clouds and that for all atmospheric states that occur during June, July, and August, the MMF produces too much high and thin cloud, especially above 10 km.

  20. Parameter Estimation for Differential Equation Models Using a Framework of Measurement Error in Regression Models.

    PubMed

    Liang, Hua; Wu, Hulin

    2008-12-01

    Differential equation (DE) models are widely used in many scientific fields that include engineering, physics and biomedical sciences. The so-called "forward problem", the problem of simulations and predictions of state variables for given parameter values in the DE models, has been extensively studied by mathematicians, physicists, engineers and other scientists. However, the "inverse problem", the problem of parameter estimation based on the measurements of output variables, has not been well explored using modern statistical methods, although some least squares-based approaches have been proposed and studied. In this paper, we propose parameter estimation methods for ordinary differential equation models (ODE) based on the local smoothing approach and a pseudo-least squares (PsLS) principle under a framework of measurement error in regression models. The asymptotic properties of the proposed PsLS estimator are established. We also compare the PsLS method to the corresponding SIMEX method and evaluate their finite sample performances via simulation studies. We illustrate the proposed approach using an application example from an HIV dynamic study.
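    The two-stage idea behind the pseudo-least-squares approach can be sketched for the simplest case, x' = -θx: smooth the data, difference the smoothed curve to estimate x'(t), then solve the resulting linear least-squares problem for θ in closed form. The moving-average smoother below is only a stand-in for the local smoothing the authors use.

```python
import math
import random

def smooth(y, window=5):
    """Moving-average smoother standing in for local polynomial smoothing."""
    half = window // 2
    return [sum(y[max(0, i - half): i + half + 1]) /
            len(y[max(0, i - half): i + half + 1]) for i in range(len(y))]

def estimate_decay_rate(t, y, window=5):
    """Pseudo-least-squares estimate of theta in x' = -theta * x.

    Minimizing sum_i (x'_i + theta * x_i)^2 over theta gives the closed
    form theta = -sum(x' x) / sum(x^2), evaluated on the smoothed data.
    """
    s = smooth(y, window)
    dt = t[1] - t[0]
    half = window // 2
    # keep interior points only, where the smoothing window is symmetric
    idx = range(half + 1, len(s) - half - 1)
    deriv = [(s[i + 1] - s[i - 1]) / (2.0 * dt) for i in idx]
    x = [s[i] for i in idx]
    return -sum(d * xi for d, xi in zip(deriv, x)) / sum(xi * xi for xi in x)
```

    The appeal of the two-stage scheme is that no ODE solver is needed inside the fitting loop; the price is the smoothing and differencing bias that the paper's asymptotic analysis accounts for.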

  1. Modeling framework for exploring emission impacts of alternative future scenarios

    NASA Astrophysics Data System (ADS)

    Loughlin, D. H.; Benjey, W. G.; Nolte, C. G.

    2010-11-01

    This article presents an approach for creating anthropogenic emission scenarios that can be used to simulate future regional air quality. The approach focuses on energy production and use since these are principal sources of air pollution. We use the MARKAL model to characterize alternative realizations of the US energy system through 2050. Emission growth factors are calculated for major energy system categories using MARKAL, while growth factors from non-energy sectors are based on economic and population projections. The SMOKE model uses these factors to grow a base-year 2002 inventory to future years through 2050. The approach is demonstrated for two emission scenarios: Scenario 1 extends current air regulations through 2050, while Scenario 2 applies a hypothetical policy that limits carbon dioxide (CO2) emissions from the energy system. Although both scenarios show significant reductions in air pollutant emissions through time, these reductions are more pronounced in Scenario 2, where the CO2 policy results in the adoption of technologies with lower emissions of both CO2 and traditional air pollutants. The methodology is expected to play an important role in investigations of linkages among emission drivers, climate and air quality by the U.S. EPA and others.
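
    The grow-in-place mechanism the article describes, scaling a base-year inventory by category-specific growth factors, can be sketched in a few lines. This is illustrative only, not MARKAL or SMOKE code, and all category names and numbers are made up.

```python
# Illustrative sketch: scale a base-year emission inventory to a future
# year with category-specific growth factors. All numbers are invented.
base_2002 = {"power_plants": 100.0, "on_road": 80.0, "agriculture": 20.0}  # kilotons

# Growth factors for 2050 relative to 2002 (energy categories would come
# from energy-system model runs, non-energy from economic/population projections).
growth_2050 = {"power_plants": 0.35, "on_road": 0.50, "agriculture": 1.10}

# Grow each category of the base inventory to the future year.
future_2050 = {cat: base_2002[cat] * growth_2050[cat] for cat in base_2002}
total = sum(future_2050.values())
print({k: round(v, 1) for k, v in future_2050.items()}, round(total, 1))
```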

  2. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have noted that the Klemeš Crash Tests (KCTs), the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing whether the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are: i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions; and ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building

  3. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Izaurralde, R. C.; Sahajpal, R.; Houborg, R.; Milla, Z.

    2013-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes because croplands constitute a large land area and have a large impact on the global carbon cycle. A Satellite based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved crop specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop specific leaf area index (LAI) using downscaled MODIS reflectance data, and subsequently LAI estimates are integrated into the Environmental Policy Integrated Climate (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). Firstly, we evaluate the performance of this modeling framework over three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount site, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for the years 2007 and 2008 over the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Secondly, we apply the SDCC framework for estimating regional NPP and NEP for corn, soybean and sorghum crops in Nebraska during 2007 and 2008. The methods and results will be presented.

  4. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Houborg, R.; Izaurralde, R. C.

    2014-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes because croplands constitute a large land area and have a large impact on the global carbon cycle. A Satellite based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved crop specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop specific leaf area index (LAI) using downscaled MODIS reflectance data, and subsequently LAI estimates are integrated into the Environmental Policy Integrated Climate (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). Firstly, we evaluate the performance of this modeling framework over three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount site, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for the years 2007 and 2008 over the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Secondly, we apply the SDCC framework for estimating regional NPP and NEP for corn, soybean and sorghum crops in Nebraska during 2007 and 2008. The methods and results will be presented.

  5. A Physics-Informed Machine Learning Framework for RANS-based Predictive Turbulence Modeling

    NASA Astrophysics Data System (ADS)

    Xiao, Heng; Wu, Jinlong; Wang, Jianxun; Ling, Julia

    2016-11-01

    Numerical models based on the Reynolds-averaged Navier-Stokes (RANS) equations are widely used in turbulent flow simulations in support of engineering design and optimization. In these models, turbulence modeling introduces significant uncertainties in the predictions. In light of the decades-long stagnation encountered by the traditional approach of turbulence model development, data-driven methods have been proposed as a promising alternative. We will present a data-driven, physics-informed machine-learning framework for predictive turbulence modeling based on RANS models. The framework consists of three components: (1) prediction of discrepancies in RANS modeled Reynolds stresses based on machine learning algorithms, (2) propagation of improved Reynolds stresses to quantities of interest with a modified RANS solver, and (3) quantitative, a priori assessment of predictive confidence based on distance metrics in the mean flow feature space. Merits of the proposed framework are demonstrated in a class of flows featuring massive separations. Significant improvements over the baseline RANS predictions are observed. The favorable results suggest that the proposed framework is a promising path toward RANS-based predictive turbulence modeling in the era of big data. (SAND2016-7435 A).
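
    Component (1), learning a mapping from mean-flow features to a Reynolds-stress discrepancy, can be sketched with a toy regression. A linear least-squares fit stands in for the machine-learning algorithms of the actual framework, and all features, data, and the baseline stress value are synthetic.

```python
import numpy as np

# Toy stand-in for the discrepancy-learning step: fit a map from mean-flow
# features to the Reynolds-stress discrepancy on training flows, then apply
# it to a new flow condition. All quantities here are invented.
rng = np.random.default_rng(1)
n_train = 200
features = rng.uniform(-1, 1, (n_train, 3))   # e.g. strain-rate ratio, pressure gradient, wall distance
true_w = np.array([0.4, -0.2, 0.1])           # hidden "true" feature-to-discrepancy map
discrepancy = features @ true_w + rng.normal(0, 0.01, n_train)

# Fit the discrepancy model by linear least squares.
w_hat, *_ = np.linalg.lstsq(features, discrepancy, rcond=None)

# "Propagation" reduced to one line: corrected stress = baseline RANS stress
# plus the predicted discrepancy at the new flow's features.
new_features = np.array([0.5, 0.2, -0.1])
baseline_stress = 0.03
corrected = baseline_stress + float(new_features @ w_hat)
print(round(corrected, 3))
```

    In the real framework the corrected stresses feed a modified RANS solver; here the correction is just added to a scalar to show the data flow.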

  6. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    SciTech Connect

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  7. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    SciTech Connect

    Trebotich, D

    2006-06-24

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as ''bead-rod'' polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  8. A Bayesian modelling framework for tornado occurrences in North America

    NASA Astrophysics Data System (ADS)

    Cheng, Vincent Y. S.; Arhonditsis, George B.; Sills, David M. L.; Gough, William A.; Auld, Heather

    2015-03-01

    Tornadoes represent one of nature’s most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.

  9. A Bayesian modelling framework for tornado occurrences in North America.

    PubMed

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-03-25

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.
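
    The Bayesian machinery behind such occurrence mapping can be illustrated with the simplest possible case: a conjugate Gamma-Poisson update of the tornado rate in a single grid cell. This is a minimal sketch in the spirit of the paper, not the authors' hierarchical model; the prior and the counts are invented.

```python
# Conjugate Gamma-Poisson update for a monthly tornado rate in one cell.
# Prior Gamma(a, b); after observing counts, posterior is
# Gamma(a + sum(counts), b + n). All numbers are illustrative.
a, b = 2.0, 1.0                    # prior: mean a/b = 2 tornadoes per month
counts = [3, 1, 4, 2, 5]           # observed monthly counts (synthetic)

a_post = a + sum(counts)
b_post = b + len(counts)
posterior_mean = a_post / b_post   # point estimate of the occurrence rate
print(round(posterior_mean, 3))
```

    The full model instead links the rate to covariates such as convective available potential energy, wind shear, and helicity, but the update logic, prior plus observed counts, is the same.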

  10. No-core shell model in an EFT framework

    NASA Astrophysics Data System (ADS)

    Stetcu, Ionel; Torkkola, Juhani L.; Barrett, Bruce R.; van Kolck, Ubirajara

    2006-10-01

    Based on an effective field theory (EFT) that integrates out the pions as degrees of freedom (pionless theory), we present a new approach to the derivation of effective interactions suitable for many-body calculations by means of the no-core shell model. The main investigation is directed toward the description of two-body scattering observables in a restricted harmonic oscillator (HO) basis, and the inherent Gibbs oscillation problem which arises from the truncation of the Hilbert space using HO wave functions. Application of the effective interactions to the description of ^4He will be discussed. I.S., J.L.T., and B.R.B. acknowledge partial support by NSF grant numbers PHY0070858 and PHY0244389. U.v.K. acknowledges partial support from DOE grant number DE-FG02-04ER41338 and from the Sloan Foundation.

  11. The Modular Modeling System (MMS): A modeling framework for water- and environmental-resources management

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.

    2004-01-01

    The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for

  12. Structural Uncertainties in RANS Models: Reynolds Stress Transport contra Eddy Viscosity Frameworks

    NASA Astrophysics Data System (ADS)

    Mishra, Aashwin; Edeling, Wouter; Iaccarino, Gianluca

    2016-11-01

    A vast majority of turbulent flow studies, both in academia and industry, utilize Reynolds Averaged Navier Stokes based models. There are different RANS modeling frameworks to select from, depending on their complexity and computational requirements, such as eddy viscosity based models, second moment closures, etc. While the relative strengths and weaknesses of each modeling paradigm (vis-a-vis their predictive fidelity, realizability, etc) are roughly established for disparate flows, there are no extant comparative estimates on the relative uncertainty in their predictions. In this investigation, we estimate the structural uncertainty inherent to different RANS modeling approaches for select internal flows. This involves comparisons between models conforming to the same framework, and, across different modeling frameworks. We establish, compare, analyze and explicate the model inadequacy for flows such as in parallel, curved, converging and diverging channels for different models. One of the novel facets of this study involves the estimation of the structural uncertainties of established Reynolds Stress Transport models, and, contrasting these against simpler eddy viscosity models. This work was supported under the DARPA EQUiPS project(Technical Monitor: Fariba Fahroo).

  13. A new framework for modeling decisions about changing information: The Piecewise Linear Ballistic Accumulator model

    PubMed Central

    Heathcote, Andrew

    2016-01-01

    In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
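
    The piecewise linear accumulation idea is easy to simulate: two deterministic accumulators race to a threshold, and their drift rates switch only after a delayed switch time. This is a hedged sketch of the mechanism, not the fitted PLBA; the threshold, start points, rates, and delay are illustrative values.

```python
import numpy as np

# One PLBA-style trial: accumulator 0 initially favored, but after the
# delayed rate switch accumulator 1 grows faster. First to threshold b wins.
def plba_trial(b=1.0, start=(0.0, 0.0), rates1=(1.2, 0.4),
               rates2=(0.4, 1.2), t_switch=0.5, dt=0.001, t_max=5.0):
    x = np.array(start, dtype=float)
    t = 0.0
    while t < t_max:
        v = rates1 if t < t_switch else rates2   # piecewise linear growth
        x = x + np.array(v) * dt
        t += dt
        if x.max() >= b:
            return int(x.argmax()), t            # (winning accumulator, RT)
    return None, t_max

choice, rt = plba_trial()
print(choice, round(rt, 3))
```

    With these rates the delayed switch reverses the race: accumulator 1 overtakes accumulator 0 and wins at roughly t = 1.17, illustrating how late-arriving information can still dominate the choice.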

  14. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25×25 m²). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on time step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km² Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change

  15. A Fuzzy Logic Framework for Integrating Multiple Learned Models

    SciTech Connect

    Hartog, Bobi Kai Den

    1999-03-01

    The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (SI), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. SI produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. AP and AQ use automatically learned empirical biases for each of the Methods in each scenario. Through fuzzy logic, AP and AQ combine scenario weights, automatically learned biases for each of the Methods in each scenario, and Methods' results to determine results for a sample.
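
    The final combination step, scenario belief weights times per-scenario method biases times raw method results, can be reconstructed as a small weighted sum. This is an illustrative reconstruction, not the Combiner's actual code; scenario names, biases, and concentrations are all hypothetical.

```python
# Fuzzy-weighted combination of Method results (all numbers hypothetical).
scenario_weights = {"clean": 0.7, "interfered": 0.3}   # belief weights from step SI
method_results = {"m1": 10.0, "m2": 14.0}              # raw quantifications (ppm)
bias = {  # learned multiplicative correction per (scenario, method)
    ("clean", "m1"): 1.0, ("clean", "m2"): 0.9,
    ("interfered", "m1"): 1.2, ("interfered", "m2"): 0.8,
}

# Average the bias-corrected results across methods, weighted by scenario belief.
estimate = sum(
    w * bias[(s, m)] * r
    for s, w in scenario_weights.items()
    for m, r in method_results.items()
) / len(method_results)
print(round(estimate, 2))
```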

  16. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision making framework for a three echelons dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi agents approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  17. The Melanoma MAICare Framework: A Microsimulation Model for the Assessment of Individualized Cancer Care.

    PubMed

    van der Meijde, Elisabeth; van den Eertwegh, Alfons J M; Linn, Sabine C; Meijer, Gerrit A; Fijneman, Remond J A; Coupé, Veerle M H

    2016-01-01

    Recently, new but expensive treatments have become available for metastatic melanoma. These improve survival, but in view of the limited funds available, cost-effectiveness needs to be evaluated. Most cancer cost-effectiveness models are based on the observed clinical events such as recurrence-free and overall survival. Times at which events are recorded depend not only on the effectiveness of treatment but also on the timing of examinations and the types of tests performed. Our objective was to construct a microsimulation model framework that describes the melanoma disease process using a description of underlying tumor growth as well as its interaction with diagnostics, treatments, and surveillance. The framework should allow for exploration of the impact of simultaneously altering curative treatment approaches in different phases of the disease as well as altering diagnostics. The developed framework consists of two components, namely, the disease model and the clinical management module. The disease model consists of a tumor level, describing growth and metastasis of the tumor, and a patient level, describing clinically observed states, such as recurrence and death. The clinical management module consists of the care patients receive. This module interacts with the disease process, influencing the rate of transition between tumor growth states at the tumor level and the rate of detecting a recurrence at the patient level. We describe the framework as well as the required input and the model output. Furthermore, we illustrate model calibration using registry data and data from the literature.

  18. The Melanoma MAICare Framework: A Microsimulation Model for the Assessment of Individualized Cancer Care

    PubMed Central

    van der Meijde, Elisabeth; van den Eertwegh, Alfons J. M.; Linn, Sabine C.; Meijer, Gerrit A.; Fijneman, Remond J. A.; Coupé, Veerle M. H.

    2016-01-01

    Recently, new but expensive treatments have become available for metastatic melanoma. These improve survival, but in view of the limited funds available, cost-effectiveness needs to be evaluated. Most cancer cost-effectiveness models are based on the observed clinical events such as recurrence-free and overall survival. Times at which events are recorded depend not only on the effectiveness of treatment but also on the timing of examinations and the types of tests performed. Our objective was to construct a microsimulation model framework that describes the melanoma disease process using a description of underlying tumor growth as well as its interaction with diagnostics, treatments, and surveillance. The framework should allow for exploration of the impact of simultaneously altering curative treatment approaches in different phases of the disease as well as altering diagnostics. The developed framework consists of two components, namely, the disease model and the clinical management module. The disease model consists of a tumor level, describing growth and metastasis of the tumor, and a patient level, describing clinically observed states, such as recurrence and death. The clinical management module consists of the care patients receive. This module interacts with the disease process, influencing the rate of transition between tumor growth states at the tumor level and the rate of detecting a recurrence at the patient level. We describe the framework as well as the required input and the model output. Furthermore, we illustrate model calibration using registry data and data from the literature. PMID:27346945
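
    The split between a disease model (tumour-level progression) and a clinical management module (scheduled surveillance with imperfect detection) can be sketched as a tiny per-patient simulation. The structure follows the abstract; every rate, the visit schedule, and the follow-up horizon are invented.

```python
import random

# Hedged microsimulation sketch: a tumour advances through growth states at
# one monthly rate (disease model), while 6-monthly surveillance visits give
# a chance of detecting a recurrence (clinical management module).
def simulate_patient(rng, p_progress=0.15, p_detect=0.4, cycles=60):
    state = 0                      # tumour growth state (0 = no recurrence)
    for month in range(1, cycles + 1):
        if rng.random() < p_progress:
            state += 1             # disease model: tumour-level progression
        if state > 0 and month % 6 == 0 and rng.random() < p_detect:
            return month, state    # detected at a scheduled visit
    return None, state             # never detected within follow-up

rng = random.Random(42)
detections = [simulate_patient(rng)[0] for _ in range(1000)]
detect_rate = sum(d is not None for d in detections) / len(detections)
print(detect_rate)
```

    Changing the visit interval or the detection probability changes when recurrences are recorded without touching the underlying disease process, which is exactly the separation the framework is built to explore.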

  19. A Physics-Based Modeling Framework for Prognostic Studies

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics on the other hand is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, and interpretation of environmental, operational, and performance related parameters to indicate systems health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric-hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop electrochemistry-based models of Li-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable

  20. Health behaviour models: a framework for studying adherence in children with atopic dermatitis.

    PubMed

    Chisolm, S S; Taylor, S L; Gryzwacz, J G; O'Neill, J L; Balkrishnan, R R; Feldman, S R

    2010-04-01

    Atopic dermatitis (AD) is a common problem of childhood causing considerable distress. Effective topical treatments exist, yet poor adherence often results in poor outcomes. A framework is needed to better understand adherence behaviour. To provide a basis for this framework, we reviewed established models used to describe health behaviour. Structural elements of these models informed the development of an adherence model for AD that can be used to complement empirical AD treatment trials. Health behaviour models provide a means to describe factors that affect adherence and that can mediate the effects of different adherence interventions. Models of adherence behaviour are important for promoting better treatment outcomes for children with AD and their families. These models provide a means to identify new targets to improve adherence and a guide for refining adherence interventions.

  1. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  2. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection.

    PubMed

    Delaney, Declan T; O'Hare, Gregory M P

    2016-12-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.
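    The selection step the abstract describes can be sketched in a few lines. The following is our illustration only (solution names, the environment variable and the metric are hypothetical, not from the article): each candidate network solution carries a performance model that predicts a QoS metric for the current environment, and the framework picks the solution with the best prediction.

```python
# Hypothetical sketch of the selection step: each candidate network solution
# has a performance model (here, a simple callable) predicting a QoS metric
# for the current environment; the framework picks the best-scoring solution.

def select_solution(solutions, environment, metric="delivery_ratio"):
    """Return the solution whose model predicts the best metric value."""
    best_name, best_score = None, float("-inf")
    for name, model in solutions.items():
        score = model(environment)[metric]
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy performance models (placeholders for models trained on simulation data).
solutions = {
    "rpl_etx": lambda env: {"delivery_ratio": 0.9 - 0.02 * env["density"]},
    "rpl_hop": lambda env: {"delivery_ratio": 0.8 - 0.005 * env["density"]},
}

env = {"density": 10}
print(select_solution(solutions, env))  # ('rpl_hop', 0.75)
```

In a denser deployment the predicted ranking flips, which is exactly the environment-dependence that motivates choosing per deployment rather than once.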

  3. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection †

    PubMed Central

    Delaney, Declan T.; O’Hare, Gregory M. P.

    2016-01-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks. PMID:27916929

  4. Modeling overland flow-driven erosion across a watershed DEM using the Landlab modeling framework.

    NASA Astrophysics Data System (ADS)

    Adams, J. M.; Gasparini, N. M.; Tucker, G. E.; Hobley, D. E. J.; Hutton, E. W. H.; Nudurupati, S. S.; Istanbulluoglu, E.

    2015-12-01

    Many traditional landscape evolution models assume steady-state hydrology when computing discharge, and generally route flow in a single direction, along the path of steepest descent. Previous work has demonstrated that, for larger watersheds or short-duration storms, hydrologic steady-state may not be achieved. In semiarid regions, often dominated by convective summertime storms, landscapes are likely heavily influenced by these short-duration but high-intensity periods of rainfall. To capture these geomorphically significant bursts of rain, a new overland flow method has been implemented in the Landlab modeling framework. This overland flow method routes a hydrograph across a landscape, and allows flow to travel in multiple directions out of a given grid node. This study compares traditional steady-state flow routing and incision methods to the new, hydrograph-driven overland flow and erosion model in Landlab. We propose that for short-duration, high-intensity precipitation events, steady-state, single-direction flow routing models will significantly overestimate discharge and erosion when compared with non-steady, multiple flow direction model solutions. To test this hypothesis, discharge and erosion are modeled using both steady-state and hydrograph methods. A stochastic storm generator is used to generate short-duration, high-intensity precipitation intervals, which drive modeled discharge and erosion across a watershed imported from a digital elevation model, highlighting Landlab's robust raster-gridding library and watershed modeling capabilities. For each storm event in this analysis, peak discharge at the outlet, incision rate at the outlet, as well as total discharge and erosion depth are compared between methods. Additionally, these results are organized by storm duration and intensity to understand how erosion rates scale with precipitation between both flow routing methods. Results show that in many cases traditional steady-state methods overestimate
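    The contrast between the two discharge assumptions can be illustrated with a toy calculation (this is not Landlab code; the linear-reservoir response and all parameter values are our own assumptions). Steady-state routing takes discharge at its equilibrium value Q = P·A for the whole storm, whereas a storage-based hydrograph lets a short, intense burst attenuate before it reaches the outlet.

```python
# Illustrative contrast (not Landlab itself): steady-state discharge assumes
# Q = P * A during the storm, while a simple linear-reservoir hydrograph
# lets storage attenuate a short, intense burst of rain.

def steady_state_peak(rain_rate, area):
    return rain_rate * area  # instantaneous equilibrium discharge

def hydrograph_peak(rain_rate, area, duration, k=3600.0, dt=10.0):
    """Linear reservoir Q = S/k driven by rain for `duration` seconds."""
    storage, peak, t = 0.0, 0.0, 0.0
    while t < 5 * duration:
        inflow = rain_rate * area if t < duration else 0.0
        q = storage / k
        storage += (inflow - q) * dt
        peak = max(peak, q)
        t += dt
    return peak

area = 1e6          # m^2
rain = 2.8e-5       # m/s (~100 mm/h convective burst)
qs = steady_state_peak(rain, area)                  # 28 m^3/s
qh = hydrograph_peak(rain, area, duration=600.0)
print(qs, qh, qh < qs)  # for a 10-minute storm the hydrograph peak is far lower
```

For storms much longer than the reservoir timescale the two peaks converge, which mirrors the paper's point that the overestimate matters specifically for short-duration events.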

  5. The C1C2: A framework for simultaneous model selection and assessment

    PubMed Central

    Eklund, Martin; Spjuth, Ola; Wikberg, Jarl ES

    2008-01-01

    Background: There has been recent concern regarding the inability of predictive modeling approaches to generalize to new data. Some of the problems can be attributed to improper methods for model selection and assessment. Here, we have addressed this issue by introducing a novel and general framework, the C1C2, for simultaneous model selection and assessment. The framework relies on a partitioning of the data in order to separate model choice from model assessment in terms of used data. Since the number of conceivable models in general is vast, it was also of interest to investigate the employment of two automatic search methods, a genetic algorithm and a brute-force method, for model choice. As a demonstration, the C1C2 was applied to simulated and real-world datasets. A penalized linear model was assumed to reasonably approximate the true relation between the dependent and independent variables, thus reducing the model choice problem to a matter of variable selection and choice of penalizing parameter. We also studied the impact of assuming prior knowledge about the number of relevant variables on model choice and generalization error estimates. The results obtained with the C1C2 were compared to those obtained by employing repeated K-fold cross-validation for choosing and assessing a model. Results: The C1C2 framework performed well at finding the true model in terms of choosing the correct variable subset and producing reasonable choices for the penalizing parameter, even in situations when the independent variables were highly correlated and when the number of observations was less than the number of variables. The C1C2 framework was also found to give accurate estimates of the generalization error. Prior information about the number of important independent variables improved the variable subset choice but reduced the accuracy of generalization error estimates. Using the genetic algorithm worsened the model choice but not the generalization error estimates
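    The core idea of separating model choice from model assessment through data partitioning can be sketched as follows. This is our reading, not the authors' implementation: a penalized (ridge) linear model is chosen on one partition (with an internal train/validation split for the penalty) and its generalization error is estimated on a held-out partition that played no role in the choice.

```python
# Hedged sketch of data partitioning for simultaneous model selection and
# assessment: model choice never sees the assessment rows.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))
beta = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=120)

# Choice partition (rows 0-79, split again for validation); assessment rows 80-119.
tr, va, te = slice(0, 60), slice(60, 80), slice(80, 120)

def ridge_fit(X, y, lam):
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def mse(X, y, b):
    return float(np.mean((y - X @ b) ** 2))

lams = [0.01, 0.1, 1.0, 10.0]
best = min(lams, key=lambda l: mse(X[va], y[va], ridge_fit(X[tr], y[tr], l)))
gen_err = mse(X[te], y[te], ridge_fit(X[tr], y[tr], best))
print(best, round(gen_err, 4))
```

Because the assessment rows never influence the penalty choice, `gen_err` is an approximately unbiased estimate of the chosen model's generalization error, which is the property the framework is designed to protect.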

  6. A FRAMEWORK FOR EVALUATING REGIONAL-SCALE NUMERICAL PHOTOCHEMICAL MODELING SYSTEMS

    PubMed Central

    Dennis, Robin; Fox, Tyler; Fuentes, Montse; Gilliland, Alice; Hanna, Steven; Hogrefe, Christian; Irwin, John; Rao, S.Trivikrama.; Scheffe, Richard; Schere, Kenneth; Steyn, Douw; Venkatram, Akula

    2011-01-01

    This paper discusses the need for critically evaluating regional-scale (~200-2000 km) three-dimensional numerical photochemical air quality modeling systems to establish a model’s credibility in simulating the spatio-temporal features embedded in the observations. Because of limitations of currently used approaches for evaluating regional air quality models, a framework for model evaluation is introduced here for determining the suitability of a modeling system for a given application, distinguishing the performance between different models through confidence-testing of model results, guiding model development, and analyzing the impacts of regulatory policy options. The framework identifies operational, diagnostic, dynamic, and probabilistic types of model evaluation. Operational evaluation techniques include statistical and graphical analyses aimed at determining whether model estimates are in agreement with the observations in an overall sense. Diagnostic evaluation focuses on process-oriented analyses to determine whether the individual processes and components of the model system are working correctly, both independently and in combination. Dynamic evaluation assesses the ability of the air quality model to simulate changes in air quality stemming from changes in source emissions and/or meteorology, the principal forces that drive the air quality model. Probabilistic evaluation attempts to assess the confidence that can be placed in model predictions using techniques such as ensemble modeling and Bayesian model averaging. The advantages of these types of model evaluation approaches are discussed in this paper. PMID:21461126
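    Of the four evaluation types, operational evaluation is the most mechanical, and its basic statistics are easy to demonstrate. The following is a minimal illustration (our code, not the paper's): mean bias, RMSE and correlation between paired model estimates and observations, e.g. hourly ozone concentrations.

```python
# Minimal "operational evaluation" statistics: mean bias, RMSE and the
# Pearson correlation of model estimates against observations.
import math

def operational_stats(model, obs):
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm = sum(model) / n
    mo = sum(obs) / n
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    var = math.sqrt(sum((m - mm) ** 2 for m in model)
                    * sum((o - mo) ** 2 for o in obs))
    return {"bias": bias, "rmse": rmse, "r": cov / var}

obs = [30.0, 42.0, 55.0, 48.0, 35.0]   # e.g. observed ozone, ppb
mod = [33.0, 45.0, 52.0, 50.0, 38.0]   # model estimates
print(operational_stats(mod, obs))      # bias 1.6, RMSE ~2.83, r ~0.98
```

As the paper stresses, good scores on such aggregate statistics show agreement only "in an overall sense"; they say nothing about whether individual processes are right, which is what the diagnostic and dynamic evaluation types probe.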

  7. Alternative Model-Based and Design-Based Frameworks for Inference from Samples to Populations: From Polarization to Integration

    ERIC Educational Resources Information Center

    Sterba, Sonya K.

    2009-01-01

    A model-based framework, due originally to R. A. Fisher, and a design-based framework, due originally to J. Neyman, offer alternative mechanisms for inference from samples to populations. We show how these frameworks can utilize different types of samples (nonrandom or random vs. only random) and allow different kinds of inference (descriptive vs.…

  8. Implementing vertex dynamics models of cell populations in biology within a consistent computational framework.

    PubMed

    Fletcher, Alexander G; Osborne, James M; Maini, Philip K; Gavaghan, David J

    2013-11-01

    The dynamic behaviour of epithelial cell sheets plays a central role during development, growth, disease and wound healing. These processes occur as a result of cell adhesion, migration, division, differentiation and death, and involve multiple processes acting at the cellular and molecular level. Computational models offer a useful means by which to investigate and test hypotheses about these processes, and have played a key role in the study of cell-cell interactions. However, the necessarily complex nature of such models means that it is difficult to make accurate comparison between different models, since it is often impossible to distinguish between differences in behaviour that are due to the underlying model assumptions, and those due to differences in the in silico implementation of the model. In this work, an approach is described for the implementation of vertex dynamics models, a discrete approach that represents each cell by a polygon (or polyhedron) whose vertices may move in response to forces. The implementation is undertaken in a consistent manner within a single open source computational framework, Chaste, which comprises fully tested, industrial-grade software that has been developed using an agile approach. This framework allows one to easily change assumptions regarding force generation and cell rearrangement processes within these models. The versatility and generality of this framework is illustrated using a number of biological examples. In each case we provide full details of all technical aspects of our model implementations, and in some cases provide extensions to make the models more generally applicable.
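    The vertex-dynamics idea, a polygon per cell whose vertices move in response to forces, can be shown in miniature. This is a toy illustration under our own assumptions (a single cell, an area-elasticity energy, overdamped explicit Euler updates), not Chaste code.

```python
# Toy vertex-dynamics step: each vertex moves down the gradient of an energy;
# here a single polygonal cell relaxes toward a target area.
import math

def polygon_area(verts):
    n = len(verts)
    return 0.5 * sum(verts[i][0] * verts[(i + 1) % n][1]
                     - verts[(i + 1) % n][0] * verts[i][1] for i in range(n))

def area_forces(verts, target_area, ka=1.0):
    """F_i = -dE/dr_i for E = (ka/2) (A - A0)^2, via the shoelace gradient."""
    n = len(verts)
    coeff = -ka * (polygon_area(verts) - target_area)
    forces = []
    for i in range(n):
        xprev, yprev = verts[(i - 1) % n]
        xnext, ynext = verts[(i + 1) % n]
        # dA/dx_i = (y_next - y_prev)/2, dA/dy_i = (x_prev - x_next)/2
        forces.append((coeff * 0.5 * (ynext - yprev),
                       coeff * 0.5 * (xprev - xnext)))
    return forces

verts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]  # unit square, CCW
for _ in range(200):                       # overdamped explicit Euler steps
    fs = area_forces(verts, target_area=2.0)
    verts = [(x + 0.05 * fx, y + 0.05 * fy)
             for (x, y), (fx, fy) in zip(verts, fs)]
print(round(polygon_area(verts), 3))  # relaxes to the target area, 2.0
```

The consistency problem the paper addresses lives precisely in choices like the force law, the time stepper and the step size above: two implementations of "the same" vertex model that differ in these details can behave differently, which is why a single tested framework matters.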

  9. Predicting the resilience and recovery of aquatic systems: A framework for model evolution within environmental observatories

    NASA Astrophysics Data System (ADS)

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z.; Read, Jordan S.; Ibelings, Bas W.; Valesini, Fiona J.; Brookes, Justin D.

    2015-09-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management, however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework is comprised of a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  10. A structured continuum modelling framework for martensitic transformation and reorientation in shape memory materials.

    PubMed

    Bernardini, Davide; Pence, Thomas J

    2016-04-28

    Models for shape memory material behaviour can be posed in the framework of a structured continuum theory. We study such a framework in which a scalar phase fraction field and a tensor field of martensite reorientation describe the material microstructure, in the context of finite strains. Gradients of the microstructural descriptors naturally enter the formulation and offer the possibility to describe and resolve phase transformation localizations. The constitutive theory is thoroughly described by a single free energy function in conjunction with a path-dependent dissipation function. Balance laws in the form of differential equations are obtained and contain both bulk and surface terms, the latter in terms of microstresses. A natural constraint on the tensor field for martensite reorientation gives rise to reactive fields in these balance laws. Conditions ensuring objectivity as well as the relation of this framework to that provided by currently used models for shape memory alloy behaviour are discussed.

  11. A conceptual framework to design a dimensional model based on the HL7 Clinical Document Architecture.

    PubMed

    Pecoraro, Fabrizio; Luzi, Daniela; Ricci, Fabrizio L

    2014-01-01

    This paper proposes a conceptual framework to design a dimensional model based on the HL7 Clinical Document Architecture (CDA) standard. The adoption of this framework can represent a possible solution to facilitate the integration of heterogeneous information systems in a clinical data warehouse. This can simplify the Extract, Transform and Load (ETL) procedures that are considered the most time-consuming and expensive part of the data warehouse development process. The paper describes the main activities to be carried out to design the dimensional model outlining the main advantages in the application of the proposed framework. The feasibility of our approach is also demonstrated providing a case study to define clinical indicators for quality assessment.
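    The mapping the paper proposes, from document-oriented clinical records to a dimensional (star-schema) model, can be sketched with plain data structures. All field names below are illustrative stand-ins, not actual HL7 CDA element names, and the schema is our own simplification: one fact table of observations plus patient and code dimensions.

```python
# Hypothetical sketch of an ETL step from CDA-like clinical documents to a
# star schema; field names are illustrative, not the HL7 CDA element names.

documents = [
    {"patient": {"id": "p1", "gender": "F"},
     "observations": [{"code": "8480-6", "name": "Systolic BP", "value": 130},
                      {"code": "718-7", "name": "Hemoglobin", "value": 13.2}]},
    {"patient": {"id": "p2", "gender": "M"},
     "observations": [{"code": "8480-6", "name": "Systolic BP", "value": 142}]},
]

dim_patient, dim_code, fact = {}, {}, []
for doc in documents:
    p = doc["patient"]
    dim_patient.setdefault(p["id"], {"gender": p["gender"]})
    for obs in doc["observations"]:
        dim_code.setdefault(obs["code"], {"name": obs["name"]})
        fact.append({"patient_id": p["id"], "code": obs["code"],
                     "value": obs["value"]})

# A clinical quality indicator then becomes a simple aggregate over the facts.
bp = [f["value"] for f in fact if f["code"] == "8480-6"]
print(len(fact), sum(bp) / len(bp))  # 3 facts, mean systolic BP 136.0
```

Once heterogeneous sources all load into the same fact and dimension tables, indicators like the mean above can be defined once against the warehouse, which is the simplification of ETL the paper argues for.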

  12. Models of recognition, repetition priming, and fluency: exploring a new framework.

    PubMed

    Berry, Christopher J; Shanks, David R; Speekenbrink, Maarten; Henson, Richard N A

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely independent memory signals (such as explicit and implicit memory) drive recognition and priming; (c) a multiple-systems-2 (MS2) model, in which there are also 2 memory signals, but some degree of dependence is allowed between these 2 signals (and this model subsumes the SS and MS1 models as special cases); and (d) a dual-process signal detection (DPSD1) model, 1 possible extension of a dual-process theory of recognition (Yonelinas, 1994) to priming, in which a signal detection model is augmented by an independent recollection process. The predictions of the models are tested in a continuous-identification-with-recognition paradigm in both normal adults (Experiments 1-3) and amnesic individuals (using data from Conroy, Hopkins, & Squire, 2005). The SS model predicted numerous results in advance. These were not predicted by the MS1 model, though could be accommodated by the more flexible MS2 model. Importantly, measures of overall model fit favored the SS model over the others. These results illustrate a new, formal approach to testing theories of explicit and implicit memory.
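    The single-system (SS) model's central claim, that one continuous memory signal drives both recognition and priming, has a simple simulable consequence. The sketch below is our reading of that idea with made-up parameter values, not the authors' model code: a shared strength variable is thresholded for the recognition judgment and speeds identification for the priming measure, so recognized items are identified faster.

```python
# Sketch of the single-system idea: one continuous memory signal (plus
# independent noise) drives both recognition and priming. Parameters invented.
import random

random.seed(1)
old_items = []
for _ in range(5000):
    f = random.gauss(1.0, 1.0)                  # shared memory strength
    recog = f + random.gauss(0.0, 0.5) > 0.5    # recognition: thresholded signal
    rt = 700 - 50 * f + random.gauss(0.0, 30)   # priming: faster RT, same signal
    old_items.append((recog, rt))

rt_hit = [rt for r, rt in old_items if r]
rt_miss = [rt for r, rt in old_items if not r]
# The shared signal predicts faster identification for recognized items.
print(sum(rt_hit) / len(rt_hit) < sum(rt_miss) / len(rt_miss))  # True
```

Under the MS1 model the two noise sources would be attached to two independent signals, and this hit/miss RT difference would vanish, which is the kind of in-advance prediction the paper uses to separate the models.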

  13. Framework for non-coherent interface models at finite displacement jumps and finite strains

    NASA Astrophysics Data System (ADS)

    Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn

    2016-05-01

    This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, anisotropic effects are excluded. A novel, extended constitutive framework which is consistent with the above mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration - or equivalently - the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.

  14. Simulation-optimization framework for multi-site multi-season hybrid stochastic streamflow modeling

    NASA Astrophysics Data System (ADS)

    Srivastav, Roshan; Srinivasan, K.; Sudheer, K. P.

    2016-11-01

    A simulation-optimization (S-O) framework is developed for the hybrid stochastic modeling of multi-site multi-season streamflows. The multi-objective optimization model formulated is the driver and the multi-site, multi-season hybrid matched block bootstrap model (MHMABB) is the simulation engine within this framework. The multi-site multi-season simulation model is the extension of the existing single-site multi-season simulation model. A robust and efficient evolutionary search based technique, namely, non-dominated sorting based genetic algorithm (NSGA - II) is employed as the solution technique for the multi-objective optimization within the S-O framework. The objective functions employed are related to the preservation of the multi-site critical deficit run sum and the constraints introduced are concerned with the hybrid model parameter space, and the preservation of certain statistics (such as inter-annual dependence and/or skewness of aggregated annual flows). The efficacy of the proposed S-O framework is brought out through a case example from the Colorado River basin. The proposed multi-site multi-season model AMHMABB (whose parameters are obtained from the proposed S-O framework) preserves the temporal as well as the spatial statistics of the historical flows. Also, the other multi-site deficit run characteristics namely, the number of runs, the maximum run length, the mean run sum and the mean run length are well preserved by the AMHMABB model. Overall, the proposed AMHMABB model is able to show better streamflow modeling performance when compared with the simulation based SMHMABB model, plausibly due to the significant role played by: (i) the objective functions related to the preservation of multi-site critical deficit run sum; (ii) the huge hybrid model parameter space available for the evolutionary search and (iii) the constraint on the preservation of the inter-annual dependence. Split-sample validation results indicate that the AMHMABB model is
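    The multi-objective engine named here, NSGA-II, rests on non-dominated sorting of candidate parameter sets. The snippet below illustrates just that building block (our sketch, not the authors' S-O code), with two objectives to be minimized standing in for, say, errors in the deficit-run-sum and skewness statistics.

```python
# Illustration of the non-dominated sorting step at the heart of NSGA-II
# (both objectives minimized).

def dominates(a, b):
    """a dominates b if it is no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Two conflicting objectives, e.g. deficit-run-sum error vs. skewness error.
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (5.0, 2.0)]
print(nondominated_front(pts))  # [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
```

NSGA-II repeatedly applies this sorting (plus crowding-distance selection) over generations; in the S-O framework each "point" would be a hybrid model parameterization evaluated by the MHMABB simulation engine.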

  15. A multidimensional discontinuous Galerkin modeling framework for overland flow and channel routing

    NASA Astrophysics Data System (ADS)

    West, Dustin W.; Kubatko, Ethan J.; Conroy, Colton J.; Yaufman, Mariah; Wood, Dylan

    2017-04-01

    In this paper, we present the development and application of a new multidimensional, unstructured-mesh model for simulating coupled overland/open-channel flows in the kinematic wave approximation regime. The modeling approach makes use of discontinuous Galerkin (DG) finite element spatial discretizations of variable polynomial degree p, paired with explicit Runge-Kutta time steppers, and is supported by advancements made to an automatic mesh generation tool, ADMESH +, that is used to construct constrained triangulations for channel routing. The developed modeling framework is applied to a set of four test cases, where numerical results are found to compare well with known analytic solutions, experimental data and results from another well-established (structured, finite difference) model within the area of application. The numerical results obtained demonstrate the accuracy and robustness of the developed modeling framework and highlight the potential benefits of using p (polynomial) refinement for hydrological simulations.

  16. a Framework for Voxel-Based Global Scale Modeling of Urban Environments

    NASA Astrophysics Data System (ADS)

    Gehrung, Joachim; Hebel, Marcus; Arens, Michael; Stilla, Uwe

    2016-10-01

    The generation of 3D city models is a very active field of research. Modeling environments as point clouds may be fast, but has disadvantages. These are easily solvable by using volumetric representations, especially when considering selective data acquisition, change detection and fast changing environments. Therefore, this paper proposes a framework for the volumetric modeling and visualization of large scale urban environments. Besides an architecture and the right mix of algorithms for the task, two compression strategies for volumetric models as well as a data quality based approach for the import of range measurements are proposed. The capabilities of the framework are shown on a mobile laser scanning dataset of the Technical University of Munich. Furthermore the loss of the compression techniques is evaluated and their memory consumption is compared to that of raw point clouds. The presented results show that generation, storage and real-time rendering of even large urban models are feasible, even with off-the-shelf hardware.

  17. Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning

    NASA Technical Reports Server (NTRS)

    Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael

    2011-01-01

    Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.

  18. Hybrid modelling framework by using mathematics-based and information-based methods

    NASA Astrophysics Data System (ADS)

    Ghaboussi, J.; Kim, J.; Elnashai, A.

    2010-06-01

    Mathematics-based computational mechanics involves idealization in going from the observed behaviour of a system to mathematical equations representing the underlying mechanics of that behaviour. Idealization may lead to mathematical models that exclude certain aspects of the complex behaviour that may be significant. An alternative approach is data-centric modelling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. However, purely data-centric methods often fail for infrequent events and large state changes. In this article, a new hybrid modelling framework is proposed to improve accuracy in simulation of real-world systems. In the hybrid framework, a mathematical model is complemented by information-based components. The role of the informational components is to model aspects which the mathematical model leaves out. The missing aspects are extracted and identified through Autoprogressive Algorithms. The proposed hybrid modelling framework has a wide range of potential applications for natural and engineered systems. The potential of the hybrid methodology is illustrated through modelling highly pinched hysteretic behaviour of beam-to-column connections in steel frames.
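    The division of labour between the mathematical and informational components can be shown with a deliberately tiny example. This is our reading of the general idea, not the paper's Autoprogressive Algorithm: an idealized physics model captures the bulk behaviour, and a data-driven component is fitted to the residuals that the idealization leaves out.

```python
# Sketch of the hybrid idea: a mathematical model plus a data-driven
# component fitted to the residuals the idealization leaves out.

def physics_model(x):
    return 2.0 * x                      # idealized linear law

def true_system(x):
    return 2.0 * x + 0.3 * x * x        # reality has an unmodelled term

xs = [i * 0.1 for i in range(21)]
residuals = [true_system(x) - physics_model(x) for x in xs]

# Informational component: fit residual ~ c * x^2 by least squares.
c = sum(r * x * x for r, x in zip(residuals, xs)) / sum(x ** 4 for x in xs)

def hybrid_model(x):
    return physics_model(x) + c * x * x

print(round(c, 3), round(hybrid_model(2.0), 3))  # recovers 0.3; 5.2 at x = 2
```

In the paper's setting the residual model is a trained neural component and the "physics" is a finite element idealization, but the structure, mathematical backbone plus learned correction, is the same.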

  19. Predicting the resilience and recovery of aquatic systems: a framework for model evolution within environmental observatories

    USGS Publications Warehouse

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z.; Read, Jordan S.; Ibelings, Bas W.; Valesini, Fiona J.; Brookes, Justin D.

    2015-01-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management, however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework is comprised of a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  20. A Novel, Physics-Based Data Analytics Framework for Reducing Systematic Model Errors

    NASA Astrophysics Data System (ADS)

    Wu, W.; Liu, Y.; Vandenberghe, F. C.; Knievel, J. C.; Hacker, J.

    2015-12-01

    Most climate and weather models exhibit systematic biases, such as underpredicted diurnal temperatures in the WRF (Weather Research and Forecasting) model. General approaches to alleviate the systematic biases include improving model physics and numerics, improving data assimilation, and bias correction through post-processing. In this study, we developed a novel, physics-based data analytics framework in post-processing by taking advantage of ever-growing high-resolution (spatial and temporal) observational and modeling data. In the framework, a spatiotemporal PCA (Principal Component Analysis) is first applied on the observational data to filter out noise and information on scales that a model may not be able to resolve. The filtered observations are then used to establish regression relationships with archived model forecasts in the same spatiotemporal domain. The regressions along with the model forecasts predict the projected observations in the forecasting period. The pre-regression PCA procedure strengthens regressions, and enhances predictive skills. We then combine the projected observations with the past observations to apply PCA iteratively to derive the final forecasts. This post-regression PCA reconstructs variances and scales of information that are lost in the regression. The framework was examined and validated with 24 days of 5-minute observational data and archives from the WRF model at 27 stations near Dugway Proving Ground, Utah. The validation shows significant bias reduction in the diurnal cycle of predicted surface air temperature compared to the direct output from the WRF model. Additionally, unlike other post-processing bias correction schemes, the data analytics framework does not require long-term historic data and model archives. A week or two of the data is enough to take into account changes in weather regimes. The program, written in Python, is also computationally efficient.
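    The first two stages, PCA-filter the observations, then regress them on archived forecasts, can be demonstrated on synthetic data. Everything below is a toy version under our own assumptions (a sinusoidal diurnal cycle, a single retained component, one global regression), not the authors' iterative multi-stage procedure.

```python
# Toy two-stage sketch: (1) PCA-filter noisy observations, (2) regress the
# filtered observations on biased archived forecasts to correct new forecasts.
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 24.0, 0.25)
diurnal = 10 + 8 * np.sin(2 * np.pi * t / 24)                 # "true" cycle
obs = diurnal + rng.normal(0, 1.0, (30, t.size))              # 30 noisy obs days
fcst = 0.7 * diurnal + 5 + rng.normal(0, 1.0, (30, t.size))   # biased forecasts

# Step 1: PCA filter (keep the leading component of the observations).
mean = obs.mean(axis=0)
u, s, vt = np.linalg.svd(obs - mean, full_matrices=False)
obs_filt = mean + np.outer(u[:, 0] * s[0], vt[0])

# Step 2: regress the filtered observations on the archived forecasts.
x, y = fcst.ravel(), obs_filt.ravel()
a = np.cov(x, y)[0, 1] / np.var(x)
b = y.mean() - a * x.mean()
corrected = a * fcst + b

bias_raw = float(abs((fcst - diurnal).mean()))
bias_cor = float(abs((corrected - diurnal).mean()))
print(bias_cor < bias_raw)  # the systematic diurnal bias is largely removed
```

The pre-regression filtering matters because regressing against raw, noisy observations would weaken the fitted relationship; filtering first is what the abstract means by "strengthens regressions."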

  1. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework
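The standardized control and description functions described above can be sketched as a minimal interface. This is an illustrative toy, not the actual CSDMS or ESMF API: the real Basic Model Interface defines a much larger set of functions, and all names below are simplified stand-ins.

```python
class HeatModel:
    """Toy process model with a BMI-style interface (illustrative names only)."""

    # --- control functions: a caller (e.g. a framework) drives the model ---
    def initialize(self, dt=0.1, n=10):
        self.dt, self.t = dt, 0.0
        self.temperature = [0.0] * n

    def update(self):
        # Advance the state variables by one time step (simple relaxation to 1.0)
        self.temperature = [v + self.dt * (1.0 - v) for v in self.temperature]
        self.t += self.dt

    def finalize(self):
        self.temperature = None

    # --- description functions: the framework queries the model ---
    def get_output_var_names(self):
        return ["surface_temperature"]

    def get_var_units(self, name):
        return {"surface_temperature": "K"}[name]

    def get_time_step(self):
        return self.dt

# A coupling framework can control the model without knowing its internals
m = HeatModel()
m.initialize()
for _ in range(5):
    m.update()
m_units = m.get_var_units("surface_temperature")
```

Because every model answers the same queries (variable names, units, time step), a framework can compare the answers across models and mediate differences automatically, which is the point the record makes.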

  2. Argumentation, Dialogue Theory, and Probability Modeling: Alternative Frameworks for Argumentation Research in Education

    ERIC Educational Resources Information Center

    Nussbaum, E. Michael

    2011-01-01

    Toulmin's model of argumentation, developed in 1958, has guided much argumentation research in education. However, argumentation theory in philosophy and cognitive science has advanced considerably since 1958. There are currently several alternative frameworks of argumentation that can be useful for both research and practice in education. These…

  3. A Model Driven Framework to Address Challenges in a Mobile Learning Environment

    ERIC Educational Resources Information Center

    Khaddage, Ferial; Christensen, Rhonda; Lai, Wing; Knezek, Gerald; Norris, Cathie; Soloway, Elliot

    2015-01-01

    In this paper a review of the pedagogical, technological, policy and research challenges and concepts underlying mobile learning is presented, followed by a brief description of categories of implementations. A model Mobile learning framework and dynamic criteria for mobile learning implementations are proposed, along with a case study of one site…

  4. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  5. MODELING FRAMEWORK FOR EVALUATING SEDIMENTATION IN STREAM NETWORKS: FOR USE IN SEDIMENT TMDL ANALYSIS

    EPA Science Inventory

    A modeling framework that can be used to evaluate sedimentation in stream networks is described. This methodology can be used to determine sediment Total Maximum Daily Loads (TMDLs) in sediment impaired waters, and provide the necessary hydrodynamic and sediment-related data t...

  6. A Supervisory Issue When Utilizing the ASCA National Model Framework in School Counseling

    ERIC Educational Resources Information Center

    Bryant-Young, Necole; Bell, Catherine A.; Davis, Kalena M.

    2014-01-01

    The authors discuss a supervisory issue, in that, the ASCA National Model: A Framework for School Counseling Programs does not emphasize on-going supervision where ethical expectations of supervisors and supervisees in a school setting are clearly defined. Subsequently, the authors highlight supervisor expectations stated with the ASCA National…

  7. PIRPOSAL Model of Integrative STEM Education: Conceptual and Pedagogical Framework for Classroom Implementation

    ERIC Educational Resources Information Center

    Wells, John G.

    2016-01-01

    The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…

  8. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  9. Leading a New Pedagogical Approach to Australian Curriculum Mathematics: Using the Dual Mathematical Modelling Cycle Framework

    ERIC Educational Resources Information Center

    Lamb, Janeen; Kawakami, Takashi; Saeki, Akihiko; Matsuzaki, Akio

    2014-01-01

    The aim of this study was to investigate the use of the "dual mathematical modelling cycle framework" as one way to meet the espoused goals of the Australian Curriculum Mathematics. This study involved 23 Year 6 students from one Australian primary school who engaged in an "Oil Tank Task" that required them to develop two…

  10. Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models

    ERIC Educational Resources Information Center

    Dawson, Phillip

    2014-01-01

    More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…

  11. Map Resource Packet: Course Models for the History-Social Science Framework, Grade Seven.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    This packet of maps is an auxiliary resource to the "World History and Geography: Medieval and Early Modern Times. Course Models for the History-Social Science Framework, Grade Seven." The set includes: outline, precipitation, and elevation maps; maps for locating key places; landform maps; and historical maps. The list of maps are…

  12. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    EPA Science Inventory

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  13. Coupled model of INM-IO global ocean model, CICE sea ice model and SCM OIAS framework

    NASA Astrophysics Data System (ADS)

Bayburin, Ruslan; Ibrayev, Rashit; Ushakov, Konstantin; Kalmykov, Vladimir; Dyakonov, Gleb

    2015-04-01

The status of a coupled Arctic ocean-sea ice model is presented. The model consists of the high-resolution INM IO global ocean component, the Los Alamos National Laboratory CICE sea ice model, and the SCM OIAS framework for coupled ocean-ice-atmosphere-land modeling on massively parallel architectures. The model is currently under development at the Institute of Numerical Mathematics (INM), the Hydrometeorological Center (HMC), and the P.P. Shirshov Institute of Oceanology (IO), and is aimed at modeling the intra-annual variability of hydrodynamics in the Arctic. The computational characteristics of the world ocean-sea ice coupled model governed by SCM OIAS are presented. The model is parallelized using MPI technologies and can currently use up to 5000 cores efficiently. Details of the programming implementation, the computational configuration, and the parametrization of physical phenomena are analyzed in terms of the intercoupling complex. Results of a five-year computational experiment on the evolution of sea ice, snow, and ocean state in the Arctic region, on a tripole grid with a horizontal resolution of 3-5 kilometers and forced by a repeating "normal" annual cycle of atmospheric fields taken from the CORE1 experiment database, are presented and analyzed in terms of the vorticity state and the expansion of warm Atlantic water.

  14. Groundwater modelling in decision support: reflections on a unified conceptual framework

    NASA Astrophysics Data System (ADS)

    Doherty, John; Simmons, Craig T.

    2013-11-01

Groundwater models are commonly used as a basis for environmental decision-making. There has been discussion and debate in recent times regarding the issue of model simplicity and complexity, and this paper contributes to that ongoing discourse. The selection of an appropriate level of model structural and parameterization complexity is not a simple matter. Although the metrics on which such selection should be based are simple, there are many competing, and often unquantifiable, considerations which must be taken into account as these metrics are applied. A unified conceptual framework is introduced and described which is intended to underpin groundwater modelling in decision support, with a direct focus on matters regarding model simplicity and complexity.

  15. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    SciTech Connect

    Ackerman, Thomas P.

    2015-03-01

The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a 2-moment microphysics scheme rather than the single-moment scheme used in all MMF runs to date. The technical report and associated documents describe the results of testing the cloud-resolving model with fixed boundary conditions and evaluating the model results against data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  16. TLS and photogrammetry for the modeling of a historic wooden framework

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, M.

    2012-04-01

The building studied is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the XVIth century until the French Revolution. Its architecture represents the Renaissance style of the XVIth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005, the municipality of Andlau became the owner of this Seigneury, which is intended to house the future Heritage Interpretation Center (HIC); a museum is also going to be created there. Three levels of the attic of this building are going to be restored and insulated; the historic framework will thus be masked, and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework to support diagnostics and to consolidate knowledge of this type of historic framework. Beyond virtual visualization, we provided other applications, in particular an accurate 3D model of the framework for animations, as a foundation for a historical information system, and for supplying the future museum and HIC with digital data. The project contains different phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration into a historic information system. All levels of the attic were acquired: a Trimble GX 3D scanner, and partially a Trimble CX scanner, were used, in particular for the acquisition of data in the highest part of the framework. The various scans were directly georeferenced in the field thanks to control points, then merged into a unique point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury. The purpose of the project was to supply a 3D model allowing the creation of scenographies and interactive

  17. A multi-scale modeling framework for instabilities of film/substrate systems

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Potier-Ferry, Michel

    2016-01-01

    Spatial pattern formation in stiff thin films on soft substrates is investigated from a multi-scale point of view based on a technique of slowly varying Fourier coefficients. A general macroscopic modeling framework is developed and then a simplified macroscopic model is derived. The model incorporates Asymptotic Numerical Method (ANM) as a robust path-following technique to trace the post-buckling evolution path and to predict secondary bifurcations. The proposed multi-scale finite element framework allows sinusoidal and square checkerboard patterns as well as their bifurcation portraits to be described from a quantitative standpoint. Moreover, it provides an efficient way to compute large-scale instability problems with a significant reduction of computational cost compared to full models.

  18. An approximate framework for quantum transport calculation with model order reduction

    SciTech Connect

    Chen, Quan; Li, Jun; Yam, Chiyung; Zhang, Yu; Wong, Ngai; Chen, Guanhua

    2015-04-01

    A new approximate computational framework is proposed for computing the non-equilibrium charge density in the context of the non-equilibrium Green's function (NEGF) method for quantum mechanical transport problems. The framework consists of a new formulation, called the X-formulation, for single-energy density calculation based on the solution of sparse linear systems, and a projection-based nonlinear model order reduction (MOR) approach to address the large number of energy points required for large applied biases. The advantages of the new methods are confirmed by numerical experiments.
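Projection-based model order reduction, the MOR ingredient named in this record, can be illustrated generically. The sketch below is not the paper's X-formulation or its NEGF machinery; it only shows the underlying idea, which is to build an orthonormal basis from a few sampled full-order solutions and then solve a small Galerkin-projected system at the many other parameter (energy) points.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Stand-in for a large (here dense, in practice sparse) energy-dependent operator
A0 = np.diag(np.arange(1.0, n + 1.0))
b = rng.normal(size=n)

def full_solve(E):
    return np.linalg.solve(A0 + E * np.eye(n), b)

# Offline stage: snapshot solutions at a few sample energies span a reduced basis
snapshots = np.column_stack([full_solve(E) for E in (0.5, 1.5, 3.0)])
V, _ = np.linalg.qr(snapshots)          # orthonormal basis, shape (n, 3)

def reduced_solve(E):
    # Online stage: Galerkin projection onto V yields a 3x3 system instead of n x n
    Ar = V.T @ (A0 + E * np.eye(n)) @ V
    return V @ np.linalg.solve(Ar, V.T @ b)

x_full = full_solve(1.0)
x_red = reduced_solve(1.0)
rel_err = float(np.linalg.norm(x_full - x_red) / np.linalg.norm(x_full))
```

The online cost no longer scales with the number of energy points times the full system size, which is the motivation the record gives for applying MOR to large applied biases.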

  19. Following the genes: a framework for animal modeling of psychiatric disorders

    PubMed Central

    2011-01-01

    The number of individual cases of psychiatric disorders that can be ascribed to identified, rare, single mutations is increasing with great rapidity. Such mutations can be recapitulated in mice to generate animal models with direct etiological validity. Defining the underlying pathogenic mechanisms will require an experimental and theoretical framework to make the links from mutation to altered behavior in an animal or psychopathology in a human. Here, we discuss key elements of such a framework, including cell type-based phenotyping, developmental trajectories, linking circuit properties at micro and macro scales and definition of neurobiological phenotypes that are directly translatable to humans. PMID:22078115

  20. A mathematical framework for agent based models of complex biological networks.

    PubMed

    Hinkelmann, Franziska; Murrugarra, David; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2011-07-01

    Agent-based modeling and simulation is a useful method to study biological phenomena in a wide range of fields, from molecular biology to ecology. Since there is currently no agreed-upon standard way to specify such models, it is not always easy to use published models. Also, since model descriptions are not usually given in mathematical terms, it is difficult to bring mathematical analysis tools to bear, so that models are typically studied through simulation. In order to address this issue, Grimm et al. proposed a protocol for model specification, the so-called ODD protocol, which provides a standard way to describe models. This paper proposes an addition to the ODD protocol which allows the description of an agent-based model as a dynamical system, which provides access to computational and theoretical tools for its analysis. The mathematical framework is that of algebraic models, that is, time-discrete dynamical systems with algebraic structure. It is shown by way of several examples how this mathematical specification can help with model analysis. This mathematical framework can also accommodate other model types such as Boolean networks and the more general logical models, as well as Petri nets.

  1. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  2. Representing natural and manmade drainage systems in an earth system modeling framework

    SciTech Connect

    Li, Hongyi; Wu, Huan; Huang, Maoyi; Leung, Lai-Yung R.

    2012-08-27

    Drainage systems can be categorized into natural or geomorphological drainage systems, agricultural drainage systems and urban drainage systems. They interact closely among themselves and with climate and human society, particularly under extreme climate and hydrological events such as floods. This editorial articulates the need to holistically understand and model drainage systems in the context of climate change and human influence, and discusses the requirements and examples of feasible approaches to representing natural and manmade drainage systems in an earth system modeling framework.

  3. A Framework for Modeling Workflow Execution by an Interdisciplinary Healthcare Team.

    PubMed

    Kezadri-Hamiaz, Mounira; Rosu, Daniela; Wilk, Szymon; Kuziemsky, Craig; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The use of business workflow models in healthcare is limited because of insufficient capture of complexities associated with behavior of interdisciplinary healthcare teams that execute healthcare workflows. In this paper we present a novel framework that builds on the well-founded business workflow model formalism and related infrastructures and introduces a formal semantic layer that describes selected aspects of team dynamics and supports their real-time operationalization.

  4. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
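TMCMC itself is too involved for a short sketch, but its building block, a random-walk Metropolis chain targeting a tempered posterior exp(beta * log p), can be illustrated on a toy one-dimensional problem. The density, seed, and tuning values below are assumptions for illustration, not part of the Π4U framework.

```python
import math
import random

random.seed(42)

def log_post(theta):
    # Toy log-posterior: standard normal, a stand-in for an expensive
    # model-likelihood evaluation
    return -0.5 * theta * theta

def metropolis(n_steps, beta=1.0, step=1.0):
    """Random-walk Metropolis chain on the tempered density exp(beta * log_post)."""
    theta, chain = 0.0, []
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)
        log_alpha = beta * (log_post(prop) - log_post(theta))
        if random.random() < math.exp(min(0.0, log_alpha)):
            theta = prop
        chain.append(theta)
    return chain

chain = metropolis(20_000)
mean = sum(chain) / len(chain)
var = sum((t - mean) ** 2 for t in chain) / len(chain)
# For the standard normal target, the chain mean is near 0 and the variance near 1
```

In TMCMC, chains like this are run in parallel for a population of samples while beta is raised in stages from 0 to 1, which is what makes the method amenable to the massively parallel scheduling the record describes.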

  5. A Modeling Framework to Quantify Dilution Enhancement in Spatially Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe; Fiori, Aldo; Boso, Francesca; Bellin, Alberto

    2016-04-01

Solute dilution rates are strongly affected by the spatial fluctuations of the permeability. Current challenges consist of establishing a quantitative link between the statistical properties of the heterogeneous porous media and the concentration field. Proper quantification of solute dilution is crucial for the success of a remediation campaign and for risk assessment. In this work, we provide a modeling framework to quantify the dilution of a non-reactive solute; more precisely, we model the heterogeneity-induced dilution enhancement within a steady-state flow. Adopting the Lagrangian framework, we obtain semi-analytical solutions for the dilution index as a function of the structural parameters characterizing the permeability field. The solutions provided are valid for uniform-in-the-mean steady flow fields, a small injection source, and weak-to-mild heterogeneity in the log-permeability. Results show how the dilution enhancement of the solute plume depends on the statistical anisotropy ratio and the heterogeneity level of the porous medium. The modeling framework also captures the temporal evolution of the dilution rate at distinct time regimes, thus recovering previous results from the literature. Finally, the performance of the framework is verified with high-resolution numerical results and successfully tested against the Cape Cod field data.
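The dilution index used in this record (in the sense introduced by Kitanidis) can be computed numerically as E = exp(-∫ p ln p dx) with p = c / ∫ c dx. The sketch below evaluates it for 1D Gaussian plumes, where the analytic value is sigma * sqrt(2*pi*e); the grid and plume parameters are illustrative, not the paper's setup.

```python
import math

def dilution_index(conc, dx):
    """Kitanidis-type dilution index E = exp(-sum(p*ln p)*dx), p = c / (sum(c)*dx)."""
    mass = sum(conc) * dx
    p = [c / mass for c in conc]
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0.0) * dx
    return math.exp(entropy)

# 1D Gaussian plumes at two "times": a wider plume is a more diluted plume
dx = 0.01
xs = [i * dx - 5.0 for i in range(1001)]

def gaussian_plume(sigma):
    return [math.exp(-x * x / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))
            for x in xs]

early = dilution_index(gaussian_plume(0.2), dx)  # analytic: 0.2 * sqrt(2*pi*e)
late = dilution_index(gaussian_plume(0.8), dx)   # analytic: 0.8 * sqrt(2*pi*e)
```

The index grows as the plume spreads, which is the quantity whose heterogeneity-enhanced growth the framework's semi-analytical solutions describe.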

  6. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    SciTech Connect

    Hadjidoukas, P.E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  7. An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.

    PubMed

    Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong

    2016-01-01

    With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.

  8. An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis

    PubMed Central

    Wang, Zi; Ramsey, Benjamin J.; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong

    2016-01-01

    With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations. PMID:27851808

  9. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed over the last decades and applied under different hydrological conditions. However, in most cases, studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account all the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form), and model structures. The framework can be used iteratively to optimize further monitoring activities and improve the performance of the model. In these particular applications, the results show how the sources of uncertainty are specific to each process considered. The influence of the input data as
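The Sobol method underlying the probabilistic framework described above can be illustrated with a pick-freeze Monte Carlo estimator of a first-order sensitivity index. The two-input toy model below is a stand-in, not SWAP or SHETRAN; uniform inputs on [0, 1] and the linear response are assumptions chosen so the exact answer is known.

```python
import random

random.seed(0)

def model(x1, x2):
    # Toy stand-in for an expensive hydrological model: the output is driven
    # mostly by x1 and only weakly by x2
    return 4.0 * x1 + 0.5 * x2

def first_order_sobol(n=100_000):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index of x1."""
    a = [(random.random(), random.random()) for _ in range(n)]
    b = [(random.random(), random.random()) for _ in range(n)]
    ya = [model(x1, x2) for x1, x2 in a]
    # Freeze x1 from sample A, redraw x2 from sample B
    yab = [model(a[i][0], b[i][1]) for i in range(n)]
    ma = sum(ya) / n
    mab = sum(yab) / n
    var = sum((y - ma) ** 2 for y in ya) / n
    cov = sum((ya[i] - ma) * (yab[i] - mab) for i in range(n)) / n
    return cov / var

s1 = first_order_sobol()
# Analytic value for this model: (16/12) / (16.25/12) ≈ 0.985, so x1 dominates
```

The same pick-freeze construction extends to each input (or spatially distributed parameter group) in turn, which is how a Sobol analysis apportions output variance among input data, parameters, and model structure.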

  10. An Automated Application Framework to Model Disordered Materials Based on a High Throughput First Principles Approach

    NASA Astrophysics Data System (ADS)

    Oses, Corey; Yang, Kesong; Curtarolo, Stefano; Duke Univ Collaboration; UC San Diego Collaboration

    Predicting material properties of disordered systems remains a long-standing and formidable challenge in rational materials design. To address this issue, we introduce an automated software framework capable of modeling partial occupation within disordered materials using a high-throughput (HT) first principles approach. At the heart of the approach is the construction of supercells containing a virtually equivalent stoichiometry to the disordered material. All unique supercell permutations are enumerated and material properties of each are determined via HT electronic structure calculations. In accordance with a canonical ensemble of supercell states, the framework evaluates ensemble average properties of the system as a function of temperature. As proof of concept, we examine the framework's final calculated properties of a zinc chalcogenide (ZnS1-xSex), a wide-gap oxide semiconductor (MgxZn1-xO), and an iron alloy (Fe1-xCux) at various stoichiometries.
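The canonical-ensemble averaging step described above can be sketched in a few lines; the energies, degeneracies and band-gap values below are invented placeholders, not output of the described framework:

```python
import math

# Hypothetical per-supercell results: energies in eV above the ground state,
# degeneracies from symmetry, and a property such as a band gap in eV.
energies     = [0.00, 0.03, 0.08]
degeneracies = [1, 4, 2]
band_gaps    = [2.10, 2.25, 2.40]

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def ensemble_average(T):
    """Boltzmann-weighted average of a property over supercell states."""
    weights = [g * math.exp(-e / (K_B * T))
               for g, e in zip(degeneracies, energies)]
    Z = sum(weights)  # canonical partition function over the supercells
    return sum(w * p for w, p in zip(weights, band_gaps)) / Z
```

At low temperature the average collapses onto the ground-state supercell's property; at high temperature it tends to the degeneracy-weighted mean, which is what makes the ensemble properties temperature dependent.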

  11. Evaluation framework for nursing education programs: application of the CIPP model.

    PubMed

    Singh, Mina D

    2004-01-01

    It is advised that all nursing education programs conduct program evaluations to address accountability requirements and to provide information for planning and guiding the delivery of the programs. Stufflebeam's CIPP Model, supported by triangulation of multiple modes of data collection, provides such a theoretical framework for evaluations. This article proposes a total CIPP evaluation framework for nursing education programs. While this evaluation framework is applicable to any nursing education program, it is particularly useful for collaborative nursing programs, as it allows a full assessment of each partner in its context. Under the direction of this author, the York-Seneca-Georgian-Durham collaborative BScN Program Evaluation Committee in Ontario developed and utilized a CIPP process evaluation.

  12. A local effect model-based interpolation framework for experimental nanoparticle radiosensitisation data.

    PubMed

    Brown, Jeremy M C; Currell, Fred J

    2017-01-01

    A local effect model (LEM)-based framework capable of interpolating nanoparticle-enhanced photon-irradiated clonogenic cell survival fraction measurements as a function of nanoparticle concentration was developed and experimentally benchmarked for gold nanoparticle (AuNP)-doped bovine aortic endothelial cells (BAECs) under superficial kilovoltage X-ray irradiation. For three different superficial kilovoltage X-ray spectra, the BAEC survival fraction response was predicted for two different AuNP concentrations and compared to experimental data. The ability of the developed framework to predict the cell survival fraction trends is analysed and discussed. This developed framework is intended to fill the existing gaps in individual cell line response as a function of NP concentration under photon irradiation and to assist the scientific community in planning future pre-clinical trials of high-Z nanoparticle-enhanced photon radiotherapy.

  13. Development of an integrated modelling framework: comparing client-server and demand-driven control flow for model execution

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Karssenberg, Derek; de Jong, Kor; de Kok, Jean-Luc; de Jong, Steven M.

    2014-05-01

    The construction of hydrological models at the catchment or global scale depends on the integration of component models representing various environmental processes, often operating at different spatial and temporal discretisations. A flexible construction of spatio-temporal model components, a means to specify aggregation or disaggregation to bridge discretisation discrepancies, ease of coupling these into complex integrated models, and support for stochastic modelling and the assessment of model outputs are the desired functionalities for the development of integrated models. These functionalities are preferably combined into one modelling framework such that domain specialists can perform exploratory model development without the need to change their working environment. We implemented an integrated modelling framework in the Python programming language, providing support for 1) model construction and 2) model execution. The framework enables modellers to represent spatio-temporal processes or to specify spatio-temporal (dis)aggregation with map algebra operations provided by the PCRaster library. Map algebra operations can be used by the modeller to specify the exchange of data and therefore the coupling of components. The framework determines the control flow for the ordered execution based on the time steps and couplings of the model components given by the modeller. We implemented two different control flow mechanisms. First, a client-server approach is used, with a central entity controlling the execution of the component models and steering the data exchange. Second, a demand-driven approach is used that triggers the execution of a component model when data is requested by a coupled component model. We show that both control flow mechanisms allow for the execution of stochastic, multi-scale integrated models. We examine the implications of each control flow mechanism on the terminology used by the modeller to specify integrated models, and illustrate the

  14. A hydroeconomic modeling framework for optimal integrated management of forest and water

    NASA Astrophysics Data System (ADS)

    Garcia-Prats, Alberto; del Campo, Antonio D.; Pulido-Velazquez, Manuel

    2016-10-01

    Forests play a determinant role in the hydrologic cycle, with water being the most important ecosystem service they provide in semiarid regions. However, this contribution is usually neither quantified nor explicitly valued. The aim of this study is to develop a novel hydroeconomic modeling framework for assessing and designing the optimal integrated forest and water management for forested catchments. The optimization model explicitly integrates changes in water yield in the stands (increase in groundwater recharge) induced by forest management and the value of the additional water provided to the system. The model determines the optimal schedule of silvicultural interventions in the stands of the catchment in order to maximize the total net benefit in the system. Canopy cover and biomass evolution over time were simulated using growth and yield allometric equations specific for the species in Mediterranean conditions. Silvicultural operation costs according to stand density and canopy cover were modeled using local cost databases. Groundwater recharge was simulated using HYDRUS, calibrated and validated with data from the experimental plots. In order to illustrate the presented modeling framework, a case study was carried out in a planted pine forest (Pinus halepensis Mill.) located in south-western Valencia province (Spain). The optimized scenario increased groundwater recharge. This novel modeling framework can be used in the design of a "payment for environmental services" scheme in which water beneficiaries could contribute to fund and promote efficient forest management operations.

  15. Family Environment and Childhood Obesity: A New Framework with Structural Equation Modeling.

    PubMed

    Huang, Hui; Wan Mohamed Radzi, Che Wan Jasimah Bt; Salarzadeh Jenatabadi, Hashem

    2017-02-13

    The main purpose of the current article is to introduce a framework of the complexity of childhood obesity based on the family environment. A conceptual model that quantifies the relationships and interactions among parental socioeconomic status, family food security level, child's food intake and certain aspects of parental feeding behaviour is presented using the structural equation modeling (SEM) concept. Structural models are analysed in terms of the direct and indirect connections among latent and measurement variables that lead to the child weight indicator. To illustrate the accuracy, fit, reliability and validity of the introduced framework, real data collected from 630 families from Urumqi (Xinjiang, China) were considered. The framework includes two categories of data comprising the normal body mass index (BMI) range and obesity data. The comparison analysis between two models provides some evidence that in obesity modeling, obesity data must be extracted from the dataset and analysis must be done separately from the normal BMI range. This study may be helpful for researchers interested in childhood obesity modeling based on family environment.


  17. A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad D.; Robbins, Jeanne C.

    1999-01-01

    As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina

  18. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

    Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes of crop yield. For maintaining a stable food supply under abnormal weather as well as climate change, evaluating the year-to-year variations in crop productivity, rather than the mean changes, is more essential. We here propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy plot scale; we applied it to evaluate large-scale rice production while keeping the same model structure. Instead, we treated the parameters as stochastic variables. In order to let the model reproduce actual yields at the larger scale, model parameters were determined from the agricultural statistical data of each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference, and the MCMC (Markov Chain Monte Carlo) algorithm was used to numerically solve Bayes' theorem. For evaluating the year-to-year changes in rice growth and yield under this framework, we first iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted with the posterior PDFs. We will also present another example for maize productivity in China. The
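The Bayesian/MCMC ingredient described above (sample parameter posteriors, then form a posterior-weighted ensemble prediction) can be sketched minimally; the linear toy yield model, the weather index, and all numbers below are hypothetical and unrelated to PRYSBI's actual sub-models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: regional yield responding linearly to a weather index.
theta_true, sigma = 1.5, 0.3
w = rng.uniform(0.5, 1.5, size=40)              # weather index per year
y = theta_true * w + rng.normal(0, sigma, 40)   # observed statistical yields

def log_post(theta):
    # Flat prior, Gaussian likelihood -> log posterior up to a constant.
    return -0.5 * np.sum((y - theta * w) ** 2) / sigma ** 2

# Metropolis sampler for the posterior PDF of the parameter.
samples, theta = [], 1.0
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[5000:])  # discard burn-in

# Ensemble prediction: simulate with sampled parameters and average.
w_next = 1.2
yield_ensemble_mean = np.mean(samples * w_next)
```

The same pattern scales to a full crop model: the likelihood would be evaluated by running the process model, and the ensemble mean over posterior samples gives the year's predicted yield with parameter uncertainty folded in.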

  19. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework

    PubMed Central

    Fletcher, Rachel E.; Wells, Stephen A.; Leung, Ka Ming; Edwards, Peter P.; Sartbaeva, Asel

    2015-01-01

    Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves – as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si—O—Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the ‘rigid unit mode’ (RUM) model, and in real-space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the ‘flexibility window’ phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework. PMID:26634720

  20. Performance of default risk model with barrier option framework and maximum likelihood estimation: Evidence from Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, Heng-Chih; Wang, David

    2007-11-01

    We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted Merton, Z-score and ZETA models. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that the prediction accuracy of the model improves as the forecast horizon decreases. Finally, we assess the effect of estimated default risk on equity returns and find that default risk is able to explain equity returns, and that default risk is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.
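For context, the Merton baseline mentioned above maps asset value, debt and asset volatility into a default probability. A sketch of that standard model follows (not the paper's barrier-option extension, and with purely illustrative inputs):

```python
import math

def merton_default_prob(V, D, mu, sigma, T):
    """Probability that firm asset value V (drift mu, volatility sigma)
    ends below the debt level D at horizon T -- the standard Merton model."""
    dd = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return 0.5 * (1.0 + math.erf(-dd / math.sqrt(2.0)))  # Phi(-distance to default)

# Illustrative firm: assets twice the debt, modest asset volatility.
pd_1y = merton_default_prob(100.0, 50.0, 0.05, 0.2, 1.0)
pd_5y = merton_default_prob(100.0, 50.0, 0.05, 0.2, 5.0)
```

Consistent with the horizon effect reported in the abstract, a healthy firm's default probability shrinks as the horizon shortens (pd_1y < pd_5y); the barrier-option variant replaces the terminal condition with a first-passage barrier.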

  1. A general science-based framework for dynamical spatio-temporal models

    USGS Publications Warehouse

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

    Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been on the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic
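A linear, first-order Markovian dynamic spatio-temporal model of the kind discussed can be sketched with a PDE-motivated propagator rather than a free transition matrix; the grid size, diffusion rate and noise level below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Science-based parameterization: the n x n propagator M comes from one
# explicit finite-difference step of the heat (diffusion) equation, so the
# dynamics carry a single parameter alpha instead of n*n free entries.
n, alpha, tau = 50, 0.2, 0.005
lap = (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
       - 2 * np.eye(n))          # discrete 1-D Laplacian
M = np.eye(n) + alpha * lap      # propagator: I + alpha * Laplacian

Y = np.zeros(n)
Y[n // 2] = 1.0                  # initial pulse at the centre of the domain
for _ in range(50):
    Y = M @ Y + tau * rng.standard_normal(n)  # Y_t = M Y_{t-1} + eta_t
```

The pulse spreads and decays as the PDE dictates, while the additive noise term supplies the stochastic component; in a hierarchical treatment alpha and tau would receive priors and be estimated from data.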

  2. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  3. Development and Application of A Coupled Land-Atmosphere Modeling Framework with Multi-Physics Options

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Niu, G.; Jiang, X.

    2009-12-01

    In this paper we develop a single integrated mesoscale meteorological modeling framework that intimately couples a land surface model and an atmospheric model, both of which are equipped with multi-parameterization options. This framework should enable process-based ensemble weather predictions, identification of optimal combinations of process parameterization schemes, identification of critical processes controlling the coupling strength between the land surface and the atmosphere, and quantification of uncertainties in using regional meteorological models to extract high-resolution climate information for policy decision making. We build on the existing Weather Research and Forecasting (WRF) atmospheric model, which already has multiple parameterization options for atmospheric processes such as convection, radiation, planetary boundary layer, and microphysics. One of our key model development efforts is to couple the WRF model with our newly developed, ensemble representation of the land surface, i.e., the Noah land surface model that was first enhanced with biophysical and hydrological realism and then equipped with multi-parameterization options (Noah-MP) for a wide spectrum of physical and ecological processes. The Noah-MP LSM is capable of generating thousands of process-based combinations of land surface parameterization schemes, as opposed to the traditional approach (e.g. BATS, SiB or VIC) that utilizes only a single combination. Offline Noah-MP tests show great potential of the model in ensemble hydrological predictions. We perform an analysis of the sensitivity to different parameterizations over the conterminous United States using the single integrated mesoscale modeling framework described above. To prove the concept, we present an ensemble of multi-day integrations using the model at 30-km resolution with varying physical representations for both the land surface (runoff process) and the atmosphere (convective process). The lateral boundary

  4. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  5. A theoretical and computational setting for a geometrically nonlinear gradient damage modelling framework

    NASA Astrophysics Data System (ADS)

    Nedjar, B.

    The present work deals with the extension to the geometrically nonlinear case of recently proposed ideas on elastic- and elastoplastic-damage modelling frameworks within the infinitesimal theory. The particularity of these models is that the damage part of the modelling involves the gradient of the damage quantity, which, together with the equations of motion, ensues from a new formulation of the principle of virtual power. It is shown how the thermodynamics of irreversible processes is crucial in characterizing the dissipative phenomena and in setting convenient forms for the constitutive relations. On the numerical side, we discuss the problem of numerically integrating these equations, and the implementation within the context of the finite element method is described in detail. Finally, we present a set of representative numerical simulations to illustrate the effectiveness of the proposed framework.

  6. A comprehensive mathematical framework for modeling intestinal smooth muscle cell contraction with applications to intestinal edema.

    PubMed

    Young, Jennifer; Ozisik, Sevtap; Riviere, Beatrice; Shamim, Muhammad

    2015-04-01

    The contraction of intestinal smooth muscle cells (ISMCs) involves many coordinated biochemical and mechanical processes. In this work, we present a framework for modeling ISMC contractility that begins with chemical models of calcium dynamics, continues with myosin light chain phosphorylation and force generation, and ends with a cell model of the ISMC undergoing contraction-relaxation. The motivation for developing this comprehensive framework is to study the effects of edema (excess fluid build-up in the muscle tissue) on ISMC contractility. The hypothesis is that more fluid equates to dilution of an external stimulus, eventually leading to reduced contractility. We compare our results to experimental data collected from normal versus edematous intestinal muscle tissue.

  7. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    NASA Astrophysics Data System (ADS)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  8. Crowd modeling framework using fast head detection and shape-aware matching

    NASA Astrophysics Data System (ADS)

    Zhou, Tao; Yang, Jie; Loza, Artur; Bhaskar, Harish; Al-Mualla, Mohammed

    2015-03-01

    A framework for crowd modeling using a combination of multiple kernel learning (MKL)-based fast head detection and shape-aware matching is proposed. First, the MKL technique is used to train a classifier for head detection using a combination of the histogram of oriented gradients and local binary patterns feature sets. Further, the head detection process is accelerated by implementing the classification procedure only at those spatial locations in the image where the gradient points overlap with moving objects. Such moving objects are determined using an adaptive background subtraction technique. Finally, the crowd is modeled as a deformable shape through connected boundary points (head detections) and matched with the subsequent detection from the next frame in a shape-aware manner. Experimental results obtained from crowded videos show that the proposed framework, while being characterized by a low computational load, performs better than other state-of-the-art techniques and results in reliable crowd modeling.

  9. A general ecophysiological framework for modelling the impact of pests and pathogens on forest ecosystems.

    PubMed Central

    Dietze, Michael C; Matthes, Jaclyn Hatala

    2014-01-01

    Forest insects and pathogens (FIPs) have enormous impacts on community dynamics, carbon storage and ecosystem services; however, ecosystem modelling of FIPs is limited due to their variability in severity and extent. We present a general framework for modelling FIP disturbances through their impacts on tree ecophysiology. Five pathways are identified as the basis for functional groupings: increases in leaf, stem and root turnover, and reductions in phloem and xylem transport. A simple ecophysiological model was used to explore the sensitivity of forest growth, mortality and ecosystem fluxes to varying outbreak severity. Across all pathways, low infection was associated with growth reduction but limited mortality. Moderate infection led to individual tree mortality, whereas high levels led to stand-level die-offs delayed over multiple years. Delayed mortality is consistent with observations and critical for capturing biophysical, biogeochemical and successional responses. This framework enables novel predictions under present and future global change scenarios. PMID:25168168

  10. A dynamic hybrid subgrid-scale modeling framework for large eddy simulations

    NASA Astrophysics Data System (ADS)

    Maulik, Romit; San, Omer

    2016-11-01

    We put forth a dynamic modeling framework for sub-grid parameterization of large eddy simulation of turbulent flows, based upon the use of the approximate deconvolution (AD) procedure to compute the eddy viscosity constant self-adaptively from the resolved flow quantities. In our proposed framework, the test filtering process of the standard dynamic model is replaced by the AD procedure, and a posteriori error analysis is performed. The robustness of the model has been tested on the Burgers, Kraichnan, and Kolmogorov turbulence problems. Our numerical assessments for solving these canonical decaying turbulence problems show that the proposed approach could be used as a viable tool to address the turbulence closure problem due to its flexibility.

  11. CMA-HT: a crowd motion analysis framework based on heat-transfer analog model

    NASA Astrophysics Data System (ADS)

    Liang, Yu; Melvin, William; Sritharan, Subramania I.; Fernandes, Shane; Barker, Darrell

    2012-06-01

    Crowd motion analysis covers the detection, tracking, recognition, and behavior interpretation of a target group from persistent surveillance video data. This project is dedicated to investigating a crowd motion analysis system based on a heat-transfer-analog model (denoted CMA-HT for simplicity) and a generic modeling and simulation framework describing crowd motion behavior. CMA-HT is formulated by coupling the statistical analysis of a crowd's historical behavior at a given location, a geographic information system, and crowd motion dynamics. The mathematical derivation of the CMA-HT model and the innovative methods involved in the framework's implementation are discussed in detail. Using sample video data collected by Central Florida University as a benchmark, CMA-HT is employed to measure and identify anomalous personnel or group responses in the video.

  12. A new theoretical framework for modeling respiratory protection based on the beta distribution.

    PubMed

    Klausner, Ziv; Fattal, Eyal

    2014-08-01

    The problem of modeling respiratory protection is well known and has been dealt with extensively in the literature. Often the efficiency of respiratory protection is quantified in terms of penetration, defined as the proportion of an ambient contaminant concentration that penetrates the respiratory protection equipment. Typically, the penetration modeling framework in the literature is based on the assumption that penetration measurements follow the lognormal distribution. However, the analysis in this study leads to the conclusion that the lognormal assumption is not always valid, making it less adequate for analyzing respiratory protection measurements. This work presents a formulation of the problem from first principles, leading to a stochastic differential equation whose solution is the probability density function of the beta distribution. The data of respiratory protection experiments were reexamined, and indeed the beta distribution was found to provide a better fit to the data than the lognormal. We conclude with a suggestion for a new theoretical framework for modeling respiratory protection.
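    Because penetration is a proportion on (0, 1), a beta distribution can be fitted directly by matching sample moments. The sketch below uses hypothetical penetration values, not the study's data, and a method-of-moments fit rather than whatever estimator the authors used.

```python
# Method-of-moments fit of a beta(a, b) distribution to penetration data
# on (0, 1).  The data below are hypothetical, for illustration only.
def beta_method_of_moments(xs):
    """Estimate beta(a, b) parameters from the sample mean and variance."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / (n - 1)
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

penetration = [0.02, 0.05, 0.03, 0.08, 0.04, 0.06, 0.02, 0.07, 0.05, 0.03]
a, b = beta_method_of_moments(penetration)
mean_fit = a / (a + b)   # the beta mean reproduces the sample mean by construction
```

    The fitted mean `a / (a + b)` equals the sample mean exactly, a convenient sanity check on any moment-based fit.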

  13. A Framework for Dealing With Uncertainty due to Model Structure Error

    NASA Astrophysics Data System (ADS)

    van der Keur, P.; Refsgaard, J.; van der Sluijs, J.; Brown, J.

    2004-12-01

    Although uncertainty about structures of environmental models (conceptual uncertainty) has often been recognised to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, fails to adequately sample the relevant space of plausible models. As such, it is prone to modelling bias and underestimation of model uncertainty. In this paper we review a range of strategies for assessing structural uncertainties. The existing strategies fall into two categories depending on whether field data are available for the variable of interest. Most research attention has until now been devoted to situations where model structure uncertainties can be assessed directly on the basis of field data. This corresponds to a situation of `interpolation'. However, in many cases environmental models are used for `extrapolation' beyond the situation and the field data available for calibration. A framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. The key elements are the use of alternative conceptual models, assessment of their pedigree, and expert elicitation of the adequacy of the sampled conceptual models to represent the space of plausible models. Keywords: model error, model structure, conceptual uncertainty, scenario analysis, pedigree

  14. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    PubMed

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  15. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    PubMed Central

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  16. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    of plant transpiration by root-zone produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework describing how this process knowledge could be implemented in root zone simulation models that do not resolve small-scale processes.

  17. Modelling framework for dynamic interaction between multiple pedestrians and vertical vibrations of footbridges

    NASA Astrophysics Data System (ADS)

    Venuti, Fiammetta; Racic, Vitomir; Corbetta, Alessandro

    2016-09-01

    After 15 years of active research on the interaction between moving people and civil engineering structures, there is still a lack of reliable models and adequate design guidelines pertinent to the vibration serviceability of footbridges under multiple pedestrians. There are three key issues that a new generation of models should urgently address: pedestrians' "intelligent" interaction with the surrounding people and environment, the effect of human bodies on the dynamic properties of the unoccupied structure, and inter-subject and intra-subject variability of pedestrian walking loads. This paper presents a modelling framework for human-structure interaction in the vertical direction which addresses all three issues. The framework comprises two main models: (1) a microscopic model of multiple-pedestrian traffic that simulates the time-varying position and velocity of each individual pedestrian on the footbridge deck, and (2) a coupled dynamic model of a footbridge and multiple walking pedestrians. The footbridge is modelled as an SDOF system having the dynamic properties of the unoccupied structure. Each walking pedestrian in a group or crowd is modelled as an SDOF system with an adjacent stochastic vertical force that moves along the footbridge following the trajectory and gait pattern simulated by the microscopic model of pedestrian traffic. Performance of the suggested modelling framework is illustrated by a series of simulated vibration responses of a virtual footbridge due to light, medium and dense pedestrian traffic. Moreover, the Weibull distribution is shown to fit well the probability density function of the local peaks in the acceleration response. Given the inherent randomness of the crowd, this makes it possible to determine the probability of exceeding any given acceleration value of the occupied bridge.
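    The moving-load side of such a framework can be sketched with a single pedestrian crossing an SDOF footbridge, the harmonic walking force weighted by the mode shape at the walker's position. All parameter values below are illustrative assumptions, and the local acceleration peaks are extracted as the quantities the abstract fits with a Weibull distribution.

```python
# Illustrative SDOF footbridge driven by one pedestrian's harmonic vertical
# force moving along the span (semi-implicit Euler time stepping).
# Parameters (span L, speed v, pacing f_p, modal mass m, etc.) are assumptions.
import math

def simulate_response(L=50.0, v=1.4, f_p=2.0, m=2.0e5, f_n=2.0, zeta=0.005,
                      F0=280.0, dt=0.002):
    """Integrate m x'' + c x' + k x = F0 sin(2 pi f_p t) * phi(position)."""
    wn = 2 * math.pi * f_n
    k, c = m * wn ** 2, 2 * zeta * m * wn
    x = xd = 0.0
    accels = []
    t, T = 0.0, L / v                         # time for the walker to cross
    while t < T:
        phi = math.sin(math.pi * v * t / L)   # half-sine mode shape at walker
        F = F0 * math.sin(2 * math.pi * f_p * t) * phi
        xdd = (F - c * xd - k * x) / m
        xd += xdd * dt
        x += xd * dt
        accels.append(xdd)
        t += dt
    return accels

acc = simulate_response()
# local positive peaks of the acceleration response (Weibull-fit candidates)
peaks = [acc[i] for i in range(1, len(acc) - 1)
         if acc[i] > acc[i - 1] and acc[i] > acc[i + 1] and acc[i] > 0]
```

    With pacing rate matching the natural frequency, the response builds toward the resonant limit, and the peak sample is what a serviceability check would feed into a probability-of-exceedance calculation.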

  18. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.

    2011-01-01

    The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.

  19. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…

  20. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy

    NASA Astrophysics Data System (ADS)

    Ramos-Méndez, J.; Perl, J.; Schümann, J.; Shin, J.; Paganetti, H.; Faddegon, B.

    2015-07-01

    The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose response models were implemented: the Lyman-Kutcher-Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson model for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV) and 52.1% (in Triangle) were found for the critical element, critical volume and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and the distal region. The DVHs, DVH point spacing, and results of the organ effect models are provided.
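    The Lyman-Kutcher-Burman (LKB) model named above has a standard closed form: a generalized equivalent uniform dose (gEUD) computed from the DVH, mapped through a normal CDF. The sketch below uses illustrative parameter values and a made-up DVH, not those of TG-166 or this paper.

```python
# Standard LKB NTCP evaluation from a differential DVH.
# Parameters n, m, TD50 and the DVH below are illustrative assumptions.
import math

def lkb_ntcp(dvh, n=0.1, m=0.15, td50=65.0):
    """dvh: list of (dose_Gy, volume_fraction) pairs; fractions sum to 1.
    gEUD = (sum_i v_i * D_i^(1/n))^n ;  NTCP = Phi((gEUD - TD50)/(m * TD50))."""
    gEUD = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    t = (gEUD - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

dvh = [(70.0, 0.2), (60.0, 0.5), (40.0, 0.3)]
p = lkb_ntcp(dvh)
p_uniform_80 = lkb_ntcp([(80.0, 1.0)])   # hotter uniform dose -> higher NTCP
```

    As the abstract's sensitivity findings suggest, the gEUD power mean makes the result quite sensitive to how finely the DVH is sampled, especially for small `n`.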

  1. Extension of the FMCFM framework to the neoclassical, paleoclassical, and gyrokinetic transport models

    NASA Astrophysics Data System (ADS)

    Vadlamani, Srinath; Pankin, Alexei; Kruger, Scott; Pletzer, Alex; Carlsson, Johan; Cary, John; Muszala, Stefan; Fahey, Mark; Candy, Jeff

    2008-11-01

    Recent improvements to the Framework for Modernization and Componentization of Fusion Modules (FMCFM) are described. The FMCFM framework provides a common interface to transport modules and libraries such as those in the National Transport Code Collaboration (NTCC) module library [1]. The FMCFM interface facilitates access to the transport models from integrated modeling codes and allows interlanguage interfaces using Babel [2]. The new interface to transport modules has been applied to the GLF23 and MMM95 transport models. Current work on incorporating the neoclassical NCLASS and Kapisn codes, a paleoclassical transport model [3], and the GYRO nonlinear tokamak microturbulence package will be presented. A new flux mapping tool that is being included in the FMCFM project to interface with equilibrium data generated by legacy equilibrium solvers such as EFIT and TEQ is also described in this report. The functionality is demonstrated in the Framework Application for Core-Edge Transport Simulations (FACETS) project. [1] A. H. Kritz et al., Comp. Phys. Comm. 164, 108 (2004). [2] G. Kumfert et al., LLNL Tech Report UCRL-CONF-222279. [3] J. D. Callen, Nucl. Fusion 45, 1120 (2005).

  2. Developing policy analytics for public health strategy and decisions-the Sheffield alcohol policy model framework.

    PubMed

    Brennan, Alan; Meier, Petra; Purshouse, Robin; Rafia, Rachid; Meng, Yang; Hill-Macmanus, Daniel

    This paper sets out the development of a methodological framework for the detailed evaluation of public health strategies for alcohol harm reduction, designed to meet the needs of UK policy makers. Alcohol is known to cause substantial harms, and controlling its affordability and availability are effective policy options. Evaluating the impact on consumers, health services, crime, employers and industry requires analysis and synthesis of a variety of public and commercial data sources, so a sound evaluation framework is important. We discuss the iterative process of engaging with stakeholders, identifying evidence and data, developing analytic approaches, and producing a final model structure. We set out a series of steps in modelling impact, including: classification and definition of population subgroups of interest; identification and definition of harms and outcomes for inclusion; classification of modifiable components of risk and their baseline values; specification of the baseline position on policy variables, especially prices; estimation of the effects of changing policy variables on risk factors, including price elasticities; quantification of risk functions relating risk factors to harms, including 47 health conditions, crimes, absenteeism and unemployment; and monetary valuation. The most difficult model structuring decisions are described, as well as the final results framework used to provide decision support to national-level policy makers in the UK. In the discussion we explore issues around the relationship between modelling and policy debates, valuation and scope, and limitations of evidence and data, and consider how the framework can be adapted to other countries and decisions. We reflect on the approach taken and outline ongoing plans for further development.

  3. Tailored motivational message generation: A model and practical framework for real-time physical activity coaching.

    PubMed

    Op den Akker, Harm; Cabrita, Miriam; Op den Akker, Rieks; Jones, Valerie M; Hermens, Hermie J

    2015-06-01

    This paper presents a comprehensive and practical framework for automatic generation of real-time tailored messages in behavior change applications. Basic aspects of motivational messages are time, intention, content and presentation. Tailoring of messages to the individual user may involve all aspects of communication. A linear modular system is presented for generating such messages. It is explained how properties of user and context are taken into account in each of the modules of the system and how they affect the linguistic presentation of the generated messages. The model of motivational messages presented is based on an analysis of existing literature as well as the analysis of a corpus of motivational messages used in previous studies. The model extends existing 'ontology-based' approaches to message generation for real-time coaching systems found in the literature. Practical examples are given of how simple tailoring rules can be implemented throughout the various stages of the framework. Such examples can guide further research by clarifying what it means to use, e.g., user targeting to tailor a message. As a primary example we look at the issue of promoting daily physical activity. Future work includes applying the present model and framework, defining efficient ways of evaluating individual tailoring components, and improving effectiveness through the creation of accurate and complete user and context models.

  4. A Catchment-Based Land Surface Model for GCMs and the Framework for its Evaluation

    NASA Technical Reports Server (NTRS)

    Ducharne, A.; Koster, R. D.; Suarez, M. J.; Kumar, P.

    1998-01-01

    A new GCM-scale land surface modeling strategy that explicitly accounts for subgrid soil moisture variability and its effects on evaporation and runoff is now being explored. In a break from traditional modeling strategies, the continental surface is disaggregated into a mosaic of hydrological catchments, with boundaries dictated not by a regular grid but by topography. Within each catchment, the variability of soil moisture is deduced from TOPMODEL equations with a special treatment of the unsaturated zone. This paper gives an overview of this new approach and presents the general framework for its off-line evaluation over North America.

  5. Modeling somite scaling in small embryos in the framework of Turing patterns

    NASA Astrophysics Data System (ADS)

    Signon, Laurence; Nowakowski, Bogdan; Lemarchand, Annie

    2016-04-01

    The adaptation of prevertebra size to embryo size is investigated in the framework of a reaction-diffusion model involving a Turing pattern. The reaction scheme and Fick's first law of diffusion are modified in order to take into account the departure from dilute conditions induced by confinement in smaller embryos. In agreement with the experimental observations of scaling in somitogenesis, our model predicts the formation of smaller prevertebrae or somites in smaller embryos. These results suggest that models based on Turing patterns cannot be automatically disregarded by invoking the question of maintaining proportions in embryonic development. Our approach highlights the nontrivial role that the solvent can play in biology.
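    The Turing mechanism underlying such a model can be illustrated by linear stability analysis: a two-species reaction that is stable without diffusion becomes unstable over a band of wavenumbers when the inhibitor diffuses much faster, and the fastest-growing wavenumber sets the pattern (somite) size. The Jacobian and diffusivities below are generic illustrative values, not the paper's kinetics.

```python
# Dispersion relation for a generic activator-inhibitor Turing system:
# growth rate = largest eigenvalue of J - k^2 * diag(Du, Dv).
# Jacobian entries and diffusivities are illustrative assumptions.
import math

a, b, c, d = 1.0, -1.0, 3.0, -2.0   # trace < 0, det > 0: stable without diffusion
Du, Dv = 1.0, 40.0                  # inhibitor diffuses much faster than activator

def growth_rate(k):
    """Largest real part of the eigenvalues of J - k^2 diag(Du, Dv)."""
    a_k, d_k = a - Du * k * k, d - Dv * k * k
    tr, det = a_k + d_k, a_k * d_k - b * c
    disc = tr * tr - 4.0 * det
    if disc >= 0:
        return (tr + math.sqrt(disc)) / 2.0
    return tr / 2.0          # complex pair: real part is tr/2

ks = [i * 0.01 for i in range(1, 201)]
rates = [growth_rate(k) for k in ks]
k_star = ks[rates.index(max(rates))]   # fastest-growing wavenumber
```

    The homogeneous mode (`k = 0`) decays while a finite band of wavenumbers grows, which is the diffusion-driven instability the abstract's model relies on; `2*pi/k_star` would be the emerging pattern wavelength.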

  6. Disruptive innovation in health care delivery: a framework for business-model innovation.

    PubMed

    Hwang, Jason; Christensen, Clayton M

    2008-01-01

    Disruptive innovation has brought affordability and convenience to customers in a variety of industries. However, health care remains expensive and inaccessible to many because of the lack of business-model innovation. This paper explains the theory of disruptive innovation and describes how disruptive technologies must be matched with innovative business models. The authors present a framework for categorizing and developing business models in health care, followed by a discussion of some of the reasons why disruptive innovation in health care delivery has been slow.

  7. A framework for dealing with uncertainty due to model structure error

    NASA Astrophysics Data System (ADS)

    Refsgaard, Jens Christian; van der Sluijs, Jeroen P.; Brown, James; van der Keur, Peter

    2006-11-01

    Although uncertainty about structures of environmental models (conceptual uncertainty) is often acknowledged to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, may fail to adequately sample the relevant space of plausible conceptual models. As such, it is prone to modelling bias and underestimation of predictive uncertainty. In this paper we review a range of strategies for assessing structural uncertainties in models. The existing strategies fall into two categories depending on whether field data are available for the predicted variable of interest. To date, most research has focussed on situations where inferences on the accuracy of a model structure can be made directly on the basis of field data. This corresponds to a situation of 'interpolation'. However, in many cases environmental models are used for 'extrapolation'; that is, beyond the situation and the field data available for calibration. In the present paper, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. It involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.

  8. Predicting lymphatic filariasis transmission and elimination dynamics using a multi-model ensemble framework.

    PubMed

    Smith, Morgan E; Singh, Brajendra K; Irvine, Michael A; Stolk, Wilma A; Subramanian, Swaminathan; Hollingsworth, T Déirdre; Michael, Edwin

    2017-03-01

    Mathematical models of parasite transmission provide powerful tools for assessing the impacts of interventions. Owing to complexity and uncertainty, no single model may capture all features of transmission and elimination dynamics. Multi-model ensemble modelling offers a framework to help overcome biases of single models. We report on the development of a first multi-model ensemble of three lymphatic filariasis (LF) models (EPIFIL, LYMFASIM, and TRANSFIL), and evaluate its predictive performance in comparison with that of the constituents using calibration and validation data from three case study sites, one each from the three major LF endemic regions: Africa, Southeast Asia and Papua New Guinea (PNG). We assessed the performance of the respective models for predicting the outcomes of annual mass drug administration (MDA) strategies for various baseline scenarios thought to exemplify the current endemic conditions in the three regions. The results show that the constructed multi-model ensemble outperformed the single models when evaluated across all sites. Single models that best fitted calibration data tended to do less well in simulating the out-of-sample, or validation, intervention data. Scenario modelling results demonstrate that the multi-model ensemble is able to compensate for variance between single models in order to produce more plausible predictions of intervention impacts. Our results highlight the value of an ensemble approach to modelling parasite control dynamics. However, its optimal use will require further methodological improvements as well as consideration of the organizational mechanisms required to ensure that modelling results and data are shared effectively between all stakeholders.
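    One generic way to build such an ensemble, sketched below under stated assumptions, is to weight each constituent model by its inverse mean squared error on calibration data and combine out-of-sample predictions with those weights. The numbers are hypothetical, and the study's actual ensemble construction may differ.

```python
# Inverse-MSE weighted ensemble of model predictions (illustrative only;
# the prevalence numbers and weighting scheme are assumptions).
def inverse_mse_weights(preds_by_model, observed):
    """Weight each model by 1/MSE on calibration data, normalized to sum to 1."""
    inv = []
    for preds in preds_by_model:
        mse = sum((p - o) ** 2 for p, o in zip(preds, observed)) / len(observed)
        inv.append(1.0 / mse)
    s = sum(inv)
    return [w / s for w in inv]

def ensemble_predict(weights, preds_by_model):
    """Weighted average of each model's predictions, point by point."""
    return [sum(w * preds[i] for w, preds in zip(weights, preds_by_model))
            for i in range(len(preds_by_model[0]))]

# hypothetical calibration-phase prevalence trajectories from three models
observed = [0.30, 0.25, 0.18, 0.12]
model_a  = [0.32, 0.24, 0.19, 0.10]
model_b  = [0.40, 0.33, 0.25, 0.20]
model_c  = [0.28, 0.26, 0.17, 0.13]
w = inverse_mse_weights([model_a, model_b, model_c], observed)
forecast = ensemble_predict(w, [[0.08], [0.15], [0.09]])  # validation-phase combine
```

    Models that fit calibration data poorly get downweighted automatically, which is one mechanism by which an ensemble can compensate for between-model variance.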

  9. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.
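    The TASEP model that the abstract uses as its simulation benchmark can be sketched as a simple stochastic lattice: ribosomes initiate at one end, hop forward when the next site is free, and terminate at the other end. Rates, lattice length, and step count below are illustrative assumptions.

```python
# Minimal TASEP (totally asymmetric simple exclusion process) benchmark for
# mRNA translation: random sequential update over L+1 "bonds".
# alpha = initiation rate, beta = termination rate, internal hops at rate 1.
# All parameter values are illustrative assumptions.
import random

def tasep(L=50, alpha=0.3, beta=0.9, steps=200000, seed=1):
    random.seed(seed)
    lattice = [0] * L          # 1 = codon occupied by a ribosome
    completed = 0
    for _ in range(steps):
        i = random.randrange(L + 1)            # pick a bond uniformly at random
        if i == 0:                             # initiation at the 5' end
            if lattice[0] == 0 and random.random() < alpha:
                lattice[0] = 1
        elif i == L:                           # termination at the 3' end
            if lattice[-1] == 1 and random.random() < beta:
                lattice[-1] = 0
                completed += 1                 # one protein synthesized
        else:                                  # elongation (exclusion rule)
            if lattice[i - 1] == 1 and lattice[i] == 0:
                lattice[i - 1], lattice[i] = 0, 1
    return completed / steps   # protein output per attempted move

rate = tasep()
```

    A Probabilistic Boolean formulation, by contrast, lets the steady-state distribution be computed exactly rather than estimated from long stochastic runs like this one.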

  10. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota.

    PubMed

    Kail, Jochem; Guse, Björn; Radinger, Johannes; Schröder, Maria; Kiesel, Jens; Kleinhans, Maarten; Schuurman, Filip; Fohrer, Nicola; Hering, Daniel; Wolter, Christian

    2015-01-01

    River biota are affected by global reach-scale pressures, but most approaches for predicting biota of rivers focus on river reach or segment scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and to finally compare habitat suitability and the availability/ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates. 
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact research as well as

  11. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota

    PubMed Central

    Kail, Jochem; Guse, Björn; Radinger, Johannes; Schröder, Maria; Kiesel, Jens; Kleinhans, Maarten; Schuurman, Filip; Fohrer, Nicola; Hering, Daniel; Wolter, Christian

    2015-01-01

    River biota are affected by global reach-scale pressures, but most approaches for predicting biota of rivers focus on river reach or segment scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and to finally compare habitat suitability and the availability / ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates. 
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact research as well

  12. A New Open Data Open Modeling Framework for the Geosciences Community (Invited)

    NASA Astrophysics Data System (ADS)

    Liang, X.; Salas, D.; Navarro, M.; Liang, Y.; Teng, W. L.; Hooper, R. P.; Restrepo, P. J.; Bales, J. D.

    2013-12-01

    A prototype Open Hydrospheric Modeling Framework (OHMF), also called Open Data Open Modeling framework, has been developed to address two key modeling challenges faced by the broad research community: (1) accessing external data from diverse sources and (2) execution, coupling, and evaluation/intercomparison of various and complex models. The former is achieved via the Open Data architecture, while the latter is achieved via the Open Modeling architecture. The Open Data architecture adopts a common internal data model and representation, to facilitate the integration of various external data sources into OHMF, using Data Agents that handle remote data access protocols (e.g., OPeNDAP, Web services), metadata standards, and source-specific implementations. These Data Agents hide the heterogeneity of the external data sources and provide a common interface to the OHMF system core. The Open Modeling architecture allows different models or modules to be easily integrated into OHMF. The OHMF architectural design offers a general many-to-many connectivity between individual models and external data sources, instead of one-to-one connectivity from data access to model simulation results. OHMF adopts a graphical scientific workflow, offers tools to re-scale in space and time, and provides multi-scale data fusion and assimilation functionality. Notably, the OHMF system employs a strategy that does not require re-compiling or adding interface codes for a user's model to be integrated. Thus, a corresponding model agent can be easily developed by a user. Once an agent is available for a model, it can be shared and used by others. An example will be presented to illustrate the prototype OHMF system and the automatic flow from accessing data to model simulation results in a user-friendly workflow-controlled environment.
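    The Data Agent idea described above, one common interface hiding source-specific access protocols behind a shared internal representation, can be sketched as follows. The class and method names are hypothetical illustrations, not the OHMF API.

```python
# Hypothetical sketch of the Data Agent pattern: every agent exposes the same
# fetch() interface and returns data in a common internal representation,
# regardless of the remote protocol (OPeNDAP, Web services, ...).
from abc import ABC, abstractmethod

class DataAgent(ABC):
    """Common interface the framework core programs against."""
    @abstractmethod
    def fetch(self, variable, start, end):
        """Return [(time, value), ...] in the common internal data model."""

class InMemoryAgent(DataAgent):
    """Stand-in for a source-specific agent (e.g., one wrapping OPeNDAP)."""
    def __init__(self, table):
        self.table = table     # rows of (variable, time, value)

    def fetch(self, variable, start, end):
        return [(t, v) for (var, t, v) in self.table
                if var == variable and start <= t <= end]

agent: DataAgent = InMemoryAgent([("flow", 1, 2.3), ("flow", 2, 2.6),
                                  ("stage", 1, 0.8)])
series = agent.fetch("flow", 1, 2)
```

    Because models consume only the common interface, new sources plug in by writing one new agent, which is what gives the many-to-many connectivity the abstract describes.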

  13. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    PubMed

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions.
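
As a toy illustration of the two-objective step (minimize BMP cost, maximize TN load removal), the sketch below enumerates small BMP portfolios and keeps the non-dominated ones. All site names, costs, and removal figures are invented, and the study's actual optimizer is a multi-objective optimization algorithm over NHDPlus catchments rather than brute-force enumeration:

```python
from itertools import combinations

# Invented candidate BMP sites: name -> (cost in k$, TN removal in t/yr).
sites = {"wetland_A": (120.0, 8.0), "wetland_B": (200.0, 11.0),
         "buffer_C": (60.0, 6.5), "buffer_D": (90.0, 9.0)}

# Enumerate every non-empty portfolio with its total cost and TN removal.
portfolios = []
for r in range(1, len(sites) + 1):
    for combo in combinations(sorted(sites), r):
        cost = sum(sites[s][0] for s in combo)
        removal = sum(sites[s][1] for s in combo)
        portfolios.append((combo, cost, removal))

def dominated(c, others):
    # c is dominated if some portfolio is no worse on both objectives
    # and strictly better on at least one.
    return any(o[1] <= c[1] and o[2] >= c[2] and (o[1] < c[1] or o[2] > c[2])
               for o in others)

# The Pareto front: the cost/removal trade-offs offered to the planner.
front = [p for p in portfolios if not dominated(p, portfolios)]
```

The front is the set of trade-off solutions the abstract refers to; a planner then picks the cheapest portfolio meeting a target such as a 20% TN load reduction.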

  14. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGES

    François, Marianne M.

    2015-05-28

A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties, and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values, and the balanced-force algorithm for surface tension are highlighted.

  15. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System-of-systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework, based on a state-, model-, and goal-based architecture for semi-autonomous control systems, that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  16. Frameworks and models--Scaffolding or strait jackets? Problematising reflective practice.

    PubMed

    Kelsey, Catherine; Hayes, Sally

    2015-11-01

This paper aims to open a debate about the impact of reflective practice, questioning whether reflective frameworks and models, argued to facilitate the education of highly skilled reflective practitioners, can be oppressive rather than emancipatory in outcome. Contemporary education focuses on evidence-based and effective practice, with reflection at its core, leading to empowerment and ultimately the emancipation of the profession as independent of, and equal to, medics and other health care professionals. Models and frameworks have therefore been developed to facilitate the education of highly skilled reflective practitioners, able to recognise the need to draw on evidence-based practice in order to challenge outdated methods and engage in new ways of working. This paper, however, questions the current focus on reflective practice, suggesting that reflection in itself can be oppressive and can support the commodification of nursing as a 'workforce', the profession at the beck and call of current governmental policy and control.

  17. A unified theoretical framework for mapping models for the multi-state Hamiltonian

    NASA Astrophysics Data System (ADS)

    Liu, Jian

    2016-11-01

We propose a new unified theoretical framework to construct equivalent representations of the multi-state Hamiltonian operator and present several approaches for the mapping onto the Cartesian phase space. After mapping an F-dimensional Hamiltonian onto an F+1 dimensional space, creation and annihilation operators are defined such that the F+1 dimensional space is complete for any combined excitation. Commutation and anti-commutation relations are then naturally derived, which show that the underlying degrees of freedom are neither bosons nor fermions. This sets the scene for developing equivalent expressions of the Hamiltonian operator in quantum mechanics and their classical/semiclassical counterparts. Six mapping models are presented as examples. The framework also offers a novel way to derive mapping models such as the well-known Meyer-Miller model.
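
For background, the classical Meyer-Miller mapping Hamiltonian mentioned above is commonly written, in one standard convention, as follows. This is textbook material, not the paper's new derivation, and the zero-point parameter convention varies between authors:

```latex
% One common form of the Meyer-Miller mapping Hamiltonian for an
% F-state electronic Hamiltonian H_{nm}(R); each electronic state n is
% mapped to a harmonic-oscillator pair (x_n, p_n), and gamma is the
% zero-point parameter (often taken as 1/2).
H_{\mathrm{MM}} = \frac{P^2}{2M}
  + \sum_{n,m=1}^{F} \left[ \tfrac{1}{2}\left(x_n x_m + p_n p_m\right)
  - \gamma\,\delta_{nm} \right] H_{nm}(R)
```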

  18. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    NASA Astrophysics Data System (ADS)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data come from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms, SNPs), and mammographic features in the Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + SNP, and (3) Gail + SNP + BI-RADS. Then we generated ROC curves for the three models. After assigning utility values to each category of findings (true negative, false positive, false negative, and true positive), we pursued optimal operating points on the ROC curves to achieve the maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under the ROC curve (AUC) and MEU. SNPs improved the sensitivity of the Gail model (0.276 vs. 0.147) and reduced its specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our framework comprising utility analysis and McNemar's test provides a novel way to evaluate prediction models in the realm of radiogenomics.
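
The maximum-expected-utility operating point described above can be sketched as a threshold scan over a ROC curve. The scores, prevalence, and utility values below are invented for illustration, not the study's data; the principle is that expected utility sums P(outcome) x U(outcome) over the four finding categories at each threshold:

```python
# Invented classifier scores for a toy case-control sample.
cases    = [0.9, 0.8, 0.7, 0.4]          # scores, cancer present
controls = [0.6, 0.5, 0.3, 0.2, 0.1]     # scores, cancer absent
prev = len(cases) / (len(cases) + len(controls))

# Assumed utilities: a missed cancer (FN) is penalized most heavily.
U = {"TP": 1.0, "FN": -4.0, "FP": -0.5, "TN": 0.0}

def expected_utility(thr):
    sens = sum(s >= thr for s in cases) / len(cases)
    spec = sum(s < thr for s in controls) / len(controls)
    return (prev * (sens * U["TP"] + (1 - sens) * U["FN"])
            + (1 - prev) * (spec * U["TN"] + (1 - spec) * U["FP"]))

# Scan candidate operating points (the observed scores) for the MEU threshold.
thresholds = sorted(set(cases + controls))
best = max(thresholds, key=expected_utility)
```

With the heavy FN penalty assumed here, the MEU threshold lands low (0.4), trading extra false positives for full sensitivity, which is the kind of utility-driven operating-point shift the abstract describes.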

  19. A Data-Driven Framework for Rapid Modeling of Wireless Communication Channels

    DTIC Science & Technology

    2013-12-01

with respect to other modeling techniques such as Support Vector Machines, Random Forests, and Boosted Trees. ... accuracy and may require a lengthy period of environmental assessment and computation. This dissertation presents a new, data-driven, stochastic framework for rapidly building accurate wireless connectivity

  20. A Framework for Organizing Current and Future Electric Utility Regulatory and Business Models

    SciTech Connect

    Satchwell, Andrew; Cappers, Peter; Schwartz, Lisa; Fadrhonc, Emily Martin

    2015-06-01

    In this report, we will present a descriptive and organizational framework for incremental and fundamental changes to regulatory and utility business models in the context of clean energy public policy goals. We will also discuss the regulated utility's role in providing value-added services that relate to distributed energy resources, identify the "openness" of customer information and utility networks necessary to facilitate change, and discuss the relative risks, and the shifting of risks, for utilities and customers.

  1. A Subbasin-based framework to represent land surface processes in an Earth System Model

    SciTech Connect

    Tesfa, Teklu K.; Li, Hongyi; Leung, Lai-Yung R.; Huang, Maoyi; Ke, Yinghai; Sun, Yu; Liu, Ying

    2014-05-20

Realistically representing spatial heterogeneity and lateral land surface processes within and between modeling units in earth system models is important because of their implications for surface energy and water exchange. The traditional approach of using regular grids as computational units in land surface models and earth system models may lead to inadequate representation of lateral movements of water, energy, and carbon fluxes, especially as the grid resolution increases. Here a new subbasin-based framework is introduced in the Community Land Model (CLM), the land component of the Community Earth System Model (CESM). Local processes are represented by treating each subbasin as a grid cell on a pseudo grid matrix, with no significant modifications to the existing CLM modeling structure. Lateral routing of water within and between subbasins is simulated with the subbasin version of a recently developed, physically based routing model, the Model for Scale Adaptive River Routing (MOSART). As an illustration, this new framework is implemented in the topographically diverse region of the U.S. Pacific Northwest. The modeling units (subbasins) are delineated from a high-resolution Digital Elevation Model (DEM), while atmospheric forcing and surface parameters are remapped from the corresponding high-resolution datasets. The impacts of this representation on simulating hydrologic processes are explored by comparing it with the default (grid-based) CLM representation. In addition, the effects of DEM resolution on parameterizing topography, and the subsequent effects on runoff processes, are investigated. Limited model evaluation and comparison showed that small differences between the averaged forcings can lead to more significant differences in the simulated runoff and streamflow because of nonlinear horizontal processes. Topographic indices derived from a high-resolution DEM may not improve the overall water balance, but they affect the partitioning between surface and subsurface runoff.
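
The remapping step mentioned above (gridded forcing onto irregular subbasins) is typically an area-weighted average of the overlapping high-resolution cells. The sketch below illustrates that idea only; the cell values, overlap fractions, and names are invented, not from the CLM/MOSART implementation:

```python
# Invented high-resolution forcing cells (mm/day of precipitation).
cell_precip = {"c1": 5.0, "c2": 7.0, "c3": 3.0}

# Invented fractional overlaps of each subbasin with the forcing cells.
overlap = {"subbasin_A": {"c1": 0.6, "c2": 0.4},
           "subbasin_B": {"c2": 0.3, "c3": 0.7}}

def remap(values, weights):
    # Area-weighted mean of overlapping cells for each subbasin.
    return {sb: sum(values[c] * w for c, w in ws.items()) / sum(ws.values())
            for sb, ws in weights.items()}

forcing = remap(cell_precip, overlap)
```

Each subbasin then receives a single forcing value, consistent with treating it as one cell on the pseudo grid matrix.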

  2. A dynamic modelling framework towards the solution of reduction in smoking prevalence

    NASA Astrophysics Data System (ADS)

    Halim, Tisya Farida Abdul; Sapiri, Hasimah; Abidin, Norhaslinda Zainal

    2016-10-01

This paper presents a hypothetical framework towards a solution for the reduction in smoking prevalence in Malaysia. The framework is designed to assist the decision-making process related to reduction in smoking prevalence using SD and OCT. In general, this framework is developed using the SD approach, with OCT embedded in the policy evaluation process. Smoking prevalence is one of the determinants that plays an important role in measuring the successful implementation of anti-smoking strategies. Therefore, it is critical to determine the optimal value of smoking prevalence in order to trim down the hazardous effects of smoking on society. Conversely, the smoking problem becomes increasingly complex since many issues, ranging from behavioral to economic, need to be considered simultaneously. Thus, a hypothetical framework of the control model embedded in the SD methodology is expected to obtain the minimum value of smoking prevalence, and the output in turn will provide a guideline for tobacco researchers as well as decision makers for policy design and evaluation.

  3. Overview of the Special Issue: A Multi-Model Framework to ...

    EPA Pesticide Factsheets

The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impacts and damages in the United States are avoided or reduced due to global greenhouse gas (GHG) emissions mitigation scenarios. Scenarios are designed to explore key uncertainties around the measurement of these changes. The modeling exercise presented in this Special Issue includes two integrated assessment models and 15 sectoral models encompassing six broad impact sectors: water resources, electric power, infrastructure, human health, ecosystems, and forests. Three consistent emissions scenarios are used to analyze the benefits of global GHG mitigation targets: a reference and two policy scenarios, with total radiative forcing in 2100 of 10.0 W/m2, 4.5 W/m2, and 3.7 W/m2. A range of climate sensitivities, climate models, natural variability measures, and structural uncertainties of sectoral models are examined to explore the implications of key uncertainties. This overview paper describes the motivations, goals, design, and academic contribution of the CIRA modeling exercise and briefly summarizes the subsequent papers in this Special Issue. A summary of results across impact sectors is provided showing that: GHG mitigation provides benefits to the United States that increase over

  4. A stochastic hybrid systems based framework for modeling dependent failure processes.

    PubMed

    Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying

    2017-01-01

    In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods.

  5. A stochastic hybrid systems based framework for modeling dependent failure processes

    PubMed Central

    Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying

    2017-01-01

    In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system discrete states are triggered by random shocks. The modeling is, then, based on Stochastic Hybrid Systems (SHS), whose state space is comprised of a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations are derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from literature and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability with less computational costs compared to traditional Monte Carlo-based methods. PMID:28231313
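
The two reliability estimates named in the abstract above (FOSM from the conditional moments, and a distribution-free lower bound from the Markov inequality) can be sketched for a single scalar degradation variable. This is a hedged illustration of the two estimators only, with invented moment values, not the paper's actual SHS moment equations:

```python
import math

def fosm_reliability(mean, var, L):
    # First Order Second Moment: treat the degradation X as Gaussian,
    # so R = P(X < L) is approximated by Phi((L - mean)/sigma).
    z = (L - mean) / math.sqrt(var)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def markov_lower_bound(mean, L):
    # For nonnegative X, Markov's inequality gives P(X >= L) <= E[X]/L,
    # hence the distribution-free bound R >= 1 - E[X]/L.
    return 1.0 - mean / L

# Invented conditional moments of degradation at some time t, and a
# failure threshold L.
R_fosm = fosm_reliability(mean=2.0, var=0.25, L=3.0)
R_lb = markov_lower_bound(mean=2.0, L=3.0)
```

As expected, the Markov bound is much looser than the FOSM point estimate; its value is that it holds regardless of the degradation distribution.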

  6. A Bioinformatics Reference Model: Towards a Framework for Developing and Organising Bioinformatic Resources

    NASA Astrophysics Data System (ADS)

    Hiew, Hong Liang; Bellgard, Matthew

    2007-11-01

Life Science research faces the constant challenge of how to effectively handle an ever-growing body of bioinformatics software and online resources. The users and developers of bioinformatics resources have a diverse set of competing demands on how these resources need to be developed and organised. Unfortunately, there does not exist an adequate community-wide framework to integrate such competing demands. The problems that arise from this include unstructured standards development, the emergence of tools that do not meet the specific needs of researchers, and often a communications gap between those who use the tools and those who supply them. This paper presents an overview of the different functions and needs of bioinformatics stakeholders to determine what may be required in a community-wide framework. A Bioinformatics Reference Model is proposed as a basis for such a framework. The reference model outlines the functional relationship between research usage and technical aspects of bioinformatics resources. It separates important functions into multiple structured layers, clarifies how they relate to each other, and highlights the gaps that need to be addressed for progress towards a diverse, manageable, and sustainable body of resources. The relevance of this reference model to the bioscience research community, and its implications for progress in organising our bioinformatics resources, are discussed.

  7. A predictive framework for evaluating models of semantic organization in free recall

    PubMed Central

    Morton, Neal W; Polyn, Sean M.

    2016-01-01

Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure: latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS). We find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization are driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search.
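
The likelihood-based comparison idea can be illustrated in miniature: each candidate semantic representation assigns a probability to the next recalled item (here via a softmax over similarity to the item just recalled), and the representation with the higher summed log-likelihood of the observed recall sequence wins. This is a deliberately stripped-down stand-in, not CMR itself, and the similarity values are made up:

```python
import math

def seq_loglik(sim, sequence):
    # Log-likelihood of a recall sequence under softmax transition
    # probabilities derived from a pairwise similarity table.
    ll = 0.0
    for prev, nxt in zip(sequence, sequence[1:]):
        weights = {j: math.exp(sim[prev][j]) for j in sim if j != prev}
        ll += math.log(weights[nxt] / sum(weights.values()))
    return ll

# Two toy "semantic spaces" over items 0..2: A links items 0 and 1
# strongly, B treats all pairs as equally (weakly) related.
sim_A = {0: {0: 0.0, 1: 2.0, 2: 0.1},
         1: {0: 2.0, 1: 0.0, 2: 0.1},
         2: {0: 0.1, 1: 0.1, 2: 0.0}}
sim_B = {0: {0: 0.0, 1: 0.1, 2: 0.1},
         1: {0: 0.1, 1: 0.0, 2: 0.1},
         2: {0: 0.1, 1: 0.1, 2: 0.0}}

observed = [0, 1, 0]  # a recall sequence that alternates the linked pair
better = "A" if seq_loglik(sim_A, observed) > seq_loglik(sim_B, observed) else "B"
```

Space A predicts the observed back-and-forth transitions far better, so it earns the higher log-likelihood, mirroring how WAS outperformed LSA and GloVe in the study.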

  8. A Framework for Quantitative Modeling of Neural Circuits Involved in Sleep-to-Wake Transition

    PubMed Central

    Sorooshyari, Siamak; Huerta, Ramón; de Lecea, Luis

    2015-01-01

    Identifying the neuronal circuits and dynamics of sleep-to-wake transition is essential to understanding brain regulation of behavioral states, including sleep–wake cycles, arousal, and hyperarousal. Recent work by different laboratories has used optogenetics to determine the role of individual neuromodulators in state transitions. The optogenetically driven data do not yet provide a multi-dimensional schematic of the mechanisms underlying changes in vigilance states. This work presents a modeling framework to interpret, assist, and drive research on the sleep-regulatory network. We identify feedback, redundancy, and gating hierarchy as three fundamental aspects of this model. The presented model is expected to expand as additional data on the contribution of each transmitter to a vigilance state becomes available. Incorporation of conductance-based models of neuronal ensembles into this model and existing models of cortical excitability will provide more comprehensive insight into sleep dynamics as well as sleep and arousal-related disorders. PMID:25767461

  9. ERUPTION TO DOSE: COUPLING A TEPHRA DISPERSAL MODEL WITHIN A PERFORMANCE ASSESSMENT FRAMEWORK

    SciTech Connect

    G. N. Keating, J. Pelletier

    2005-08-26

The tephra dispersal model used by the Yucca Mountain Project (YMP) to evaluate the potential consequences of a volcanic eruption through the waste repository must incorporate simplifications in order to function within a large Monte Carlo-style performance assessment framework. That is, the explicit physics of the conduit, vent, and eruption column processes are abstracted to a 2-D, steady-state advection-dispersion model (ASHPLUME) that can be run quickly over thousands of realizations of the overall system model. Given the continuous development of tephra dispersal modeling techniques in the last few years, we evaluated the adequacy of this simplified model for its intended purpose within the YMP total system performance assessment (TSPA) model. We evaluated uncertainties inherent in the model simplifications, including (1) instantaneous, steady-state vs. unsteady eruption, which affects column height, (2) constant wind conditions, and (3) a power-law distribution of the tephra blanket; comparisons were made to other models and published ash distributions. Spatial statistics are useful for evaluating differences in this model's output vs. results using more complex wind, column height, and tephra deposition patterns. However, in order to assess the adequacy of the model for its intended use in TSPA, we evaluated the propagation of these uncertainties through FAR, the YMP ash redistribution model, which utilizes ASHPLUME tephra deposition results to calculate the concentration of nuclear-waste-contaminated tephra at a dose-receptor population as a result of sedimentary transport and mixing processes on the landscape. Questions we sought to answer include: (1) What conditions of unsteadiness, wind variability, or departure from the simplified tephra distribution result in significant effects on waste concentration (related to the dose calculated for the receptor population)? (2) What criteria can be established for the adequacy of a tephra dispersal model within the TSPA

  10. Pursuing realistic hydrologic model under SUPERFLEX framework in a semi-humid catchment in China

    NASA Astrophysics Data System (ADS)

    Wei, Lingna; Savenije, Hubert H. G.; Gao, Hongkai; Chen, Xi

    2016-04-01

Model realism is pursued perpetually by hydrologists for flood and drought prediction, integrated water resources management, and decision support for water security. "Physically based" distributed hydrologic models are being developed rapidly, but they also face considerable challenges, for instance low computational efficiency and parameter uncertainty. This study step-wise tested four conceptual hydrologic models under the SUPERFLEX framework in a small semi-humid catchment in the southern Huai River basin of China. The original lumped FLEXL hypothesizes a model structure of four reservoirs to represent canopy interception, the unsaturated zone, subsurface flow with fast and slow components, and base flow storage. Considering the uneven rainfall in space, the second model (FLEXD) is developed with the same parameter set for different rain-gauge-controlled units. To reveal the effect of topography, the terrain descriptor height above the nearest drainage (HAND), combined with slope, is applied to classify the experimental catchment into two landscapes. The third model (FLEXTOPO) then builds different model blocks in consideration of the dominant hydrologic process corresponding to the topographical condition. The fourth, named FLEXTOPOD, integrates the parallel framework of FLEXTOPO in four controlled units and is designed to interpret the spatial variability of rainfall patterns and topographic features. Through pairwise comparison, our results suggest that: (1) semi-distributed models (FLEXD and FLEXTOPOD) taking precipitation spatial heterogeneity into account improve model performance with a parsimonious parameter set, and (2) a hydrologic model architecture with the flexibility to reflect perceived dominant hydrologic processes can include the local terrain circumstances for each landscape. Hence, the modeling actions coincide with the catchment behaviour and are close to "reality". The presented methodology regards the hydrologic model as a tool to test our
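
A FLEX-type conceptual structure of the kind tested above can be sketched, in a much-simplified form, as coupled storage reservoirs stepped forward in time. The real FLEXL has four reservoirs and different constitutive functions; the two-store version below, with invented parameter values, only illustrates the reservoir-cascade idea and its water-balance bookkeeping:

```python
def step(su, ss, p, pet, s_max=100.0, beta=2.0, kp=0.1, ks=0.05):
    # su: unsaturated-zone storage, ss: slow-reservoir storage (both mm).
    direct = p * (su / s_max) ** beta      # saturation-excess fast runoff
    su += p - direct                       # remaining rain infiltrates
    perc = kp * su                         # percolation to slow reservoir
    su -= perc
    et = min(su, pet * su / s_max)         # moisture-limited evaporation
    su -= et
    ss += perc
    qs = ks * ss                           # linear-reservoir baseflow
    ss -= qs
    return su, ss, direct + qs, et

# Run three invented daily forcings (precipitation, potential ET, in mm).
su, ss = 40.0, 20.0
q_total = et_total = 0.0
for p, pet in [(10.0, 2.0), (0.0, 3.0), (25.0, 1.0)]:
    su, ss, q, et = step(su, ss, p, pet)
    q_total += q
    et_total += et
```

By construction the step conserves water, so initial storage plus rainfall equals final storage plus discharge plus evaporation, which is the kind of behaviour-based check such conceptual models are built around.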

  11. An ice sheet model validation framework for the Greenland ice sheet

    DOE PAGES

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; ...

    2017-01-17

We propose a new ice sheet model validation framework, the Cryospheric Model Comparison Tool (CMCT), that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013 using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin- and whole-ice-sheet-scale metrics, the model initial condition as well as output from idealized and dynamic models all provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CMCT, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate predictive skill with respect to observed dynamic changes occurring on Greenland over the past few

  12. An ice sheet model validation framework for the Greenland ice sheet

    NASA Astrophysics Data System (ADS)

    Price, Stephen F.; Hoffman, Matthew J.; Bonin, Jennifer A.; Howat, Ian M.; Neumann, Thomas; Saba, Jack; Tezaur, Irina; Guerber, Jeffrey; Chambers, Don P.; Evans, Katherine J.; Kennedy, Joseph H.; Lenaerts, Jan; Lipscomb, William H.; Perego, Mauro; Salinger, Andrew G.; Tuminaro, Raymond S.; van den Broeke, Michiel R.; Nowicki, Sophie M. J.

    2017-01-01

    We propose a new ice sheet model validation framework - the Cryospheric Model Comparison Tool (CmCt) - that takes advantage of ice sheet altimetry and gravimetry observations collected over the past several decades and is applied here to modeling of the Greenland ice sheet. We use realistic simulations performed with the Community Ice Sheet Model (CISM) along with two idealized, non-dynamic models to demonstrate the framework and its use. Dynamic simulations with CISM are forced from 1991 to 2013, using combinations of reanalysis-based surface mass balance and observations of outlet glacier flux change. We propose and demonstrate qualitative and quantitative metrics for use in evaluating the different model simulations against the observations. We find that the altimetry observations used here are largely ambiguous in terms of their ability to distinguish one simulation from another. Based on basin-scale and whole-ice-sheet-scale metrics, we find that simulations using both idealized conceptual models and dynamic, numerical models provide an equally reasonable representation of the ice sheet surface (mean elevation differences of < 1 m). This is likely due to their short period of record, biases inherent to digital elevation models used for model initial conditions, and biases resulting from firn dynamics, which are not explicitly accounted for in the models or observations. On the other hand, we find that the gravimetry observations used here are able to unambiguously distinguish between simulations of varying complexity, and along with the CmCt, can provide a quantitative score for assessing a particular model and/or simulation. The new framework demonstrates that our proposed metrics can distinguish relatively better from relatively worse simulations and that dynamic ice sheet models, when appropriately initialized and forced with the right boundary conditions, demonstrate a predictive skill with respect to observed dynamic changes that have occurred on
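
The simplest metric quoted in both records above, the mean elevation difference between modeled and observed ice sheet surfaces on a common grid, can be sketched directly (along with an RMS companion). The tiny grids below are invented placeholders, not real DEM or altimetry data, and the actual CmCt scoring is more involved:

```python
# Invented 2x2 surface-elevation grids (m) on a common mesh.
model_dem = [[1200.0, 1305.0], [1410.0, 1502.0]]
obs_dem   = [[1201.0, 1304.0], [1409.5, 1503.0]]

# Per-cell model-minus-observation differences.
diffs = [m - o for mrow, orow in zip(model_dem, obs_dem)
               for m, o in zip(mrow, orow)]

mean_diff = sum(diffs) / len(diffs)                      # bias
rms_diff = (sum(d * d for d in diffs) / len(diffs)) ** 0.5  # spread
print(round(mean_diff, 3), round(rms_diff, 3))  # -> -0.125 0.901
```

A mean difference well under 1 m, as here, is exactly the regime in which the paper reports that altimetry alone cannot separate good simulations from idealized ones.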

  13. Multi-object segmentation framework using deformable models for medical imaging analysis.

    PubMed

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures in medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract only one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing selection of a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  14. A hydro-economic modelling framework for optimal management of groundwater nitrate pollution from agriculture

    NASA Astrophysics Data System (ADS)

    Peña-Haro, Salvador; Pulido-Velazquez, Manuel; Sahuquillo, Andrés

    2009-06-01

    Summary: A hydro-economic modelling framework is developed for determining optimal management of groundwater nitrate pollution from agriculture. A holistic optimization model determines the spatial and temporal fertilizer application rates that maximize the net benefits in agriculture, constrained by the quality requirements in groundwater at various control sites. Since emissions (nitrogen loading rates) are what can be controlled, while concentrations are the policy targets, the two need to be related. Agronomic simulations are used to obtain the nitrate leached, while numerical groundwater flow and solute transport simulation models were used to develop unit source solutions that were assembled into a pollutant concentration response matrix. The integration of the response matrix in the constraints of the management model allows simulating, by superposition, the evolution of groundwater nitrate concentration over time at different points of interest throughout the aquifer resulting from multiple pollutant sources distributed over time and space. In this way, the modelling framework relates the fertilizer loads to the nitrate concentrations at the control sites. The benefits in agriculture were determined through crop prices and crop production functions. This research aims to contribute to the ongoing policy process in the European Union (the Water Framework Directive) by providing a tool for analyzing the opportunity cost of measures for reducing nitrogen loadings and for assessing their effectiveness in maintaining groundwater nitrate concentration within the target levels. The management model was applied to a hypothetical groundwater system. Optimal fertilizer-use solutions for problems with different initial conditions, planning horizons, and recovery times were determined.
The illustrative example shows the importance of the location of the pollution sources in relation to the control sites, and how both the selected planning horizon and the target recovery time can
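The response-matrix idea described above can be sketched in a few lines: concentration at each control site is a linear superposition of unit-source responses scaled by the nitrogen loads, so any load pattern can be screened against a groundwater standard and the best feasible pattern selected. All numbers below (response coefficients, per-unit benefits, the concentration limit) are invented for illustration, and a crude grid search stands in for the paper's actual optimization model.

```python
# response matrix R[site][source]: nitrate concentration (mg/L) at a control
# site per unit nitrogen load (kg/ha) at a source area -- invented values
R = [[0.08, 0.02],
     [0.03, 0.06]]

net_benefit = [120.0, 90.0]   # assumed linear net benefit per unit load
c_max = 11.3                  # illustrative concentration limit at all sites

def concentrations(loads):
    """Superposition: c_i = sum_j R[i][j] * m_j."""
    return [sum(r * m for r, m in zip(row, loads)) for row in R]

def feasible(loads):
    return all(c <= c_max for c in concentrations(loads))

# crude grid search standing in for the holistic optimization model
best, best_loads = -1.0, None
for m1 in range(0, 201):
    for m2 in range(0, 201):
        loads = (float(m1), float(m2))
        if feasible(loads):
            b = sum(p * m for p, m in zip(net_benefit, loads))
            if b > best:
                best, best_loads = b, loads

print(best_loads, round(best, 1))
```

In the paper's framework the constraint set would also be indexed over time steps and the search replaced by a proper optimization solver, but the superposition structure is the same.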

  15. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (SNOW17) and rainfall-runoff model (SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system: input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (RFC and DREAM (Differential Evolution Adaptive Metropolis)) as well as for a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe their influence on the forecast. The study basin is the North Fork American River Basin (NFARB), located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes a comparison of the performance of the different optimized parameter sets and the DA framework, as well as an assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.
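Two of the deterministic scores named above are simple to state. A minimal sketch with made-up observed and simulated flows (not the study's data):

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; <= 0 means no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / svar

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

obs = [10.0, 12.0, 15.0, 11.0, 9.0]   # invented observed flows
sim = [9.5, 12.5, 14.0, 11.5, 9.5]    # invented simulated flows

print(round(nse(obs, sim), 3), round(rmse(obs, sim), 3))  # → 0.906 0.632
```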

  16. A Linear Mixed Model Spline Framework for Analysing Time Course ‘Omics’ Data

    PubMed Central

    Straube, Jasmin; Gorse, Alain-Dominique

    2015-01-01

    Time course ‘omics’ experiments are becoming increasingly important to study system-wide dynamic regulation. Despite their high information content, analysis remains challenging. ‘Omics’ technologies capture quantitative measurements on tens of thousands of molecules. Therefore, in a time course ‘omics’ experiment molecules are measured for multiple subjects over multiple time points. This results in a large, high-dimensional dataset, which requires computationally efficient approaches for statistical analysis. Moreover, methods need to be able to handle missing values and various levels of noise. We present a novel, robust and powerful framework to analyze time course ‘omics’ data that consists of three stages: quality assessment and filtering, profile modelling, and analysis. The first step consists of removing molecules for which expression or abundance is highly variable over time. The second step models each molecular expression profile in a linear mixed model framework which takes into account subject-specific variability. The best model is selected through a serial model selection approach and results in dimension reduction of the time course data. The final step includes two types of analysis of the modelled trajectories, namely, clustering analysis to identify groups of correlated profiles over time, and differential expression analysis to identify profiles which differ over time and/or between treatment groups. Through simulation studies we demonstrate the high sensitivity and specificity of our approach for differential expression analysis. We then illustrate how our framework can bring novel insights on two time course ‘omics’ studies in breast cancer and kidney rejection. The methods are publicly available, implemented in the R CRAN package lmms. PMID:26313144

  17. A versatile platform for multilevel modeling of physiological systems: template/instance framework for large-scale modeling and simulation.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Oka, Hideki; Okita, Masao; Okuyama, Tomohiro; Hagihara, Ken-Ichi; Ghosh, Samik; Matsuoka, Yukiko; Kurachi, Yoshihisa; Kitano, Hiroaki

    2013-01-01

    Building multilevel models of physiological systems is a significant and effective method for integrating the huge amount of bio-physiological data and knowledge obtained from earlier experiments and simulations. Since such models tend to be large in size and complicated in structure, appropriate software frameworks for supporting modeling activities are required. A software platform, PhysioDesigner, has been developed to support the process of creating multilevel models. Models developed in PhysioDesigner are stored in an XML format called PHML. Every physiological entity in a model is represented as a module, and hence a model constitutes an aggregation of modules. When the number of entities comprising the model is large, it is difficult to manage the entities manually, and semiautomatic assistive functions become necessary. This article introduces the PhysioDesigner platform, focusing particularly on recently developed features for building large-scale models using a template/instance framework and morphological information.

  18. Short term global health experiences and local partnership models: a framework.

    PubMed

    Loh, Lawrence C; Cherniak, William; Dreifuss, Bradley A; Dacso, Matthew M; Lin, Henry C; Evert, Jessica

    2015-12-18

    Contemporary interest in short-term experiences in global health (STEGH) has led to important questions of ethics, responsibility, and potential harms to receiving communities. In addressing these issues, the role of local engagement through partnerships between external STEGH-facilitating organization(s) and internal community organization(s) has been identified as crucial to mitigating potential pitfalls. This perspective piece offers a framework for categorizing different models of local engagement in STEGH, based on professional experiences and a review of the existing literature. This framework will encourage STEGH stakeholders to consider partnership models in the development and evaluation of new or existing programs. The proposed framework examines the community context in which STEGH may occur, and considers three broad categories: the number of visiting external groups conducting STEGH (single/multiple), the number of host entities that interact with the STEGH (none/single/multiple), and the frequency of STEGH (continuous/intermittent). These factors culminate in a specific model that presents distinct opportunities and challenges. Among the models considered, a single visiting group working without a local partner on an intermittent (or even one-time) basis provides the greatest flexibility to the STEGH participants, but represents the least local integration and consequently the greatest potential harm to the receiving community. Other models, such as multiple visiting teams continuously working with a single local partner, provide an opportunity for centralization of efforts and local input, but require investment in consensus-building and streamlining of processes across different groups. We conclude that involving host partners in the design, implementation, and evaluation of STEGH requires more effort on the part of visiting STEGH groups and facilitators, but has the greatest potential benefit for meaningful, locally

  19. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first-order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first-order sub-basins and is therefore suited to simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECSs) in each first-order sub-basin that represent hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of part or all of a first-order sub-basin, and has the advantage of reducing computational time/effort while maintaining reasonable accuracy in simulated hydrologic states and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross-section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Processing Toolbox is used to simulate cross sections simultaneously, further reducing computational time. The SMART workflow tasks are: 1) delineation of first-order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first-order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first-order sub-basin, and 5) derivation of vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on the Richards' equation. However, any hydrologic model can be

  20. Adding Abstraction and Reuse to a Network Modelling Tool Using the Reuseware Composition Framework

    NASA Astrophysics Data System (ADS)

    Johannes, Jendrik; Fernández, Miguel A.

    Domain-specific modelling (DSM) environments enable experts in a certain domain to actively participate in model-driven development. The development of DSM environments needs to be cost-efficient, since each environment is used only by a limited group of domain experts. Different model-driven technologies promise to enable this cost-efficient development. [1] presented experiences in developing a DSM environment for telecommunication network modelling. There, challenges were identified that need to be addressed by other new modelling technologies. In this paper, we now present the results of addressing one of these challenges - abstraction and reuse support - with the Reuseware Composition Framework. We show how we identified the abstraction and reuse features required in the telecommunication DSM environment in a case study and extended the existing environment with these features using Reuseware. We discuss the advantages of using this technology and propose a process for further improving the abstraction and reuse capabilities of the DSM environment in the future.

  1. Ensuring HL7-based information model requirements within an ontology framework.

    PubMed

    Ouagne, David; Nadah, Nadia; Schober, Daniel; Choquet, Rémy; Teodoro, Douglas; Colaert, Dirk; Schulz, Stefan; Jaulent, Marie-Christine; Daniel, Christel

    2010-01-01

    This paper describes the building of an HL7-based Information Model Ontology (IMO) that can be exploited by a domain ontology in order to distribute querying over different clinical data repositories. We employed the Open Medical Development Framework (OMDF), which is based on a model-driven development methodology. OMDF provides model transformation features to build an HL7-based information model that covers the conceptual scope of a target project. The resulting IMO is used to mediate between ontology-level queries and information retrieval from semantically less well-defined Hospital Information Systems (HIS). In the context of the DebugIT project - whose scope is the control of infectious diseases and antimicrobial resistance - the Information Model Ontology is integrated into the DebugIT domain ontology in order to express queries.

  2. A Petri-Nets Based Unified Modeling Approach for Zachman Framework Cells

    NASA Astrophysics Data System (ADS)

    Ostadzadeh, S. Shervin; Nekoui, Mohammad Ali

    With a trend toward becoming more and more information-based, enterprises constantly attempt to surpass each other's accomplishments by improving their information activities. In this respect, Enterprise Architecture (EA) has proven to serve as a fundamental concept for accomplishing this goal. Enterprise architecture clearly provides a thorough outline of the whole enterprise's applications and systems with their relationships to enterprise business goals. To establish such an outline, a logical framework needs to be laid upon the entire information system, called an Enterprise Architecture Framework (EAF). Among the various proposed EAFs, the Zachman Framework (ZF) has been widely accepted as a standard scheme for identifying and organizing descriptive representations that have critical roles in enterprise management and development. One of the problems faced in using ZF is the lack of formal and verifiable models for its cells. In this paper, we propose a formal language based on Petri nets in order to obtain verifiable models for all cells in ZF. The presented method helps developers to validate and verify completely integrated business and IT systems, which in turn improves the effectiveness and efficiency of the enterprise itself.
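The executable core of a Petri-net model is the "token game": a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs. The sketch below shows that mechanic on a toy two-step process; the place and transition names are invented and do not come from the paper's ZF cell models.

```python
class PetriNet:
    """Minimal place/transition net with integer token markings."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():       # consume input tokens
            self.marking[p] -= n
        for p, n in outputs.items():      # produce output tokens
            self.marking[p] = self.marking.get(p, 0) + n

# toy business-process fragment: a request is approved, then archived
net = PetriNet({"requested": 1, "approved": 0, "archived": 0})
net.add_transition("approve", {"requested": 1}, {"approved": 1})
net.add_transition("archive", {"approved": 1}, {"archived": 1})

net.fire("approve")
net.fire("archive")
print(net.marking)  # → {'requested': 0, 'approved': 0, 'archived': 1}
```

Verification properties such as reachability or deadlock-freedom can then be checked by exhaustively exploring the markings such a net can reach.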

  3. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  4. A second gradient theoretical framework for hierarchical multiscale modeling of materials

    SciTech Connect

    Luscher, Darby J; Bronkhorst, Curt A; McDowell, David L

    2009-01-01

    A theoretical framework for the hierarchical multiscale modeling of the inelastic response of heterogeneous materials is presented. Within this multiscale framework, the second gradient is used as a nonlocal kinematic link between the response of a material point at the coarse scale and the response of a neighborhood of material points at the fine scale. Kinematic consistency between these scales results in specific requirements for constraints on the fluctuation field. The wryness tensor serves as a second-order measure of strain. The nature of the second-order strain induces anti-symmetry in the first-order stress at the coarse scale. The multiscale ISV constitutive theory is couched in the coarse-scale intermediate configuration, from which an important new concept in scale transitions emerges, namely scale invariance of dissipation. Finally, a strategy for developing meaningful kinematic ISVs and the proper free energy functions and evolution kinetics is presented.

  5. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

    USGS Publications Warehouse

    Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

    2014-01-01

    Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.

  6. Modeling Plan-Related Clinical Complications Using Machine Learning Tools in a Multiplan IMRT Framework

    SciTech Connect

    Zhang, Hao H.; D'Souza, Warren D.; Shi, Leyuan; Meyer, Robert R.

    2009-08-01

    Purpose: To predict organ-at-risk (OAR) complications as a function of dose-volume (DV) constraint settings without explicit plan computation in a multiplan intensity-modulated radiotherapy (IMRT) framework. Methods and Materials: Several plans were generated by varying the DV constraints (input features) on the OARs (multiplan framework), and the DV levels achieved by the OARs in the plans (plan properties) were modeled as a function of the imposed DV constraint settings. OAR complications were then predicted for each of the plans by using the imposed DV constraints alone (features) or in combination with modeled DV levels (plan properties) as input to machine learning (ML) algorithms. These ML approaches were used to model two OAR complications after head-and-neck and prostate IMRT: xerostomia, and Grade 2 rectal bleeding. Two-fold cross-validation was used for model verification and mean errors are reported. Results: Errors for modeling the achieved DV values as a function of constraint settings were 0-6%. In the head-and-neck case, the mean absolute prediction error of the saliva flow rate normalized to the pretreatment saliva flow rate was 0.42% with a 95% confidence interval of (0.41-0.43%). In the prostate case, an average prediction accuracy of 97.04% with a 95% confidence interval of (96.67-97.41%) was achieved for Grade 2 rectal bleeding complications. Conclusions: ML can be used for predicting OAR complications during treatment planning allowing for alternative DV constraint settings to be assessed within the planning framework.
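The evaluation pattern described above (learn a mapping from imposed DV-constraint settings to a complication metric, then verify with two-fold cross-validation) can be sketched with a deliberately simple learner. The data, the single feature, and the plain least-squares line below are all invented stand-ins, not the paper's models or clinical data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b for a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def mae(model, xs, ys):
    """Mean absolute prediction error of the fitted line."""
    a, b = model
    return sum(abs(a * x + b - y) for x, y in zip(xs, ys)) / len(xs)

# synthetic (DV-constraint setting, complication score) pairs
data = [(20, 0.10), (30, 0.18), (40, 0.31), (50, 0.39), (60, 0.52), (70, 0.61)]
fold1, fold2 = data[::2], data[1::2]   # two-fold split

errors = []
for train, test in ((fold1, fold2), (fold2, fold1)):
    model = fit_line([x for x, _ in train], [y for _, y in train])
    errors.append(mae(model, [x for x, _ in test], [y for _, y in test]))

print(round(sum(errors) / 2, 4))   # mean error over the two folds
```

The paper's ML algorithms are more sophisticated, but the verification loop (fit on one fold, score on the other, average) has this shape.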

  7. A general mathematical framework to model generation structure in a population of asynchronously dividing cells.

    PubMed

    León, Kalet; Faro, Jose; Carneiro, Jorge

    2004-08-21

    In otherwise homogeneous cell populations, individual cells undergo asynchronous cell cycles. In recent years, interest in this fundamental observation has been boosted by the wide usage of CFSE, a fluorescent dye that allows the precise estimation by flow cytometry of the number of divisions performed by different cells in a population, and thus of the generation structure. In this work, we propose two general mathematical frameworks to model the time evolution of the generation structure in a cell population. The first modeling framework is more descriptive and assumes that cell division time is distributed in the cell population, due to intrinsic noise in the molecular machinery of individual cells; the second framework assumes that asynchrony in cell division stems from randomness in the interactions individual cells make with environmental agents. We reduce these formalisms to recover two preexisting models, each of which builds on one of these hypotheses. When confronted with kinetic data on CFSE-labeled cells taken from the literature, these models can fit precursor frequency distributions at each measured time point. However, they fail to fit the whole kinetics of precursor frequency distributions. In contrast, two extensions of those models, also derived from our general formalisms, fit equally well both the whole kinetics and the individual profiles at each time point, providing a biologically reasonable estimation of parameters. We prove that the distribution of cell division times is not Gaussian, as previously proposed, but is better described by an asymmetric distribution such as the Gamma distribution. We show also that the observed cell asynchrony could be explained by the existence of a single transitional event during cell division.
Based on these results, we suggest new ways of combining theoretical and experimental work to assess how much of noise in internal machinery of the cell and interactions with the environmental agents contribute to the asynchrony in cell
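The first hypothesis above (division times distributed across the population) can be made concrete with a small event-driven simulation: each cell waits a Gamma-distributed time, then divides into two daughters one generation further on, and the generation structure is read off at the observation time, much as CFSE dilution would report it. The shape/scale parameters, cell count, and observation window below are invented for the sketch and are not the paper's fitted values.

```python
import heapq
import random

random.seed(1)

def simulate(total_time, shape, scale, n_cells=200):
    """Simulate asynchronously dividing cells whose division times are
    Gamma(shape, scale) distributed; return the generation number of every
    cell alive at `total_time` (the population's generation structure)."""
    events = []  # min-heap of (next division time, generation)
    for _ in range(n_cells):
        heapq.heappush(events, (random.gammavariate(shape, scale), 0))
    generations = []
    while events:
        t, gen = heapq.heappop(events)
        if t > total_time:
            generations.append(gen)      # still undivided at observation time
            continue
        for _ in range(2):               # two daughters, one generation on
            heapq.heappush(events, (t + random.gammavariate(shape, scale), gen + 1))
    return generations

# 48 h observation, mean division time shape*scale = 12 h (illustrative)
gens = simulate(total_time=48.0, shape=4.0, scale=3.0)
print(len(gens), max(gens))   # population size and deepest generation
```

Swapping `random.gammavariate` for `random.gauss` (with occasional negative draws clipped) reproduces the symmetric-distribution alternative the paper argues against.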

  8. Construction of 3-D geologic framework and textural models for Cuyama Valley groundwater basin, California

    USGS Publications Warehouse

    Sweetkind, Donald S.; Faunt, Claudia C.; Hanson, Randall T.

    2013-01-01

    Groundwater is the sole source of water supply in Cuyama Valley, a rural agricultural area in Santa Barbara County, California, in the southeasternmost part of the Coast Ranges of California. Continued groundwater withdrawals and associated water-resource management concerns have prompted an evaluation of the hydrogeology and water availability for the Cuyama Valley groundwater basin by the U.S. Geological Survey, in cooperation with the Water Agency Division of the Santa Barbara County Department of Public Works. As a part of the overall groundwater evaluation, this report documents the construction of a digital three-dimensional geologic framework model of the groundwater basin suitable for use within a numerical hydrologic-flow model. The report also includes an analysis of the spatial variability of lithology and grain size, which forms the geologic basis for estimating aquifer hydraulic properties. The geologic framework was constructed as a digital representation of the interpreted geometry and thickness of the principal stratigraphic units within the Cuyama Valley groundwater basin, which include younger alluvium, older alluvium, and the Morales Formation, and underlying consolidated bedrock. The framework model was constructed by creating gridded surfaces representing the altitude of the top of each stratigraphic unit from various input data, including lithologic and electric logs from oil and gas wells and water wells, cross sections, and geologic maps. Sediment grain-size data were analyzed in both two and three dimensions to help define textural variations in the Cuyama Valley groundwater basin and identify areas with similar geologic materials that potentially have fairly uniform hydraulic properties. Sediment grain size was used to construct three-dimensional textural models that employed simple interpolation between drill holes and two-dimensional textural models for each stratigraphic unit that incorporated spatial structure of the textural data.

  9. Implementations of a Flexible Framework for Managing Geologic Sequestration Modeling Projects

    SciTech Connect

    White, Signe K.; Gosink, Luke J.; Sivaramakrishnan, Chandrika; Black, Gary D.; Purohit, Sumit; Bacon, Diana H.; Hou, Zhangshuan; Lin, Guang; Gorton, Ian; Bonneville, Alain

    2013-08-06

    Numerical simulation is a standard practice used to support designing, operating, and monitoring CO2 injection projects. Although a variety of computational tools have been developed that support the numerical simulation process, many are single-purpose or platform specific and have a prescribed workflow that may or may not be suitable for a particular project. We are developing an open-source, flexible framework named Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for various types of projects in a number of scientific domains. The Geologic Sequestration Software Suite (GS3) is a version of this framework with features and tools specifically tailored for geologic sequestration studies. Because of its general nature, GS3 is being employed in a variety of ways on projects with differing goals. GS3 is being used to support the Sim-SEQ international model comparison study, by providing a collaborative framework for the modeling teams and providing tools for model comparison. Another customized deployment of GS3 has been made to support the permit application process. In this case, GS3 is being used to manage data in support of conceptual model development and provide documentation and provenance for numerical simulations. An additional customized deployment of GS3 is being created for use by the United States Environmental Protection Agency (US-EPA) to aid in the CO2 injection permit application review process in one of its regions. These use cases demonstrate GS3's flexibility, utility, and broad applicability.

  10. Inferring landscape effects on gene flow: a new model selection framework.

    PubMed

    Shirk, A J; Wallin, D O; Cushman, S A; Rice, C G; Warheit, K I

    2010-09-01

    Populations in fragmented landscapes experience reduced gene flow, lose genetic diversity over time and ultimately face greater extinction risk. Improving connectivity in fragmented landscapes is now a major focus of conservation biology. Designing effective wildlife corridors for this purpose, however, requires an accurate understanding of how landscapes shape gene flow. Most landscape resistance models generated to date are subjectively parameterized based on expert opinion or proxy measures of gene flow. While the relatively few studies that use genetic data are more rigorous, the frameworks they employ frequently yield models only weakly related to the observed patterns of genetic isolation. Here, we describe a new framework that uses expert opinion as a starting point. By systematically varying each model parameter, we sought either to validate the assumptions of expert opinion, or to identify a peak of support for a new model more strongly related to genetic isolation. This approach also accounts for interactions between variables, allows for nonlinear responses and excludes variables that reduce model performance. We demonstrate its utility on a population of mountain goats inhabiting a fragmented landscape in the Cascade Range, Washington.

  11. A Modeling Framework to Incorporate Effects of Infrastructure in Sociohydrological Systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.

    2014-12-01

    In studying coupled natural-human systems, most modeling efforts focus on humans and the natural resources. In reality, however, humans rarely interact with these resources directly; the relationships between humans and resources are mediated by infrastructures. In sociohydrological systems, these include, for example, dams and irrigation canals. These infrastructures have important characteristics such as threshold behavior and a separate entity/organization tasked with maintaining them. These characteristics influence social dynamics within the system, which in turn determines the state of infrastructure and water usage, thereby exerting feedbacks onto the hydrological processes. Infrastructure is thus a necessary ingredient for modeling co-evolution of human and water in sociohydrological systems. A conceptual framework to address this gap has been proposed by Anderies, Janssen, and Ostrom (2004). Here we develop a model to operationalize the framework and report some preliminary results. Simple in its setup, the model highlights the structure of the social dilemmas and how it affects the system's sustainability. The model also offers a platform to explore how the system's sustainability may respond to external shocks from globalization and global climate change.

  12. A Mechanistic Modeling Framework for Predicting Metabolic Interactions in Complex Mixtures

    PubMed Central

    Cheng, Shu

    2011-01-01

    Background: Computational modeling of the absorption, distribution, metabolism, and excretion of chemicals is now theoretically able to describe metabolic interactions in realistic mixtures of tens to hundreds of substances. That framework awaits validation. Objectives: Our objectives were to a) evaluate the conditions of application of such a framework, b) confront the predictions of a physiologically integrated model of benzene, toluene, ethylbenzene, and m-xylene (BTEX) interactions with observed kinetics data on these substances in mixtures and, c) assess whether improving the mechanistic description has the potential to lead to better predictions of interactions. Methods: We developed three joint models of BTEX toxicokinetics and metabolism and calibrated them using Markov chain Monte Carlo simulations and single-substance exposure data. We then checked their predictive capabilities for metabolic interactions by comparison with mixture kinetic data. Results: The simplest joint model (BTEX interacting competitively for cytochrome P450 2E1 access) gives qualitatively correct and quantitatively acceptable predictions (with at most 50% deviations from the data). More complex models with two pathways or back-competition with metabolites have the potential to further improve predictions for BTEX mixtures. Conclusions: A systems biology approach to large-scale prediction of metabolic interactions is advantageous on several counts and technically feasible. However, ways to obtain the required parameters need to be further explored. PMID:21835728
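The simplest joint model above treats BTEX as competing for the same enzyme (CYP2E1), which in classical enzyme kinetics means each co-substrate inflates the apparent Km of the others. A minimal sketch of that rate law, with invented kinetic constants (not the paper's calibrated parameters):

```python
def mm_rate(vmax, km, s):
    """Single-substrate Michaelis-Menten metabolic rate."""
    return vmax * s / (km + s)

def competitive_rate(vmax, km, s, others):
    """Rate for one substrate when others compete for the same enzyme:
    the apparent Km is scaled by (1 + sum_j C_j / Km_j) over co-substrates,
    where `others` is a list of (concentration, Km) pairs."""
    inhibition = sum(c / k for c, k in others)
    return vmax * s / (km * (1.0 + inhibition) + s)

# invented constants: rate for benzene-like substrate alone vs. in a mixture
alone = mm_rate(vmax=10.0, km=2.0, s=1.0)
v = competitive_rate(vmax=10.0, km=2.0, s=1.0, others=[(1.0, 1.0), (0.5, 2.0)])
print(round(alone, 3), round(v, 3))  # → 3.333 1.818
```

Embedded in a physiologically based toxicokinetic model, this is the mechanism by which mixture exposure slows the clearance of each component.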

  13. Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support

    NASA Astrophysics Data System (ADS)

    Djokic, D.; Noman, N.; Kopp, S.

    2015-12-01

    Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open source, Python tools that build on core ArcGIS functionality and use geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to local hydraulic scale, post-process the hydraulic modeling results, and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available, they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end-user products. This framework
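    A synthetic rating curve of the kind mentioned above can be sketched, for example, from Manning's equation for a rectangular channel, inverted numerically to map a forecast flow to a flood depth. The channel width, slope, and roughness below are illustrative assumptions, not values from the toolkit.

```python
def manning_flow(depth, width=50.0, slope=0.001, n=0.035):
    """Manning's equation, Q = (1/n) * A * R^(2/3) * sqrt(S),
    for a rectangular channel of the given width."""
    area = width * depth
    wetted_perimeter = width + 2.0 * depth
    radius = area / wetted_perimeter          # hydraulic radius R = A / P
    return (1.0 / n) * area * radius ** (2.0 / 3.0) * slope ** 0.5

def depth_for_flow(q, lo=0.0, hi=20.0, tol=1e-6):
    """Invert the rating curve by bisection: depth with manning_flow(depth) = q."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if manning_flow(mid) < q:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

d = depth_for_flow(100.0)  # depth (m) that carries a forecast flow of 100 m^3/s
```

    Because the rating curve is monotonic in depth, bisection always converges, which makes this a robust building block for deriving depth grids per forecast time step.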

  14. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" Between Physical Experiments and Virtual Models in Biology

    NASA Astrophysics Data System (ADS)

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-08-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this study, a group of high school students designed computer models of bacterial growth with reference to a simultaneous physical experiment they were conducting, and were able to validate the correctness of their model against the results of their experiment. Our findings suggest that as the students compared their virtual models with physical experiments, they encountered "discrepant events" that contradicted their existing conceptions and elicited a state of cognitive disequilibrium. This experience of conflict encouraged students to further examine their ideas and to seek more accurate explanations of the observed natural phenomena, improving the design of their computer models.

  15. HyDE Framework for Stochastic and Hybrid Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Brownston, Lee

    2012-01-01

    Hybrid Diagnosis Engine (HyDE) is a general framework for stochastic and hybrid model-based diagnosis that offers flexibility to the diagnosis application designer. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. Several alternative algorithms are available for the various steps in diagnostic reasoning. This approach is extensible, with support for the addition of new modeling paradigms as well as diagnostic reasoning algorithms for existing or new modeling paradigms. HyDE is a general framework for stochastic hybrid model-based diagnosis of discrete faults; that is, spontaneous changes in operating modes of components. HyDE combines ideas from consistency-based and stochastic approaches to model-based diagnosis using discrete and continuous models to create a flexible and extensible architecture for stochastic and hybrid diagnosis. HyDE supports the use of multiple paradigms and is extensible to support new paradigms. HyDE generates candidate diagnoses and checks them for consistency with the observations. It uses hybrid models built by the users and sensor data from the system to deduce the state of the system over time, including changes in state indicative of faults. At each time step when observations are available, HyDE checks each existing candidate for continued consistency with the new observations. If the candidate is consistent, it continues to remain in the candidate set. If it is not consistent, then the information about the inconsistency is used to generate successor candidates while discarding the candidate that was inconsistent. The models used by HyDE are similar to simulation models. They describe the expected behavior of the system under nominal and fault conditions. The model can be constructed in modular and hierarchical fashion by building component/subsystem models (which may themselves contain component/subsystem models) and linking them through shared variables/parameters. The
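    The candidate-tracking loop described above can be sketched as follows. This is a hypothetical toy, not HyDE's actual API or algorithms: candidates are mode assignments, and candidates inconsistent with an observation are replaced by successor hypotheses.

```python
def track(candidates, observations, consistent, successors):
    """For each observation, keep consistent candidates and expand the rest."""
    for obs in observations:
        next_set = []
        for cand in candidates:
            if consistent(cand, obs):
                next_set.append(cand)          # still explains the data
            else:
                next_set.extend(successors(cand, obs))  # hypothesize faults
        candidates = next_set
    return candidates

# Toy system: a valve whose mode is "nominal" or "stuck-closed".
def consistent(mode, obs):
    flow = obs["flow"]
    return flow > 0 if mode == "nominal" else flow == 0

def successors(mode, obs):
    # On inconsistency, hypothesize the single fault mode.
    return ["stuck-closed"] if mode == "nominal" else []

result = track(["nominal"], [{"flow": 1.0}, {"flow": 0.0}],
               consistent, successors)
# After the flow drops to zero, only the fault hypothesis survives.
```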

  16. Land-Atmosphere Coupling in the Multi-Scale Modelling Framework

    NASA Astrophysics Data System (ADS)

    Kraus, P. M.; Denning, S.

    2015-12-01

    The Multi-Scale Modeling Framework (MMF), in which cloud-resolving models (CRMs) are embedded within general circulation model (GCM) gridcells to serve as the model's cloud parameterization, has offered a number of benefits to GCM simulations. The coupling of these cloud-resolving models directly to land surface model instances, rather than passing averaged atmospheric variables to a single instance of a land surface model, the logical next step in model development, has recently been accomplished. This new configuration offers conspicuous improvements to estimates of precipitation and canopy through-fall, but overall the model exhibits warm surface temperature biases and low productivity. This work presents modifications to a land-surface model that take advantage of the new multi-scale modeling framework and accommodate the change in spatial scale from a typical GCM range of ~200 km to the CRM grid scale of 4 km. A parameterization is introduced to apportion modeled surface radiation into direct-beam and diffuse components. The diffuse component is then distributed among the land-surface model instances within each GCM cell domain. This substantially reduces the number of excessively low light values provided to the land-surface model when cloudy conditions are modeled in the CRM, associated with its 1-D radiation scheme. The small spatial scale of the CRM, ~4 km, as compared with the typical ~200 km GCM scale, provides much more realistic estimates of precipitation intensity; this permits the elimination of a model parameterization of canopy through-fall. However, runoff at such scales can no longer be considered as an immediate flow to the ocean. Allowing sub-surface water flow between land-surface instances within the GCM domain affords better realism and also reduces temperature and productivity biases. The MMF affords a number of opportunities to land-surface modelers, providing both the advantages of direct simulation at the 4 km scale and a much reduced
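    The direct/diffuse partitioning step can be illustrated with an Erbs-type empirical correlation relating the diffuse fraction to the clearness index. The piecewise form and coefficients below follow one commonly published correlation and are used purely for illustration; the paper's actual parameterization may differ.

```python
def diffuse_fraction(kt):
    """Diffuse fraction of global radiation vs. clearness index kt
    (piecewise Erbs-type correlation)."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165

def partition(global_rad, kt):
    """Split modeled surface shortwave into (direct, diffuse) components."""
    diffuse = diffuse_fraction(kt) * global_rad
    return global_rad - diffuse, diffuse

direct, diffuse = partition(600.0, 0.5)  # W/m^2 under a partly cloudy sky
```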

  17. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  18. Framework for Smart Electronic Health Record-Linked Predictive Models to Optimize Care for Complex Digestive Diseases

    DTIC Science & Technology

    2013-06-01

    Award Number: W81XWH-11-2-0133. TITLE: Framework for Smart Electronic Health Record-Linked Predictive Models to Optimize Care for Complex Digestive Diseases. The framework provides an intelligent workspace, displaying annotation forms and de-identified reports in the same view, with automatic report queuing and providing easy

  19. A Methodology for Doctrine in Modeling and Simulation: Battle Management Language (BML) and the Mission to Means Framework (MMF)

    DTIC Science & Technology

    2004-09-01

    In collaboration with the US Army and selected US Naval and US Air Force projects, the Defense Modeling and Simulation Office developed the Missions and Means Framework (MMF), a framework for explicitly specifying the military mission and quantitatively evaluating the mission utility of alternative

  20. The Behavioral Ecological Model as a Framework for School-Based Anti-Bullying Health Promotion Interventions

    ERIC Educational Resources Information Center

    Dresler-Hawke, Emma; Whitehead, Dean

    2009-01-01

    This article presents a conceptual strategy which uses the Behavioral Ecological Model (BEM) as a health promotion framework to guide school-based bullying awareness programs and subsequent anti-bullying strategies for school nursing practice. Anti-bullying frameworks and tools are scarce despite the extent of the problem of bullying. This article…

  1. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2009-01-01

    ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework which will support the Coastal Modeling Framework developed in this project (http://www.cactuscode.org/Development/alpaca). CyberTools: This NSF/BOR-funded project is developing a

  2. Development of an Integrated Modeling Framework for Simulations of Coastal Processes in Deltaic Environments Using High-Performance Computing

    DTIC Science & Technology

    2008-01-01

    (http://www.cactuscode.org/Development/xirel) ALPACA: This NSF-funded project is developing debugging and profiling tools for the Cactus framework which will support the Coastal Modeling Framework developed in this project (http://www.cactuscode.org/Development/alpaca). CyberTools: This NSF/BOR

  3. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.

    PubMed

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2010-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.
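    The flavor of a continuous-space agent-based model of the kind SPARK supports can be sketched in a few lines. SPARK itself is a Java application with its own modeling language; everything below, including the division rule, is illustrative.

```python
import random

random.seed(1)

class Cell:
    """An agent on a continuous unit square that wanders and divides."""
    def __init__(self, x, y):
        self.x, self.y, self.energy = x, y, 0.0

    def step(self, world):
        # Random walk with periodic (wrap-around) boundaries.
        self.x = (self.x + random.uniform(-0.1, 0.1)) % 1.0
        self.y = (self.y + random.uniform(-0.1, 0.1)) % 1.0
        self.energy += 0.3                 # constant nutrient uptake
        if self.energy >= 1.0:             # divide once enough accumulates
            self.energy = 0.0
            world.append(Cell(self.x, self.y))

world = [Cell(0.5, 0.5)]
for _ in range(10):                        # 10 time steps
    for cell in list(world):               # snapshot: newborns step next round
        cell.step(world)

population = len(world)                    # doubles at steps 4 and 8 -> 4 cells
```

    Even this toy shows the essential ingredients, continuous positions, per-agent state, and population-level behavior (growth) emerging from local rules.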

  4. A statistical framework for the validation of a population exposure model based on personal exposure data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air, http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations, and infiltration of outdoor air pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and personal exposure data should therefore become more and more accessible. In view of planned cohort field campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.
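    The downscaling idea can be illustrated with the simplest possible Bayesian building block, a conjugate normal-normal update, in which the population-scale model supplies a prior on an individual's exposure and personal sensor measurements update it. This is a deliberate simplification of the paper's hierarchical framework; all numbers are hypothetical.

```python
def posterior(prior_mean, prior_var, measurements, meas_var):
    """Posterior mean/variance for a normal prior and normal likelihood
    with known measurement variance (conjugate normal-normal update)."""
    n = len(measurements)
    post_var = 1.0 / (1.0 / prior_var + n / meas_var)
    post_mean = post_var * (prior_mean / prior_var
                            + sum(measurements) / meas_var)
    return post_mean, post_var

# County-scale modeled PM2.5 exposure (prior) vs. three personal measurements.
mean, var = posterior(prior_mean=12.0, prior_var=9.0,
                      measurements=[18.0, 17.0, 19.0], meas_var=4.0)
# The posterior shifts toward the personal data and tightens the variance.
```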

  5. How much cryosphere model complexity is just right? Exploration using the conceptual cryosphere hydrology framework

    NASA Astrophysics Data System (ADS)

    Mosier, Thomas M.; Hill, David F.; Sharp, Kendra V.

    2016-09-01

    Making meaningful projections of the impacts that possible future climates would have on water resources in mountain regions requires understanding how cryosphere hydrology model performance changes under altered climate conditions and when the model is applied to ungaged catchments. Further, if we are to develop better models, we must understand which specific process representations limit model performance. This article presents a modeling tool, named the Conceptual Cryosphere Hydrology Framework (CCHF), that enables implementing and evaluating a wide range of cryosphere modeling hypotheses. The CCHF represents cryosphere hydrology systems using a set of coupled process modules that allows easily interchanging individual module representations and includes analysis tools to evaluate model outputs. CCHF version 1 (Mosier, 2016) implements model formulations that require only precipitation and temperature as climate inputs - for example variations on simple degree-index (SDI) or enhanced temperature index (ETI) formulations - because these model structures are often applied in data-sparse mountain regions, and perform relatively well over short periods, but their calibration is known to change based on climate and geography. Using CCHF, we implement seven existing and novel models, including one existing SDI model, two existing ETI models, and four novel models that utilize a combination of existing and novel module representations. The novel module representations include a heat transfer formulation with net longwave radiation and a snowpack internal energy formulation that uses an approximation of the cold content. We assess the models for the Gulkana and Wolverine glaciated watersheds in Alaska, which have markedly different climates and contain long-term US Geological Survey benchmark glaciers. 
Overall we find that the best performing models are those that are more physically consistent and representative, but no single model performs best for all of our model
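    The two temperature-index module classes named above reduce to short formulas, sketched here with illustrative parameter values rather than CCHF's calibrated ones.

```python
def sdi_melt(temp_c, ddf=4.0, t_base=0.0):
    """Simple degree-index: melt (mm w.e./day) = DDF * max(T - T_base, 0)."""
    return ddf * max(temp_c - t_base, 0.0)

def eti_melt(temp_c, swrad, albedo, tf=2.0, srf=0.01, t_base=0.0):
    """Enhanced temperature index: adds an absorbed-shortwave term,
    melt = TF * (T - T_base) + SRF * (1 - albedo) * G when T > T_base."""
    if temp_c <= t_base:
        return 0.0
    return tf * (temp_c - t_base) + srf * (1.0 - albedo) * swrad

m_sdi = sdi_melt(5.0)                            # 20 mm w.e./day at +5 C
m_eti = eti_melt(5.0, swrad=250.0, albedo=0.6)   # temperature + radiation terms
```

    The contrast between the two functions is exactly the modeling choice CCHF lets a user swap: the ETI form responds to albedo and shortwave forcing, which the SDI form folds into a single calibrated degree-day factor.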

  6. Differentiate climate change uncertainty from other uncertainty sources for water quality modeling with Bayesian framework

    NASA Astrophysics Data System (ADS)

    Jiang, S.; Liu, M.; Rode, M.

    2011-12-01

    Prediction of water quality under future climate change is associated with significant uncertainty resulting from the use of climate models and stochastic weather generators. This future-related uncertainty is usually mixed with the intrinsic uncertainty sources arising from model structure and parameterization, which are also present when modeling past and current events. For an effective water quality management policy, the uncertainty sources have to be differentiated and quantified separately. This work applies a Bayesian framework in two steps to quantify the climate change uncertainty, as input uncertainty, and the parameter uncertainty, respectively. The HYPE model (Hydrological Predictions for the Environment) from SMHI is applied to simulate the nutrient (N, P) sources in Weida, a 100 km2 agricultural lowland catchment in Germany. The results show that climate change shifts the uncertainty space in terms of the probability density function (PDF), and that a large portion of future uncertainty is not covered by current uncertainty.

  7. A Regression Framework for Effect Size Assessments in Longitudinal Modeling of Group Differences.

    PubMed

    Feingold, Alan

    2013-03-01

    The use of growth modeling analysis (GMA)--particularly multilevel analysis and latent growth modeling--to test the significance of intervention effects has increased exponentially in prevention science, clinical psychology, and psychiatry over the past 15 years. Model-based effect sizes for differences in means between two independent groups in GMA can be expressed in the same metric (Cohen's d) commonly used in classical analysis and meta-analysis. This article first reviews conceptual issues regarding calculation of d for findings from GMA and then introduces an integrative framework for effect size assessments that subsumes GMA. The new approach uses the structure of the linear regression model, from which effect sizes for findings from diverse cross-sectional and longitudinal analyses can be calculated with familiar statistics, such as the regression coefficient, the standard deviation of the dependent measure, and study duration.
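    The abstract's recipe, an effect size computed from the regression coefficient, the outcome's standard deviation, and study duration, can be sketched directly. The numbers are hypothetical.

```python
def gma_effect_size(slope_diff_coef, duration, sd_outcome):
    """Cohen's d for the model-implied group difference at end of study:
    (treatment-by-time coefficient * study duration) / SD of the outcome."""
    return (slope_diff_coef * duration) / sd_outcome

# A treatment-by-time coefficient of 0.25 points/month over a 12-month trial,
# with a raw outcome SD of 6 points:
d = gma_effect_size(0.25, 12.0, 6.0)  # 0.5, conventionally a medium effect
```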

  8. A Computational Framework to Model Degradation of Biocorrodible Metal Stents Using an Implicit Finite Element Solver.

    PubMed

    Debusschere, Nic; Segers, Patrick; Dubruel, Peter; Verhegghe, Benedict; De Beule, Matthieu

    2016-02-01

    Bioresorbable stents represent an emerging technological development within the field of cardiovascular angioplasty. Their temporary presence avoids long-term side effects of non-degradable stents such as in-stent restenosis, late stent thrombosis and fatigue induced strut fracture. Several numerical modelling strategies have been proposed to evaluate the transitional mechanical characteristics of biodegradable stents using a continuum damage framework. However, these methods rely on an explicit finite-element integration scheme which, in combination with the quasi-static nature of many simulations involving stents and the small element size needed to model corrosion mechanisms, results in a high computational cost. To reduce the simulation times and to expand the general applicability of these degradation models, this paper investigates an implicit finite element solution method to model degradation of biodegradable stents.

  9. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    PubMed

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy.

  10. OpenMx: An Open Source Extended Structural Equation Modeling Framework.

    PubMed

    Boker, Steven; Neale, Michael; Maes, Hermine; Wilde, Michael; Spiegel, Michael; Brick, Timothy; Spies, Jeffrey; Estabrook, Ryne; Kenny, Sarah; Bates, Timothy; Mehta, Paras; Fox, John

    2011-04-01

    OpenMx is free, full-featured, open source, structural equation modeling (SEM) software. OpenMx runs within the R statistical programming environment on Windows, Mac OS-X, and Linux computers. The rationale for developing OpenMx is discussed along with the philosophy behind the user interface. The OpenMx data structures are introduced - these novel structures define the user interface framework and provide new opportunities for model specification. Two short example scripts for the specification and fitting of a confirmatory factor model are next presented. We end with an abbreviated list of modeling applications available in OpenMx 1.0 and a discussion of directions for future development.

  11. Threat driven modeling framework using petri nets for e-learning system.

    PubMed

    Khamparia, Aditya; Pandey, Babita

    2016-01-01

    Vulnerabilities at various levels are the main cause of security risks in e-learning systems. This paper presents a modified threat-driven modeling framework to identify, after risk assessment, the threats that require mitigation, and to determine how to mitigate them. Aspect-oriented stochastic Petri nets are used to model those threat mitigations. The paper includes security metrics based on vulnerabilities present in the e-learning system. The Common Vulnerability Scoring System, designed to provide a normalized method for rating vulnerabilities, is used as the basis for the metric definitions and calculations. A case study is also presented which shows the need for, and feasibility of, using aspect-oriented stochastic Petri net models for threat modeling, which improves the reliability, consistency and robustness of the e-learning system.
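    The basic Petri net machinery underlying such models can be sketched in a few lines: places hold tokens, and a transition fires when all of its input places are marked. This toy net (mitigating a detected threat) is illustrative only; it is not the paper's aspect-oriented stochastic models.

```python
# Marking: token count per place.
marking = {"threat_detected": 1, "mitigation_ready": 1, "mitigated": 0}

# Transition name -> (input places, output places).
transitions = {
    "apply_mitigation": (["threat_detected", "mitigation_ready"],
                         ["mitigated"]),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking[p] > 0 for p in inputs)

def fire(name):
    """Consume one token from each input place, produce one in each output."""
    inputs, outputs = transitions[name]
    if not enabled(name):
        raise ValueError(f"{name} is not enabled")
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] += 1

fire("apply_mitigation")
# The threat token is consumed and a "mitigated" token is produced.
```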

  12. A quantitative framework to evaluate modeling of cortical development by neural stem cells

    PubMed Central

    Stein, Jason L.; de la Torre-Ubieta, Luis; Tian, Yuan; Parikshak, Neelroop N.; Hernandez, Israel A.; Marchetto, Maria C.; Baker, Dylan K.; Lu, Daning; Hinman, Cassidy R.; Lowe, Jennifer K.; Wexler, Eric M.; Muotri, Alysson R.; Gage, Fred H.; Kosik, Kenneth S.; Geschwind, Daniel H.

    2014-01-01

    Neural stem cells have been adopted to model a wide range of neuropsychiatric conditions in vitro. However, how well such models correspond to the in vivo brain has not been evaluated in an unbiased, comprehensive manner. We used transcriptomic analyses to compare in vitro systems to the developing human fetal brain and observed strong conservation of in vivo gene expression and network architecture in differentiating primary human neural progenitor cells (phNPCs). Conserved modules are enriched in genes associated with autism spectrum disorder (ASD), supporting the utility of phNPCs for studying neuropsychiatric disease. We also developed and validated a machine learning approach called CoNTExT that identifies the developmental maturity and regional identity of in vitro models. We observed strong differences between in vitro models, including hiPSC-derived neural progenitors from multiple laboratories. This work provides a systems biology framework for evaluating in vitro systems and supports their value in studying the molecular mechanisms of human neurodevelopmental disease. PMID:24991955

  13. A deep learning framework for modeling structural features of RNA-binding protein targets

    PubMed Central

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-01-01

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp.

  14. A deep learning framework for modeling structural features of RNA-binding protein targets.

    PubMed

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-02-29

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found at https://github.com/thucombio/deepnet-rbp.

  15. Modeling Framework and Validation of a Smart Grid and Demand Response System for Wind Power Integration

    SciTech Connect

    Broeer, Torsten; Fuller, Jason C.; Tuffner, Francis K.; Chassin, David P.; Djilali, Ned

    2014-01-31

    Electricity generation from wind power and other renewable energy sources is increasing, and their variability introduces new challenges to the power system. The emergence of smart grid technologies in recent years has seen a paradigm shift in redefining the electrical system of the future, in which controlled response of the demand side is used to balance fluctuations and intermittencies from the generation side. This paper presents a modeling framework for an integrated electricity system where loads become an additional resource. The agent-based model represents a smart grid power system integrating generators, transmission, distribution, loads and market. The model incorporates generator and load controllers, allowing suppliers and demanders to bid into a Real-Time Pricing (RTP) electricity market. The modeling framework is applied to represent a physical demonstration project conducted on the Olympic Peninsula, Washington, USA, and validation simulations are performed using actual dynamic data. Wind power is then introduced into the power generation mix illustrating the potential of demand response to mitigate the impact of wind power variability, primarily through thermostatically controlled loads. The results also indicate that effective implementation of Demand Response (DR) to assist integration of variable renewable energy resources requires a diversity of loads to ensure functionality of the overall system.
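    The demand-response mechanism can be illustrated with a toy thermostatic-load bidding rule, in which a cooling load's bid rises with its deviation from the temperature setpoint and the load runs only if the market clears at or below its bid. This is a hypothetical controller sketch, not the Olympic Peninsula project's actual control logic; all parameters are invented.

```python
def bid_price(t_current, t_set, t_deadband, price_mean, price_std, k=2.0):
    """Bid rises as indoor temperature drifts above the cooling setpoint,
    scaled by recent price statistics and a comfort-aggressiveness gain k."""
    deviation = (t_current - t_set) / t_deadband
    return price_mean + k * price_std * deviation

# A house 2 degrees over its setpoint bids above the mean price...
bid = bid_price(t_current=24.0, t_set=22.0, t_deadband=2.0,
                price_mean=50.0, price_std=10.0)
clearing_price = 60.0
runs = bid >= clearing_price  # ...and is dispatched in this market interval
```

    Because each household's deviation differs, a population of such loads spreads its bids across the price axis, which is one way load diversity keeps the aggregate response smooth.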

  16. Efficient methods for screening of metal organic framework membranes for gas separations using atomically detailed models.

    PubMed

    Keskin, Seda; Sholl, David S

    2009-10-06

    Metal organic frameworks (MOFs) define a diverse class of nanoporous materials having potential applications in adsorption-based and membrane-based gas separations. We have previously used atomically detailed models to predict the performance of MOFs for membrane-based separations of gases, but these calculations require considerable computational resources and time. Here, we introduce an efficient approximate method for screening MOFs based on atomistic models that will accelerate the modeling of membrane applications. The validity of this approximate method is examined by comparison with detailed calculations for CH4/H2, CO2/CH4, and CO2/H2 mixtures at room temperature permeating through IRMOF-1 and CuBTC membranes. These results allow us to hypothesize a connection between two computationally efficient correlations predicting mixture adsorption and mixture self-diffusion properties and the validity of our approximate screening method. We then apply our model to six additional MOFs, IRMOF-8, -9, -10, and -14, Zn(bdc)(ted)0.5, and COF-102, to examine the effect of chemical diversity and interpenetration on the performance of metal organic framework membranes for light gas separations.
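    The kind of quick screening estimate such approximate methods enable can be illustrated with the solution-diffusion picture, where permeability is the product of an adsorption (solubility) coefficient and a diffusivity, and ideal selectivity is a ratio of permeabilities. The numbers below are hypothetical, not computed MOF properties.

```python
def permeability(solubility, diffusivity):
    """Solution-diffusion estimate: permeability = solubility * diffusivity."""
    return solubility * diffusivity

def ideal_selectivity(sol_a, dif_a, sol_b, dif_b):
    """Ratio of single-component permeabilities for species A over B."""
    return permeability(sol_a, dif_a) / permeability(sol_b, dif_b)

# Hypothetical CO2 (A) / H2 (B) pair: CO2 adsorbs far more strongly,
# while H2 diffuses much faster; membrane selectivity is their product.
s = ideal_selectivity(sol_a=20.0, dif_a=0.2, sol_b=0.5, dif_b=4.0)
# Adsorption selectivity (40x) is partly offset by diffusion selectivity (1/20x).
```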

  17. Toward a model framework of generalized parallel componential processing of multi-symbol numbers.

    PubMed

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-05-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining and investigating a sign-decade compatibility effect for the comparison of positive and negative numbers, which extends the unit-decade compatibility effect in 2-digit number processing. Then, we evaluated whether the model is capable of accounting for previous findings in negative number processing. In a magnitude comparison task, in which participants had to single out the larger of 2 integers, we observed a reliable sign-decade compatibility effect with prolonged reaction times for incompatible (e.g., -97 vs. +53; in which the number with the larger decade digit has the smaller, i.e., negative polarity sign) as compared with sign-decade compatible number pairs (e.g., -53 vs. +97). Moreover, an analysis of participants' eye fixation behavior corroborated our model of parallel componential processing of multi-symbol numbers. These results are discussed in light of concurrent theoretical notions about negative number processing. On the basis of the present results, we propose a generalized integrated model framework of parallel componential multi-symbol processing.

  18. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting results to file. Modularity is achieved through well-defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead, "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotation and scaling of vectors, and model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search and rescue objects.
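The modular "reader" idea described above can be sketched as follows: the trajectory code asks a reader for CF-named variables and does not care whether they come from a NetCDF file, an OPeNDAP URL, or a constant field. This is a simplified stand-in for the design pattern, not OpenDrift's actual class hierarchy.

```python
from abc import ABC, abstractmethod

class Reader(ABC):
    """Any source of forcing data: a file, a web service, or a constant field."""
    @abstractmethod
    def get_variables(self, names, lon, lat, time):
        ...

class ConstantReader(Reader):
    """Simplest possible reader: spatially uniform forcing fields."""
    def __init__(self, fields):
        self.fields = fields  # e.g. {'x_sea_water_velocity': 0.1}
    def get_variables(self, names, lon, lat, time):
        return {n: self.fields[n] for n in names}

def advect(lon, lat, reader, dt):
    """One explicit Euler step; ~111 km per degree on a rough sphere."""
    env = reader.get_variables(['x_sea_water_velocity', 'y_sea_water_velocity'],
                               lon, lat, time=None)
    deg_per_m = 1.0 / 111000.0
    return (lon + env['x_sea_water_velocity'] * dt * deg_per_m,
            lat + env['y_sea_water_velocity'] * dt * deg_per_m)

reader = ConstantReader({'x_sea_water_velocity': 0.5, 'y_sea_water_velocity': 0.0})
lon, lat = advect(4.0, 60.0, reader, dt=3600.0)
```

Because transport code only ever talks to the `Reader` interface, swapping a constant field for an operational ocean model requires no change to the process modules.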

  19. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
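The accumulation/wash-off/die-off bookkeeping described above can be sketched with generic exponential forms. The rate constants and functional forms below are common water-quality modeling assumptions chosen for illustration, not the calibrated IHACRES configuration.

```python
import math

def step_bacteria(store, rain_eff, dt, accum=1.0e9, k_die=0.1, k_wash=0.5):
    """Advance the landscape bacteria store (CFU) by one time step.

    accum : build-up rate on the landscape [CFU/day]
    k_die : first-order die-off rate [1/day]
    k_wash: wash-off rate per mm of effective rainfall [1/mm]
    Returns (new_store, load_exported_to_coast).
    """
    store = store * math.exp(-k_die * dt) + accum * dt   # die-off, then build-up
    washed_frac = 1.0 - math.exp(-k_wash * rain_eff)     # event wash-off fraction
    load = store * washed_frac
    return store - load, load

store = 5.0e9
store, load_dry = step_bacteria(store, rain_eff=0.0, dt=1.0)   # dry day: no export
store, load_wet = step_bacteria(store, rain_eff=10.0, dt=1.0)  # storm: big export
```

The exported load would then seed the Lagrangian particles tracked by the coastal hydrodynamic model.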

  20. Developing a utility decision framework to evaluate predictive models in breast cancer risk estimation

    PubMed Central

    Wu, Yirong; Abbey, Craig K.; Chen, Xianqiao; Liu, Jie; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-01-01

Abstract. Combining imaging and genetic information to predict disease presence and progression is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics have not been well established. We aim to develop a decision framework based on utility analysis to assess predictive models for breast cancer diagnosis. We garnered Gail risk factors, single nucleotide polymorphisms (SNPs), and mammographic features from a retrospective case-control study. We constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + Mammo, and (3) Gail + Mammo + SNP. Then we generated receiver operating characteristic (ROC) curves for the three models. After we assigned utility values for each category of outcomes (true negatives, false positives, false negatives, and true positives), we pursued optimal operating points on the ROC curves to achieve maximum expected utility of breast cancer diagnosis. We performed McNemar’s test based on threshold levels at optimal operating points, and found that SNPs and mammographic features played a significant role in breast cancer risk estimation. Our study comprising utility analysis and McNemar’s test provides a decision framework to evaluate predictive models in breast cancer risk estimation. PMID:26835489
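The utility analysis described above amounts to sweeping candidate operating points on an ROC curve and keeping the one with maximum expected utility, weighted by disease prevalence. The ROC points, utility values, and prevalence below are illustrative assumptions, not the study's data.

```python
def expected_utility(tpf, fpf, prev, u_tp, u_fn, u_fp, u_tn):
    """Average utility per case at one operating point (tpf = sensitivity)."""
    return (prev * (tpf * u_tp + (1 - tpf) * u_fn)
            + (1 - prev) * (fpf * u_fp + (1 - fpf) * u_tn))

def optimal_operating_point(roc, prev, u_tp, u_fn, u_fp, u_tn):
    """roc: list of (fpf, tpf) pairs; returns the utility-maximizing pair."""
    return max(roc, key=lambda pt: expected_utility(pt[1], pt[0], prev,
                                                    u_tp, u_fn, u_fp, u_tn))

# Hypothetical ROC curve and utilities: missed cancers (u_fn) are very costly.
roc = [(0.0, 0.0), (0.1, 0.6), (0.3, 0.85), (0.6, 0.95), (1.0, 1.0)]
best = optimal_operating_point(roc, prev=0.01,
                               u_tp=100.0, u_fn=-500.0, u_fp=-5.0, u_tn=0.0)
```

With a rare disease and a heavy penalty on false negatives, the optimum sits at a fairly sensitive point rather than at the ROC corner nearest (0, 1).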

  1. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
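The framework's core assumptions can be sketched numerically: each particle of radius R maps to a cylindrical pore whose radius is a fixed fraction of R, pore volume is particle volume times the void ratio, and matric pressure follows the capillary law. The constants, the pore-to-particle radius ratio, and the three-fraction "soil" below are illustrative assumptions, not the paper's calibrated models.

```python
SIGMA = 0.072      # surface tension of water [N/m]
RHO_G = 9810.0     # unit weight of water [N/m^3]

def matric_head(pore_radius):
    """Capillary law: pressure head inversely proportional to pore radius [m]."""
    return -2.0 * SIGMA / (RHO_G * pore_radius)

def retention_curve(radii, mass_fracs, void_ratio=0.6, pore_ratio=0.3):
    """Return (head, water content) pairs, draining the largest pores first."""
    porosity = void_ratio / (1.0 + void_ratio)
    pairs = sorted(zip(radii, mass_fracs), reverse=True)  # largest particles first
    theta, curve = porosity, []
    for r, f in pairs:
        theta -= f * porosity                # pores tied to this fraction drain
        curve.append((matric_head(pore_ratio * r), theta))
    return curve

# Hypothetical sand/silt/clay-sized particle radii [m] and mass fractions.
curve = retention_curve([1e-4, 1e-5, 1e-6], [0.5, 0.3, 0.2])
```

Each textural fraction contributes one point on the retention curve, which is the sense in which the ψ-R relation defines a model.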

  2. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence for student performance for haptic versus traditional preparation methods was established, thus establishing the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  3. Catchment travel and residence time distributions: a theoretical framework for solute transport modeling

    NASA Astrophysics Data System (ADS)

    Botter, G.; Bertuzzo, E.; Rinaldo, A.

    2011-12-01

The probability density functions (pdf's) of travel and residence times are key descriptors of the mechanisms through which catchments retain and release old and event water, transporting solutes to receiving water bodies. In this contribution we derive a general stochastic framework applicable to arbitrary catchment control volumes, where time-variable precipitation, evapotranspiration and discharge are assumed to be the major hydrological drivers for water and solutes. A master equation for the residence time pdf is derived and solved analytically, providing expressions for travel and residence time pdf's as a function of input/output fluxes and of the relevant mixing processes occurring along streamflow production and plant uptake. Our solutions suggest intrinsically time-variant travel and residence time pdf's through a direct dependence on the underlying hydrological forcings and soil vegetation dynamics. The proposed framework highlights the dependence of water/solute travel times on eco-hydrological processes (especially transpiration and uptake), and integrates age-dating and tracer hydrology techniques by providing a coherent framework for catchment transport models. An application to the release of pesticides from an agricultural watershed is also discussed.
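A minimal special case of the framework above: for a well-mixed store with steady storage S and steady outflow Q, the master equation reduces to an exponential travel-time pdf p(T) = (Q/S) exp(-QT/S) with mean S/Q. Time-variable forcing is what makes the pdfs time-variant; this steady sketch only illustrates the machinery, and the storage and flux values are made up.

```python
import math

def travel_time_pdf(T, S, Q):
    """Steady-state, well-mixed (random-sampling) travel time density [1/day]."""
    return (Q / S) * math.exp(-Q * T / S)

def mean_travel_time(S, Q):
    """Mean travel time is simply the turnover time S/Q."""
    return S / Q

S, Q = 200.0, 10.0                      # storage [mm], outflow [mm/day]
tau = mean_travel_time(S, Q)            # 20 days
# the pdf integrates to ~1 over a long enough window (left Riemann sum)
dt = 0.5
area = sum(travel_time_pdf(i * dt, S, Q) * dt for i in range(2000))
```

Replacing the constant S and Q with time series of storage and fluxes is what turns this closed form into the time-variant pdfs of the full framework.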

  4. Uncertainty Quantification for the OCO-2 Mission: A Monte Carlo Framework Using a Surrogate Model

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.; Cressie, N.; Gunson, M. R.; Granat, R.; Brynjarsdottir, J.; Nguyen, H.; Fu, D.; Teixeira, J.

    2015-12-01

The OCO-2 retrieval algorithm is based on the optimal estimation (OE) framework of Rodgers (2000): it applies Bayes' Theorem to the problem of inferring geophysical states from soundings observed by the instrument. The algorithm ingests a sounding and outputs the putative posterior mean vector and covariance matrix that, under the assumption of Gaussianity, give a probabilistic description of the underlying state vector. The algorithm also uses a host of other inputs that are treated as fixed and known when at least some of them are not. Thus, some additional uncertainty beyond that captured by the posterior covariance is imparted to the algorithm output. The OCO-2 Team is implementing a framework to quantify this additional uncertainty by simulating a representative set of synthetic state vectors, generating corresponding synthetic soundings, performing retrievals on them, and comparing the retrievals to the original synthetic truth. A simplified forward model is used both to generate the soundings and in the retrieval in order to generate a large ensemble of results in a short period of time. The empirical joint distribution of the synthetic true and retrieved state vectors can be interrogated to provide estimates of the impact of uncertain inputs on the retrieval. In this talk, we describe this framework, its rationale, and how it will ultimately provide users with adjustments that more fully account for uncertainty.
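A scalar toy version of the Monte Carlo framework described above: simulate synthetic truths, generate soundings with a simple linear forward model, retrieve with an OE-style estimator whose fixed offset input is slightly wrong, and measure the systematic error this imparts, which the posterior covariance alone would miss. All numbers and the one-dimensional setup are illustrative assumptions.

```python
import random

random.seed(0)
K, B_TRUE, B_ASSUMED = 2.0, 1.0, 1.2      # forward model y = K*x + b + noise
PRIOR_MEAN, PRIOR_VAR, NOISE_VAR = 0.0, 4.0, 0.25

def retrieve(y):
    """Scalar optimal-estimation posterior mean under the *assumed* offset."""
    g = PRIOR_VAR * K / (K * PRIOR_VAR * K + NOISE_VAR)   # Kalman-style gain
    return PRIOR_MEAN + g * (y - (K * PRIOR_MEAN + B_ASSUMED))

errors = []
for _ in range(5000):
    x_true = random.gauss(PRIOR_MEAN, PRIOR_VAR ** 0.5)   # synthetic truth
    y = K * x_true + B_TRUE + random.gauss(0.0, NOISE_VAR ** 0.5)
    errors.append(retrieve(y) - x_true)

# Systematic bias ~ -gain * (B_ASSUMED - B_TRUE), invisible to the posterior
bias = sum(errors) / len(errors)
```

Interrogating the joint distribution of truth and retrieval, rather than trusting the reported covariance, is exactly what exposes this extra error term.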

  5. Atomic charges for modeling metal–organic frameworks: Why and how

    SciTech Connect

    Hamad, Said Balestra, Salvador R.G.; Bueno-Perez, Rocio; Calero, Sofia; Ruiz-Salvador, A. Rabdel

    2015-03-15

Atomic partial charges are parameters of key importance in the simulation of Metal–Organic Frameworks (MOFs), since Coulombic interactions decrease with distance more slowly than van der Waals interactions. Despite their relevance, there is no method to unambiguously assign charges to each atom, since atomic charges are not quantum observables. There are several methods that allow the calculation of atomic charges, most of them starting from the electronic wavefunction or the electronic density of the system, as obtained with quantum mechanics calculations. In this work, we describe the most common methods employed to calculate atomic charges in MOFs. In order to show the influence that even small variations of structure have on atomic charges, we present the results that we obtained for DMOF-1. We also discuss the effect that small variations of atomic charges have on the predicted structural properties of IRMOF-1. - Graphical abstract: We review the different methods with which to calculate atomic partial charges that can be used in force field-based calculations. We also present two examples that illustrate the influence of the geometry on the calculated charges and the influence of the charges on structural properties. - Highlights: • The choice of atomic charges is crucial in modeling adsorption and diffusion in MOFs. • Methods for calculating atomic charges in MOFs are reviewed. • We discuss the influence of the framework geometry on the calculated charges. • We discuss the influence of the framework charges on the structural properties.
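The opening point, that Coulombic interactions decay far more slowly than van der Waals interactions, is easy to see numerically: the electrostatic term goes as 1/r while dispersion goes as 1/r^6. The dimensionless units below are for illustration only.

```python
def coulomb(r):
    """Electrostatic interaction, proportional to 1/r (dimensionless units)."""
    return 1.0 / r

def dispersion(r):
    """Attractive van der Waals (dispersion) term, proportional to 1/r^6."""
    return 1.0 / r ** 6

ratio_close = coulomb(1.0) / dispersion(1.0)   # equal at r = 1 by construction
ratio_far = coulomb(10.0) / dispersion(10.0)   # Coulomb dominates by 10^5
```

At distances typical of framework pores, electrostatics therefore still contributes when dispersion has effectively vanished, which is why the charge assignment matters so much.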

  6. Framework for modeling urban restoration resilience time in the aftermath of an extreme event

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor

    2015-01-01

    The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.

  7. A Human Sensor Network Framework in Support of Near Real Time Situational Geophysical Modeling

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Price, A.; Smith, J. A.; Halem, M.

    2013-12-01

The area of Disaster Management is well established among Federal Agencies such as FEMA, EPA, NOAA and NASA. These agencies have well formulated frameworks for response and mitigation based on near real time satellite and conventional observing networks for assimilation into geophysical models. Forecasts from these models are used to communicate with emergency responders and the general public. More recently, agencies have started using social media to broadcast warnings and alerts to potentially affected communities. In this presentation, we demonstrate the added benefits of mining and assimilating the vast amounts of social media data available from heterogeneous hand-held devices and social networks into established operational geophysical modeling frameworks as they apply to the five cornerstones of disaster management - Prevention, Mitigation, Preparedness, Response and Recovery. Often, in situations of extreme events, social media provide the earliest notification of adverse extreme events. However, various forms of social media data can also provide useful geolocated and time-stamped in situ observations, complementary to directly sensed conventional observations. We use the concept of a Human Sensor Network, where one views social media users as carrying field-deployed "sensors" whose posts are the remotely sensed "instrument measurements." These measurements can act as "station data" providing the resolution and coverage needed for extreme-event-specific modeling and validation. Here, we explore the use of social media through a Human Sensor Network (HSN) approach as another data input source for assimilation into geophysical models. Employing the HSN paradigm can provide useful feedback in near real time, but presents software challenges for rapid access, quality filtering, and transforming massive social media data into formats consistent with the operational models. As a use case scenario, we demonstrate the value of HSN for disaster management.

  8. Elementary metabolite units (EMU): a novel framework for modeling isotopic distributions.

    PubMed

    Antoniewicz, Maciek R; Kelleher, Joanne K; Stephanopoulos, Gregory

    2007-01-01

Metabolic flux analysis (MFA) has emerged as a tool of great significance for metabolic engineering and mammalian physiology. An important limitation of MFA, as carried out via stable isotope labeling and GC/MS and nuclear magnetic resonance (NMR) measurements, is the large number of isotopomer or cumomer equations that need to be solved, especially when multiple isotopic tracers are used for the labeling of the system. This restriction reduces the ability of MFA to fully utilize the power of multiple isotopic tracers in elucidating the physiology of realistic situations comprising complex bioreaction networks. Here, we present a novel framework for the modeling of isotopic labeling systems that significantly reduces the number of system variables without any loss of information. The elementary metabolite unit (EMU) framework is based on a highly efficient decomposition method that identifies the minimum amount of information needed to simulate isotopic labeling within a reaction network using the knowledge of atomic transitions occurring in the network reactions. The functional units generated by the decomposition algorithm, called EMUs, form the new basis for generating system equations that describe the relationship between fluxes and stable isotope measurements. Isotopomer abundances simulated using the EMU framework are identical to those obtained using the isotopomer and cumomer methods, but require significantly less computation time. For a typical (13)C-labeling system the total number of equations that needs to be solved is reduced by one order of magnitude (100s of EMUs vs. 1000s of isotopomers). As such, the EMU framework is most efficient for the analysis of labeling by multiple isotopic tracers. For example, analysis of the gluconeogenesis pathway with (2)H, (13)C, and (18)O tracers requires only 354 EMUs, compared to more than two million isotopomers.
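A core operation in the EMU framework sketched from the description above: when two EMUs condense, the mass isotopomer distribution (MID) of the product is the discrete convolution of the reactant MIDs. The example MIDs below are hypothetical, chosen only to show the bookkeeping.

```python
def convolve_mid(a, b):
    """MID of a condensation product: discrete convolution of reactant MIDs."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj        # mass i atoms + mass j atoms -> i+j
    return out

mid_2c = [0.5, 0.5]           # hypothetical 2-atom EMU: 50% M+0, 50% M+1
mid_1c = [0.9, 0.1]           # hypothetical 1-atom EMU
mid_3c = convolve_mid(mid_2c, mid_1c)
total = sum(mid_3c)           # MIDs stay normalized under convolution
```

Tracking only the MIDs of the EMUs that measurements actually depend on, rather than all isotopomers, is what yields the order-of-magnitude reduction in variables.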

  9. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    NASA Technical Reports Server (NTRS)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each one's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts is used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  10. Distribution-enhanced homogenization framework and model for heterogeneous elasto-plastic problems

    NASA Astrophysics Data System (ADS)

    Alleman, Coleman; Luscher, D. J.; Bronkhorst, Curt; Ghosh, Somnath

    2015-12-01

    Multi-scale computational models offer tractable means to simulate sufficiently large spatial domains comprised of heterogeneous materials by resolving material behavior at different scales and communicating across these scales. Within the framework of computational multi-scale analyses, hierarchical models enable unidirectional transfer of information from lower to higher scales, usually in the form of effective material properties. Determining explicit forms for the macroscale constitutive relations for complex microstructures and nonlinear processes generally requires numerical homogenization of the microscopic response. Conventional low-order homogenization uses results of simulations of representative microstructural domains to construct appropriate expressions for effective macroscale constitutive parameters written as a function of the microstructural characterization. This paper proposes an alternative novel approach, introduced as the distribution-enhanced homogenization framework or DEHF, in which the macroscale constitutive relations are formulated in a series expansion based on the microscale constitutive relations and moments of arbitrary order of the microscale field variables. The framework does not make any a priori assumption on the macroscale constitutive behavior being represented by a homogeneous effective medium theory. Instead, the evolution of macroscale variables is governed by the moments of microscale distributions of evolving field variables. This approach demonstrates excellent accuracy in representing the microscale fields through their distributions. An approximate characterization of the microscale heterogeneity is accounted for explicitly in the macroscale constitutive behavior. Increasing the order of this approximation results in increased fidelity of the macroscale approximation of the microscale constitutive behavior. 
By including higher-order moments of the microscale fields in the macroscale problem, micromechanical analyses do
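The DEHF idea described above can be illustrated numerically: write the macroscale response in terms of moments of the microscale field, and note that including even the second moment improves on a homogeneous-effective-medium estimate that uses only the mean. The quadratic microscale law and the strain sample below are illustrative assumptions.

```python
def micro_stress(eps):
    """Hypothetical nonlinear microscale constitutive law."""
    return 10.0 * eps + 200.0 * eps ** 2

strains = [0.00, 0.01, 0.02, 0.03, 0.04]     # heterogeneous microscale field
mean = sum(strains) / len(strains)
var = sum((e - mean) ** 2 for e in strains) / len(strains)

exact = sum(micro_stress(e) for e in strains) / len(strains)  # true average stress
order0 = micro_stress(mean)                   # effective-medium (mean-only) estimate
order2 = order0 + 0.5 * 400.0 * var           # + (1/2) f''(mean) * Var, second moment
```

For this quadratic law the two-moment expansion recovers the exact average, while the mean-only estimate is biased; for more general laws, each added moment raises the fidelity, mirroring the series expansion in the framework.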

  11. Quantitative Hydrogeological Framework Interpretations from Modeling Helicopter Electromagnetic Survey Data, Nebraska Panhandle

    NASA Astrophysics Data System (ADS)

    Abraham, J. D.; Ball, L. B.; Bedrosian, P. A.; Cannia, J. C.; Deszcz-Pan, M.; Minsley, B. J.; Peterson, S. M.; Smith, B. D.

    2009-12-01

The need for allocation and management of water resources within the state of Nebraska has created a demand for innovative approaches to data collection for development of hydrogeologic frameworks to be used for 2D and 3D groundwater models. In 2008, the USGS in cooperation with the North Platte Natural Resources District, the South Platte Natural Resources District, and the University of Nebraska Conservation and Survey Division began using frequency domain helicopter electromagnetic (HEM) surveys to map selected sections of the Nebraska Panhandle. The surveys took place in selected sections of the North Platte River valley, Lodgepole Creek, and portions of the adjacent tablelands. The objective of the surveys is to map the aquifers of the area to improve understanding of the groundwater-surface water relationships and develop better hydrogeologic frameworks used in making more accurate 3D groundwater models of the area. For the HEM method to have an impact in a groundwater model at the basin scale, hydrostratigraphic units need to have detectable physical property (electrical resistivity) contrasts. When these contrasts exist within the study area and are detectable from an airborne platform, large areas can be surveyed to rapidly generate 2D and 3D maps and models of 3D hydrogeologic features. To make the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to produce a depth-dependent physical property data set reflecting hydrogeologic features. These maps and depth images of electrical resistivity in themselves are not useful for the hydrogeologist; they need to be turned into maps and depth images of the hydrostratigraphic units and hydrogeologic features. Through a process of numerical imaging, inversion, sensitivity analysis, geological ground truthing (boreholes), and geological interpretation, hydrogeologic features are characterized. Resistivity depth sections produced from this process are used to pick

  12. Mapping Spatial Variability of Soil Moisture in a Semi-distributed Hydrologic Modelling Framework

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    The Soil Moisture and Runoff simulation Toolkit (SMART) is a computationally efficient semi-distributed hydrological modelling framework developed for water balance simulations at a catchment scale. The modelling framework is based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs) and distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. HRUs are delineated in each first order sub-basin based on topographic and geomorphic analysis of the entire catchment. A 2-d distributed hydrological model based on the Richards' equation performs water balance simulations across a series of ECSs formulated by aggregating topographic and physiographic properties of the part or entire first order sub-basins. Delineation of ECSs has the advantage of reducing computational time while maintaining reasonable accuracy in simulated fluxes and states. While HRU level soil moisture is well approximated in the ECS formulation compared to the distributed modelling approaches, spatial variability of soil moisture within a given HRU inside an ECS is ignored. In this study, we developed a disaggregation scheme for soil moisture distribution within every ECS formulated in a first order sub-basin. The statistical disaggregation scheme is developed based on soil moisture simulations of the Baldry sub-catchment, Australia using the integrated land surface-groundwater model, ParFlow.CLM. ParFlow is a variably saturated flow model that solves the 3D Richards' equation for the sub-surface and it is coupled to the Common Land Model (CLM). The disaggregation scheme preserves the mean sub-basin soil moisture and maintains temporal correlation of simulated daily soil moisture. Our preliminary results illustrate that the spatial disaggregation scheme can approximate spatially distributed soil moisture field produced by ParFlow.CLM at 60 m resolution. In addition, the

  13. The Earth System Modeling Framework and Earth System Curator: Software Components as Building Blocks of Community

    NASA Astrophysics Data System (ADS)

    Deluca, C.; Balaji, V.; da Silva, A.; Dunlap, R.; Hill, C.; Mark, L.; Mechoso, C. R.; Middleton, D.; Nikonov, S.; Rugaber, S.; Suarez, M.

    2006-05-01

    The Earth System Modeling Framework (ESMF) is an established U.S. initiative to develop high performance common modeling infrastructure for climate and weather models. ESMF is the technical foundation for the NASA Modeling, Analysis, and Prediction (MAP) Climate Variability and Change program and the DoD Battlespace Environments Institute (BEI). It has been incorporated into the Community Climate System Model (CCSM), the Weather Research and Forecast (WRF) Model, NOAA NCEP and GFDL models, Army, Navy, and Air Force models, and many others. The new, NSF-funded Earth System Curator is a related database and toolkit that will store information about model configurations, prepare models for execution, and run them locally or in a distributed fashion. The key concept that underlies both ESMF and the Earth System Curator is that of software components. Components are software units that are "composable", meaning they can be combined to form coupled applications. These components may be representations of physical domains, such as atmospheres or oceans; processes within particular domains such as atmospheric radiation or chemistry; or computational functions, such as data assimilation or I/O. ESMF provides interfaces, an architecture, and tools for structuring components hierarchically to form complex, coupled modeling applications. The Earth System Curator will enable modelers to describe, archive, search, compose, and run ESMF and similar components. Together these projects encourage a new paradigm for modeling: one in which the community can draw from a federation of many interoperable components in order to create and deploy applications. The goal is to enable a network of collaborations and new scientific opportunities for the Earth modeling community.

  14. A hierarchical framework for the multiscale modeling of microstructure evolution in heterogeneous materials.

    SciTech Connect

    Luscher, Darby J.

    2010-04-01

All materials are heterogeneous at various scales of observation. The influence of material heterogeneity on nonuniform response and microstructure evolution can have a profound impact on continuum thermomechanical response at macroscopic “engineering” scales. In many cases, it is necessary to treat this behavior as a multiscale process, integrating the physical understanding of material behavior at various physical (length and time) scales in order to more accurately predict the thermomechanical response of materials as their microstructure evolves. The intent of the dissertation is to provide a formal framework for multiscale hierarchical homogenization to be used in developing constitutive models.

  15. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertainty about their opponents' preferences and goals. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, derived from a stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework. PMID:26543899
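The multistage decision idea can be sketched as backward induction over a small stochastic decision tree (all payoffs, probabilities, and names below are invented for illustration and are not the paper's actual model):

```python
# Illustrative multistage decision sketch (hypothetical numbers, not the
# paper's model): at each negotiation stage, pick the action whose expected
# value over opponent responses, computed by backward induction over the
# remaining stages, is highest.

STAGES = 3
ACTIONS = ("concede", "hold")
# P(opponent accepts | our action); rejection moves to the next stage,
# and reaching the deadline yields nothing (break-off).
P_ACCEPT = {"concede": 0.9, "hold": 0.5}
PAYOFF = {"concede": 0.6, "hold": 1.0}  # revenue if the opponent accepts

def expected_value(stage):
    """Best achievable expected revenue from this stage onward."""
    if stage == STAGES:
        return 0.0  # deadline reached: no agreement
    future = expected_value(stage + 1)
    return max(P_ACCEPT[a] * PAYOFF[a] + (1 - P_ACCEPT[a]) * future
               for a in ACTIONS)

def best_action(stage):
    """Heuristic decision at this stage of the decision tree."""
    future = expected_value(stage + 1)
    return max(ACTIONS,
               key=lambda a: P_ACCEPT[a] * PAYOFF[a] + (1 - P_ACCEPT[a]) * future)

print(best_action(0))  # 'hold' with these toy numbers
```

With these numbers, holding early is worth the risk because later stages still carry value, while at the final stage (where rejection means break-off) conceding becomes optimal, illustrating how the stage-dependent policy trades revenue against break-off risk.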

  16. A computational framework for 3D mechanical modeling of plant morphogenesis with cellular resolution.

    PubMed

    Boudon, Frédéric; Chopard, Jérôme; Ali, Olivier; Gilles, Benjamin; Hamant, Olivier; Boudaoud, Arezki; Traas, Jan; Godin, Christophe

    2015-01-01

The link between genetic regulation and the definition of form and size during morphogenesis remains largely an open question in both plant and animal biology. This is partially due to the complexity of the process, involving extensive molecular networks, multiple feedbacks between different scales of organization and physical forces operating at multiple levels. Here we present a conceptual and modeling framework aimed at generating an integrated understanding of morphogenesis in plants. This framework is based on the biophysical properties of plant cells, which are under high internal turgor pressure, and are prevented from bursting because of the presence of a rigid cell wall. To control cell growth, the underlying molecular networks must interfere locally with the elastic and/or plastic extensibility of this cell wall. We present a model in the form of a three-dimensional (3D) virtual tissue, where growth depends on the local modulation of wall mechanical properties and turgor pressure. The model shows how forces generated by turgor pressure can act both cell autonomously and non-cell autonomously to drive growth in different directions. We use simulations to explore lateral organ formation at the shoot apical meristem. Although different scenarios lead to similar shape changes, they are not equivalent and lead to different, testable predictions regarding the mechanical and geometrical properties of the growing lateral organs. Using flower development as an example, we further show how a limited number of gene activities can explain the complex shape changes that accompany organ outgrowth.
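A minimal sketch of the turgor-driven growth relation the abstract describes, in the spirit of the classic Lockhart equation rather than the paper's full 3D tissue model (all numeric values are illustrative):

```python
# Hedged sketch of turgor-driven growth (Lockhart-style relation, not the
# paper's 3D finite-element model): the relative growth rate is the wall's
# plastic extensibility times the turgor pressure in excess of a yield
# threshold; below the threshold the wall does not yield and growth is zero.

def growth_rate(turgor, extensibility, yield_threshold):
    """Relative volumetric growth rate; zero below the yield threshold."""
    return max(0.0, extensibility * (turgor - yield_threshold))

# Locally softening the wall (raising extensibility) speeds growth at
# fixed turgor, which is how the molecular networks steer growth locally:
print(growth_rate(turgor=0.6, extensibility=0.2, yield_threshold=0.3))  # ~0.06
print(growth_rate(turgor=0.6, extensibility=0.4, yield_threshold=0.3))  # ~0.12
print(growth_rate(turgor=0.2, extensibility=0.4, yield_threshold=0.3))  # 0.0
```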

  17. A Computational Framework for 3D Mechanical Modeling of Plant Morphogenesis with Cellular Resolution

    PubMed Central

    Gilles, Benjamin; Hamant, Olivier; Boudaoud, Arezki; Traas, Jan; Godin, Christophe

    2015-01-01

    The link between genetic regulation and the definition of form and size during morphogenesis remains largely an open question in both plant and animal biology. This is partially due to the complexity of the process, involving extensive molecular networks, multiple feedbacks between different scales of organization and physical forces operating at multiple levels. Here we present a conceptual and modeling framework aimed at generating an integrated understanding of morphogenesis in plants. This framework is based on the biophysical properties of plant cells, which are under high internal turgor pressure, and are prevented from bursting because of the presence of a rigid cell wall. To control cell growth, the underlying molecular networks must interfere locally with the elastic and/or plastic extensibility of this cell wall. We present a model in the for