Science.gov

Sample records for quark-parton model framework

  1. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    NASA Astrophysics Data System (ADS)

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 (up to ≈7 GeV2) and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  2. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    DOE PAGES Beta

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; et al

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  3. Fermi-Dirac distributions for quark partons

    NASA Astrophysics Data System (ADS)

    Bourrely, C.; Buccella, F.; Miele, G.; Migliore, G.; Soffer, J.; Tibullo, V.

    1994-09-01

    We propose to use Fermi-Dirac distributions for quark and antiquark partons. This allows a fair description of the x-dependence of the very recent NMC data on the proton and neutron structure functions F2^p(x) and F2^n(x) at Q2 = 4 GeV2, as well as the CCFR antiquark distribution x q̄(x). We show that one can also use a corresponding Bose-Einstein expression to describe consistently the gluon distribution. The Pauli exclusion principle, which has been invoked to explain the flavor asymmetry of the light-quark sea of the proton, is advocated to guide us in making a simple construction of the polarized parton distributions. We predict the spin-dependent structure functions g1^p(x) and g1^n(x) in good agreement with EMC and SLAC data. The quark distributions involve some parameters whose values support well the hypothesis that the violation of the quark parton model sum rules is a consequence of the Pauli principle.
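
    The statistical shapes the abstract describes can be sketched numerically. The functional forms, parameter names, and values below are illustrative placeholders for this sketch, not the fitted distributions of the paper.

```python
import math

def fermi_dirac_pdf(x, x_bar=0.1, temp=0.1, norm=1.0):
    """Illustrative Fermi-Dirac shape in Bjorken x for a quark parton.

    x_bar plays the role of a thermodynamic potential and temp of a
    temperature in x-space; both values are placeholders, not fits.
    """
    return norm / (math.exp((x - x_bar) / temp) + 1.0)

def bose_einstein_gluon(x, temp=0.1, norm=1.0):
    """Corresponding Bose-Einstein shape suggested for the gluon."""
    return norm / (math.exp(x / temp) - 1.0)

# The quark shape saturates below x_bar and falls off smoothly above it,
# the qualitative behavior a Pauli-principle-shaped sea requires.
```

    At x = x_bar the Fermi-Dirac shape takes exactly half its normalization, the usual midpoint of the distribution.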

  4. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    SciTech Connect

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.
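
    At leading order, the charged-pion ratios referred to above follow from charge-squared weighting of quark distributions combined with favored and unfavored fragmentation functions. The sketch below shows the structure of the π+/π− ratio off a proton; all functional forms are toy placeholders, not fitted PDFs or fragmentation functions.

```python
# Leading-order quark-parton-model sketch of the semi-inclusive
# pi+/pi- ratio off a proton:
#   R = (4 u(x) D_fav(z) + d(x) D_unfav(z)) / (4 u(x) D_unfav(z) + d(x) D_fav(z))
# The factors of 4 come from the squared quark charges (2/3)^2 : (1/3)^2.

def u(x):
    return 2.0 * (1 - x) ** 3 / x ** 0.5      # toy u-quark distribution

def d(x):
    return 1.0 * (1 - x) ** 4 / x ** 0.5      # toy d-quark distribution

def D_fav(z):
    return (1 - z) ** 1.2 / z                 # toy favored fragmentation

def D_unfav(z):
    return (1 - z) ** 2.0 / z                 # toy unfavored fragmentation

def pip_pim_ratio(x, z):
    num = 4 * u(x) * D_fav(z) + d(x) * D_unfav(z)
    den = 4 * u(x) * D_unfav(z) + d(x) * D_fav(z)
    return num / den
```

    Because favored fragmentation dominates as z → 1, the ratio grows with z, which is the qualitative trend such ratio data probe.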

  5. Pion and kaon valence-quark parton distribution functions

    NASA Astrophysics Data System (ADS)

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-01

    A rainbow-ladder truncation of QCD’s Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion’s u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  6. Pion and kaon valence-quark parton distribution functions.

    SciTech Connect

    Nguyen, T.; Bashir, A.; Roberts, C. D.; Tandy, P. C.

    2011-06-16

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  7. Pion and kaon valence-quark parton distribution functions

    SciTech Connect

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-15

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  8. Dicyanometallates as Model Extended Frameworks

    PubMed Central

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  9. Dicyanometallates as Model Extended Frameworks.

    PubMed

    Hill, Joshua A; Thompson, Amber L; Goodwin, Andrew L

    2016-05-11

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic-organic analogues of conventional ceramics, such as Ruddlesden-Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  10. Geologic Framework Model (GFM2000)

    SciTech Connect

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and the relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content.
The grid spacing used in the

  11. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  12. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiv...

  13. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed

  14. Sequentially Executed Model Evaluation Framework

    Energy Science and Technology Software Center (ESTSC)

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed

  15. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
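
    The driver and controller roles described above can be sketched as follows. The class and method names are illustrative assumptions for this sketch, not the actual SeMe or CANARY-EDS API.

```python
from abc import ABC, abstractmethod

class Driver(ABC):
    """Generic driver interface: input, model, and output drivers all
    expose a step() called once per iteration of the discrete domain."""
    @abstractmethod
    def step(self, t, message):
        ...

class MovingAverageModel(Driver):
    """Toy sequential model: evaluation combines prior results with the
    new datum for this iteration, as in the description above."""
    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.state = None

    def step(self, t, message):
        x = message["value"]
        self.state = x if self.state is None else self.alpha * x + (1 - self.alpha) * self.state
        return {"value": self.state}

class BatchController:
    """Steps a chain of drivers through the time domain, passing each
    driver's output message to the next driver in the chain."""
    def __init__(self, drivers):
        self.drivers = drivers

    def run(self, series):
        out = []
        for t, x in enumerate(series):
            message = {"value": x}
            for driver in self.drivers:
                message = driver.step(t, message)
            out.append(message["value"])
        return out

smoothed = BatchController([MovingAverageModel()]).run([1.0, 3.0, 5.0])
```

    The controller owns the time loop; the model driver only sees one message per iteration, which is what makes the pattern suitable for real-time as well as batch operation.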

  16. Sequentially Executed Model Evaluation Framework

    Energy Science and Technology Software Center (ESTSC)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  17. Pion valence-quark parton distribution function

    NASA Astrophysics Data System (ADS)

    Chang, Lei; Thomas, Anthony W.

    2015-10-01

    Within the Dyson-Schwinger equation formulation of QCD, a rainbow-ladder truncation is used to calculate the pion valence-quark distribution function (PDF). The gap equation is renormalized at a typical hadronic scale, of order 0.5 GeV, which is also set as the default initial scale for the pion PDF. We implement a corrected leading-order expression for the PDF which ensures that the valence-quarks carry all of the pion's light-front momentum at the initial scale. The scaling behavior of the pion PDF at a typical partonic scale of order 5.2 GeV is found to be (1 - x)^ν, with ν ≃ 1.6, as x approaches one.
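
    The quoted large-x behavior can be checked numerically: for a pure power law q(x) ∝ (1 − x)^ν, the effective exponent d ln q / d ln(1 − x) recovers ν. The normalization and the pure-power form below are assumptions made for this sketch only.

```python
import math

def pdf_large_x(x, nu=1.6, norm=1.0):
    """Assumed pure power-law large-x form q(x) = norm * (1 - x)**nu."""
    return norm * (1.0 - x) ** nu

def effective_exponent(x, dx=1e-6):
    """Extract nu numerically as d ln q / d ln(1 - x)."""
    q1, q2 = pdf_large_x(x), pdf_large_x(x + dx)
    return (math.log(q2) - math.log(q1)) / (math.log(1.0 - x - dx) - math.log(1.0 - x))
```

    This is the standard way fits quote an exponent: the log-log slope near the x → 1 endpoint.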

  18. Geologic Framework Model Analysis Model Report

    SciTech Connect

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  19. Reducing the invasiveness of modelling frameworks

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.

    2010-12-01

    There are several modelling frameworks available that allow environmental models to exchange data with other models. Many efforts have been made in past years promoting solutions aimed at integrating different numerical models with each other, as well as at simplifying the way to set them up, enter the data, and run them. While the development of many modelling frameworks concentrated on the interoperability of different model engines, several standards were introduced, such as ESMF, OMS and OpenMI. One of the issues with applying modelling frameworks is invasiveness: the more the model has to know about the framework, the more intrusive the framework is. Another issue when applying modelling frameworks is that many environmental models are written in a procedural style in FORTRAN, which is one of the few languages that does not have a proper interface with other programming languages. Most modelling frameworks are written in object-oriented languages like Java/C#, and the FORTRAN modelling framework ESMF is also object-oriented. In this research we show how the application of domain-driven, object-oriented development techniques to environmental models can reduce the invasiveness of modelling frameworks. Our approach is based on four different steps: 1) application of OO techniques and reflection to the existing model to allow introspection; 2) programming language interoperability between a model written in a procedural programming language and a modelling framework written in an object-oriented programming language; 3) domain mapping between data types used by the model and other components being integrated; 4) connecting models using a framework (wrapper). We compare coupling of an existing model as it was to the same model adapted using the four-step approach. We connect both versions of the model using two different integrated modelling frameworks. As an example of a model we use the coastal morphological model XBeach.
By adapting this model it allows for
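
    The four-step approach can be illustrated with a minimal wrapper sketch. Every class, field, and variable name below is a hypothetical stand-in, not the actual XBeach, ESMF, or OpenMI interface.

```python
class LegacyEngine:
    """Stand-in for a procedural FORTRAN engine; in practice step 2
    (language interoperability) would load it via f2py or ctypes.
    This class and its fields are hypothetical, not XBeach's interface."""
    def __init__(self):
        self.depth = [2.0, 1.5, 1.0]   # water depth per grid cell

    def advance(self, dt):
        # toy morphology update standing in for the real time step
        self.depth = [h - 0.01 * dt for h in self.depth]

class FrameworkWrapper:
    """Step 4: the only component that knows about the coupling framework,
    so the model itself stays framework-free (non-invasive)."""

    # step 3: domain mapping between framework and model variable names
    name_map = {"water_depth": "depth"}

    def __init__(self):
        self._engine = LegacyEngine()

    def get_value(self, framework_name):
        # step 1: reflection/introspection instead of edits to the model
        return getattr(self._engine, self.name_map[framework_name])

    def update(self, dt):
        self._engine.advance(dt)

wrapper = FrameworkWrapper()
wrapper.update(1.0)
```

    The design point is that only the wrapper would need to change if a different coupling framework were used; the engine itself remains untouched.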

  20. ATMOSPHERIC HEALTH EFFECTS FRAMEWORK (AHEF) MODEL

    EPA Science Inventory

    The Atmospheric and Health Effects Framework (AHEF) is used to assess the global impacts of substitutes for ozone-depleting substances (ODS). The AHEF is a series of FORTRAN modeling modules that collectively form a simulation framework for (a) translating ODS production into emi...

  1. The Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2011-01-01

    The G-DINA ("generalized deterministic inputs, noisy and gate") model is a generalization of the DINA model with more relaxed assumptions. In its saturated form, the G-DINA model is equivalent to other general models for cognitive diagnosis based on alternative link functions. When appropriate constraints are applied, several commonly used…

  2. Knowledge Encapsulation Framework for Collaborative Social Modeling

    SciTech Connect

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  3. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. 
The modeling information can also be exported to semantic web languages such
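
    The capture-then-generate workflow described above can be sketched in miniature: an information model held as data, from which documentation artifacts are produced. The entity names and output format are illustrative inventions, not the PDS or SPASE tooling.

```python
# Hypothetical captured information model: entities, attributes,
# and the relationships that add meaning (names are invented).
model = {
    "Observation": {
        "attributes": {"start_time": "ISO 8601 string", "instrument": "string"},
        "relationships": {"produced_by": "Instrument"},
    },
    "Instrument": {
        "attributes": {"name": "string"},
        "relationships": {},
    },
}

def data_dictionary(model):
    """Generate a plain-text data dictionary from the captured model,
    one of several artifacts such a framework could export."""
    lines = []
    for entity, spec in sorted(model.items()):
        lines.append(f"Entity: {entity}")
        for attr, typ in sorted(spec["attributes"].items()):
            lines.append(f"  attribute {attr}: {typ}")
        for rel, target in sorted(spec["relationships"].items()):
            lines.append(f"  relationship {rel} -> {target}")
    return "\n".join(lines)

print(data_dictionary(model))
```

    Because the model is data rather than prose, the same source can feed other exporters (schemas, diagrams, ontologies) without re-entry.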

  4. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  5. Modelling Diffusion of a Personalized Learning Framework

    ERIC Educational Resources Information Center

    Karmeshu; Raman, Raghu; Nedungadi, Prema

    2012-01-01

    A new modelling approach for diffusion of personalized learning as an educational process innovation in social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…

  6. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, typically several assumptions such as stationarity are made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that are immediately sensibly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments of transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understand and diagnose numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.

  7. DANA: distributed numerical and adaptive modelling framework.

    PubMed

    Rougier, Nicolas P; Fix, Jérémy

    2012-01-01

    DANA is a python framework (http://dana.loria.fr) whose computational paradigm is grounded on the notion of a unit that is essentially a set of time dependent values varying under the influence of other units via adaptive weighted connections. The evolution of a unit's value is defined by a set of differential equations expressed in standard mathematical notation which greatly eases their definition. The units are organized into groups that form a model. Each unit can be connected to any other unit (including itself) using a weighted connection. The DANA framework offers a set of core objects needed to design and run such models. The modeler only has to define the equations of a unit as well as the equations governing the training of the connections. The simulation is completely transparent to the modeler and is handled by DANA. This allows DANA to be used for a wide range of numerical and distributed models as long as they fit the proposed framework (e.g. cellular automata, reaction-diffusion systems, decentralized neural networks, recurrent neural networks, kernel-based image processing, etc.). PMID:22994650
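
    The unit/connection paradigm described can be sketched with a simple Euler discretization. The update rule and class names below are illustrative, not DANA's actual API.

```python
class Group:
    """Sketch of the DANA paradigm (not the real dana library): a group
    of units whose values evolve as dV_i/dt = -V_i + sum_j W[i][j]*V_j,
    integrated here with a plain Euler step."""
    def __init__(self, values, weights):
        self.V = list(values)   # time-dependent unit values
        self.W = weights        # weighted connections between units

    def step(self, dt=0.01):
        # each unit relaxes toward the weighted input it receives
        inputs = [sum(w * v for w, v in zip(row, self.V)) for row in self.W]
        self.V = [v + dt * (-v + i) for v, i in zip(self.V, inputs)]

g = Group([1.0, 0.5, 0.2], [[0.0, 0.1, 0.1],
                            [0.1, 0.0, 0.1],
                            [0.1, 0.1, 0.0]])
for _ in range(100):
    g.step()
# with weak connections the -V relaxation term dominates, so values decay
```

    In the real framework the modeler writes the differential equation in mathematical notation and the integration is handled automatically; this sketch only shows what such an update amounts to.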

  8. An evaluation framework for participatory modelling

    NASA Astrophysics Data System (ADS)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of the beneficial outcomes of participation suggested by these arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience, our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models, and we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics).
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  9. A framework for benchmarking land models

    SciTech Connect

    Luo, Yiqi; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, Philippe; Dalmonech, D.; Fisher, J.B.; Fisher, R.; Friedlingstein, P.; Hibbard, Kathleen A.; Hoffman, F. M.; Huntzinger, Deborah; Jones, C.; Koven, C.; Lawrence, David M.; Li, D.J.; Mahecha, M.; Niu, S.L.; Norby, Richard J.; Piao, S.L.; Qi, X.; Peylin, P.; Prentice, I.C.; Riley, William; Reichstein, M.; Schwalm, C.; Wang, Y.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-09

    Land models, which have been developed by the modeling community over the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedbacks to climate change. Benchmarking is an emerging procedure for measuring the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluating land model performance and highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate the exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks that effectively evaluate land model performance. The second challenge is to develop metrics for measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues to weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate.
The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
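
    The metrics described above can be made concrete with a small sketch: a weighted aggregate of normalized data-model mismatches compared against an a priori acceptability threshold. The variables, weights and threshold below are illustrative assumptions, not values from the paper.

```python
import math

def nrmse(obs, sim):
    """Root-mean-square error normalized by the observed mean."""
    n = len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)
    return rmse / (sum(obs) / n)

def benchmark_score(benchmarks, weights):
    """Weighted aggregate of per-process mismatches (lower is better)."""
    return sum(w * nrmse(obs, sim)
               for (obs, sim), w in zip(benchmarks, weights))

# Toy (observed, simulated) pairs for two processes at one site.
gpp = ([10.0, 12.0, 11.0], [9.0, 13.0, 11.5])  # e.g. gross primary production
et = ([3.0, 3.5, 4.0], [3.1, 3.4, 4.2])        # e.g. evapotranspiration
score = benchmark_score([gpp, et], weights=[0.5, 0.5])
acceptable = score < 0.1                       # a priori threshold (assumed)
```

    A real scoring system would also span multiple sites, time scales and variables, and weight processes by their importance and data quality.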

  10. A Framework for Considering Comprehensibility in Modeling.

    PubMed

    Gleicher, Michael

    2016-06-01

    Comprehensibility in modeling is the ability of stakeholders to understand relevant aspects of the modeling process. In this article, we provide a framework to help guide exploration of the space of comprehensibility challenges. We consider facets organized around key questions: Who is comprehending? Why are they trying to comprehend? Where in the process are they trying to comprehend? How can we help them comprehend? How do we measure their comprehension? With each facet we consider the broad range of options. We discuss why taking a broad view of comprehensibility in modeling is useful in identifying challenges and opportunities for solutions. PMID:27441712

  11. A framework for modeling rail transport vulnerability

    SciTech Connect

    Peterson, Steven K; Church, Richard L.

    2008-01-01

    Railroads represent one of the most efficient methods of long-haul transport for bulk commodities, from coal to agricultural products. Over the past fifty years, the rail network has contracted while tonnage has increased. Service, geographically, has been abandoned along short haul routes and increased along major long haul routes, resulting in a network that is more streamlined. The current rail network may be very vulnerable to disruptions, like the failure of a trestle. This paper proposes a framework to model rail network vulnerability and gives an application of this modeling framework in analyzing rail network vulnerability for the State of Washington. It concludes with a number of policy related issues that need to be addressed in order to identify, plan, and mitigate the risks associated with the sudden loss of a bridge or trestle.
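
    The kind of single-failure analysis the framework targets can be sketched as follows: enumerate links, remove each in turn, and test whether an origin loses its connection to a destination. The toy network and function names below are assumptions for illustration, not the authors' Washington State model.

```python
from collections import deque

def reachable(nodes, edges, start):
    """Breadth-first search over an undirected link list."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u] - seen:
            seen.add(v)
            queue.append(v)
    return seen

def critical_links(nodes, edges, origin, dest):
    """Links whose single failure disconnects origin from dest."""
    return [e for e in edges
            if dest not in reachable(nodes, [x for x in edges if x != e], origin)]

# Toy network: the ("C", "D") link is the lone trestle reaching node D.
nodes = ["A", "B", "C", "D"]
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
critical = critical_links(nodes, edges, "A", "D")  # only ("C", "D") qualifies
```

    Real rail vulnerability models would add tonnage, rerouting cost and capacity constraints on top of this connectivity core.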

  12. Density Estimation Framework for Model Error Assessment

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Liu, Z.; Najm, H. N.; Safta, C.; VanBloemenWaanders, B.; Michelsen, H. A.; Bambha, R.

    2014-12-01

    In this work we highlight the importance of model error assessment in physical model calibration studies. Conventional calibration methods often assume the model is perfect and account for data noise only. Consequently, the estimated parameters typically have biased values that implicitly compensate for model deficiencies. Moreover, improving the amount and the quality of data may not improve the parameter estimates since the model discrepancy is not accounted for. In state-of-the-art methods model discrepancy is explicitly accounted for by enhancing the physical model with a synthetic statistical additive term, which allows appropriate parameter estimates. However, these statistical additive terms do not increase the predictive capability of the model because they are tuned for particular output observables and may even violate physical constraints. We introduce a framework in which model errors are captured by allowing variability in specific model components and parameterizations for the purpose of achieving meaningful predictions that are both consistent with the data spread and appropriately disambiguate model and data errors. Here we cast model parameters as random variables, embedding the calibration problem within a density estimation framework. Further, we calibrate for the parameters of the joint input density. The likelihood function for the associated inverse problem is degenerate, therefore we use Approximate Bayesian Computation (ABC) to build prediction-constraining likelihoods and illustrate the strengths of the method on synthetic cases. We also apply the ABC-enhanced density estimation to the TransCom 3 CO2 intercomparison study (Gurney, K. R., et al., Tellus, 55B, pp. 555-579, 2003) and calibrate 15 transport models for regional carbon sources and sinks given atmospheric CO2 concentration measurements.
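
    As a hedged illustration of the rejection-ABC step mentioned above, the sketch below accepts parameter draws whose simulated summary statistic lies within a tolerance of the observed one. The toy Gaussian-mean problem stands in for the transport-model calibration; all names and settings are assumptions.

```python
import random

def simulate(theta, n=50, rng=random):
    """Toy forward model: n noisy observations around parameter theta."""
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def abc_rejection(observed, prior_draw, n_samples=2000, eps=0.1):
    """Keep prior draws whose simulated summary is within eps of the data."""
    obs_mean = sum(observed) / len(observed)
    accepted = []
    for _ in range(n_samples):
        theta = prior_draw()
        sim = simulate(theta)
        if abs(sum(sim) / len(sim) - obs_mean) < eps:  # distance on a summary statistic
            accepted.append(theta)
    return accepted

random.seed(0)
observed = simulate(2.0)  # synthetic "data" with true mean 2.0
posterior = abc_rejection(observed, lambda: random.uniform(0.0, 4.0))
post_mean = sum(posterior) / len(posterior)  # should concentrate near 2.0
```

    In the density-estimation framework of the abstract, the accepted draws characterize the joint input density rather than a single best-fit parameter.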

  13. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
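
    The core entropic-inference step can be made concrete: among all distributions over a set of discrete states, maximizing entropy subject to an expected-value constraint yields the Gibbs form p_i proportional to exp(-lam * x_i), with the Lagrange multiplier lam playing the price-like role noted above. The states and target mean below are illustrative assumptions.

```python
import math

def gibbs(xs, lam):
    """Maximum-entropy distribution under one expected-value constraint."""
    ws = [math.exp(-lam * x) for x in xs]
    z = sum(ws)  # partition function
    return [w / z for w in ws]

def mean(xs, ps):
    return sum(x * p for x, p in zip(xs, ps))

def solve_multiplier(xs, target, lo=-50.0, hi=50.0, iters=200):
    """Bisect on lam so that the Gibbs mean matches the constraint."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(xs, gibbs(xs, mid)) > target:
            lo = mid  # the Gibbs mean decreases as lam grows
        else:
            hi = mid
    return (lo + hi) / 2

xs = [0.0, 1.0, 2.0, 3.0]        # e.g. units of a good held by an agent
lam = solve_multiplier(xs, 1.0)  # constrain the average holding to 1.0
p = gibbs(xs, lam)               # the entropy-maximizing macrostate
```

    With several goods and constraints there is one multiplier per constraint, which is the sense in which prices emerge endogenously from the formalism.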

  14. Concordance: A Framework for Managing Model Integrity

    NASA Astrophysics Data System (ADS)

    Rose, Louis M.; Kolovos, Dimitrios S.; Drivalos, Nicholas; Williams, James R.; Paige, Richard F.; Polack, Fiona A. C.; Fernandes, Kiran J.

    A change to a software development artefact, such as source code or documentation, can affect the integrity of others. Many contemporary software development environments provide tools that automatically manage (detect, report and reconcile) integrity. For instance, incremental background compilation can reconcile object code with changing source code and report calls to a method that are inconsistent with its definition. Although models are increasingly first-class citizens in software development, contemporary development environments are less able to automatically detect, manage and reconcile the integrity of models than the integrity of other types of artefact. In this paper, we discuss the scalability and efficiency problems faced when managing model integrity for two categories of change that occur in MDE. We present a framework to support the incremental management of model integrity, evaluating the efficiency of the proposed approach atop Eclipse and EMF.

  15. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  16. A framework for multi-scale modelling

    PubMed Central

    Chopard, B.; Borgdorff, Joris; Hoekstra, A. G.

    2014-01-01

    We review a methodology to design, implement and execute multi-scale and multi-science numerical simulations. We identify important ingredients of multi-scale modelling and give a precise definition of them. Our framework assumes that a multi-scale model can be formulated in terms of a collection of coupled single-scale submodels. With concepts such as the scale separation map, the generic submodel execution loop (SEL) and the coupling templates, one can define a multi-scale modelling language which is a bridge between the application design and the computer implementation. Our approach has been successfully applied to an increasing number of applications from different fields of science and technology. PMID:24982249
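
    A minimal sketch of the generic submodel execution loop idea follows: each single-scale submodel repeats solve/observation/boundary steps, and a coupling loop exchanges data between submodels at matching points on the scale separation map. The class and method names here are illustrative inventions, not the actual multi-scale modelling language API.

```python
class Submodel:
    """One single-scale submodel with a generic execution interface."""

    def __init__(self, state, dt):
        self.state, self.dt, self.t = state, dt, 0.0

    def solve(self):             # S: advance one time step
        raise NotImplementedError

    def observation(self):       # O: expose state to other submodels
        return self.state

    def boundary(self, value):   # B: receive coupled data
        pass

class Fine(Submodel):
    def solve(self):
        self.state *= 0.99       # a fast decay process per fine step
        self.t += self.dt

class Coarse(Submodel):
    def boundary(self, value):
        self.forcing = value     # driven by the fine-scale observation

    def solve(self):
        self.state += self.dt * self.forcing
        self.t += self.dt

# Coupling template: 10 fine steps per coarse step (time-scale separation).
fine, coarse = Fine(1.0, dt=0.1), Coarse(0.0, dt=1.0)
for _ in range(10):
    for _ in range(10):
        fine.solve()
    coarse.boundary(fine.observation())
    coarse.solve()
```

    The point of the pattern is that the coupling topology, not the submodel internals, defines the multi-scale model.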

  17. A learning framework for catchment erosion modelling

    NASA Astrophysics Data System (ADS)

    Freer, J. E.; Quinton, J.

    2006-12-01

    Erosion modelling at the catchment scale, like many other disciplines that model environmental signals, is not an exact science. We are limited by our incomplete perceptual understanding of relevant processes; in formulating and simplifying these perceptions into conceptual models; and by our ability to collect data at the right resolution and spatial scale to drive and evaluate our models effectively. The challenge is how to develop models which take into account our difficulties in describing processes, parameterising equations and demonstrating that they perform within acceptable limits. In this talk we will:
    - examine how limited data has been used to develop algorithms applied across the world and how this may lead to one source of prediction error;
    - discuss the use of uncertainty analysis techniques for describing the possible suite of model predictions that give acceptable responses;
    - explore what field observations can tell us about model performance and how these might be used to constrain uncertainties in model predictions or in some cases contribute towards these uncertainties;
    - consider how we might learn from our data to produce models with an appropriate degree of complexity.
    We hope that the talk will begin a debate about our ability to capture the essence of erosional processes and quantities for storm responses through data and modelling that includes characterising the appropriate level of uncertainties. We will use examples from the literature as well as from our own observations and modelling initiatives. We hope to generate some lively discussion about the limits of our observations to both inform and to evaluate, to consider what the appropriate level of complexity should be for catchment scale erosion modelling and to consider ways to develop a learning framework for all erosion scientists to engage in.

  18. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  19. A Smallholder Socio-hydrological Modelling Framework

    NASA Astrophysics Data System (ADS)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of total farmland and belong to the poorest quartile yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of the 6 variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example, adjusting expenditures on food and fertilizers, or selling livestock) that smallholders employ when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve farmers' debt are enough, and what the value is of investing in local storages that can buffer intra-annual variability in rainfall and of strengthening safety-nets, either by creating opportunities for alternative sources of income or by crop diversification.

  20. An Exploratory Investigation on the Invasiveness of Environmental Modeling Frameworks

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper provides initial results of an exploratory investigation on the invasiveness of environmental modeling frameworks. Invasiveness is defined as the coupling between application (i.e., model) and framework code used to implement the model. By comparing the implementation of an environmenta...

  1. Improving the physics models in the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Toth, G.; Fang, F.; Frazin, R. A.; Gombosi, T. I.; Ilie, R.; Liemohn, M. W.; Manchester, W. B.; Meng, X.; Pawlowski, D. J.; Ridley, A. J.; Sokolov, I.; van der Holst, B.; Vichare, G.; Yigit, E.; Yu, Y.; Buzulukova, N.; Fok, M. H.; Glocer, A.; Jordanova, V. K.; Welling, D. T.; Zaharia, S. G.

    2010-12-01

    The success of physics-based space weather forecasting depends on several factors: we need a sufficient amount and quality of timely observational data, we have to understand the physics of the Sun-Earth system well enough, we need sophisticated computational models, and the models have to run faster than real time on the available computational resources. This presentation will focus on a single ingredient, the recent improvements of the mathematical and numerical models in the Space Weather Modeling Framework. We have developed a new physics-based CME initiation code using flux emergence from the convection zone, solving the equations of radiative magnetohydrodynamics (MHD). Our new lower corona and solar corona models use electron heat conduction, Alfven wave heating, and boundary conditions based on solar tomography. We can obtain a physically consistent solar wind model from the surface of the Sun all the way to the L1 point without artificially changing the polytropic index. The global magnetosphere model can now solve the multi-ion MHD equations and take into account the oxygen outflow from the polar wind model. We have also added the options of solving for Hall MHD and anisotropic pressure. Several new inner magnetosphere models have been added to the framework: CRCM, HEIDI and RAM-SCB. These new models resolve the pitch angle distribution of the trapped particles. The upper atmosphere model GITM has been improved by including self-consistent equatorial electrodynamics and the effects of solar flares. This presentation will very briefly describe the developments and highlight some results obtained with the improved and new models.

  2. Fermi-Dirac statistics plus liquid description of quark partons

    NASA Astrophysics Data System (ADS)

    Buccella, F.; Miele, G.; Migliore, G.; Tibullo, V.

    1995-12-01

    A previous approach with Fermi-Dirac distributions for fermion partons is here improved to comply with the expected low-x behaviour of structure functions. We are thus able to get a fair description of the unpolarized and polarized structure functions of the nucleons, as well as of neutrino data. We cannot reach definite conclusions, but confirm our suspicion of a relationship between the defects in the Gottfried and spin sum rules.

  3. PARCC Model Content Frameworks: Mathematics--Grades 3-11

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    As part of its proposal to the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) committed to developing model content frameworks for mathematics to serve as a bridge between the Common Core State Standards and the PARCC assessments. The PARCC Model Content Frameworks were developed through a…

  4. Critical Thinking: Frameworks and Models for Teaching

    ERIC Educational Resources Information Center

    Fahim, Mansoor; Eslamdoost, Samaneh

    2014-01-01

    Developing critical thinking since the educational revolution gave rise to flourishing movements toward embedding critical thinking (CT henceforth) stimulating classroom activities in educational settings. Nevertheless the process faced with complications such as teachability potentiality, lack of practical frameworks concerning actualization of…

  5. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  6. Field-theoretical description of deep inelastic scattering

    SciTech Connect

    Geyer, B.; Robaschik, D.; Wieczorek, E.

    1980-01-01

    The most important theoretical notions concerning deep inelastic scattering are reviewed. Topics discussed are the model-independent approach, which is based on the general principles of quantum field theory; the application of quantum chromodynamics to deep inelastic scattering; approaches based on the quark-parton model, the light cone algebra, and conformal invariance; and also investigations in the framework of perturbation theory.

  7. A modeling framework for resource-user-infrastructure systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.; Qubbaj, M.; Anderies, J. M.; Aggarwal, R.; Janssen, M.

    2012-12-01

    A compact modeling framework is developed to supplement a conceptual framework of coupled natural-human systems. The framework consists of four components: resource (R), users (U), public infrastructure (PI), and public infrastructure providers (PIP), the last two of which have not been adequately addressed in many existing modeling studies. The modeling approach employed here is a set of replicator equations describing the dynamical frequencies of social strategies (of U and PIP), whose payoffs are explicit and dynamical functions of biophysical components (R and PI). Model development and preliminary results from specific implementation will be reported and discussed.
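
    The replicator-equation core of this framework can be sketched numerically: the frequency of each social strategy grows in proportion to how much its payoff exceeds the population average. The two-strategy payoffs below are held fixed for brevity, whereas in the framework payoffs are explicit, dynamical functions of the biophysical components R and PI.

```python
def replicator_step(x, payoffs, dt=0.01):
    """One Euler step of the replicator dynamics dx_i/dt = x_i (f_i - f_avg)."""
    avg = sum(xi * fi for xi, fi in zip(x, payoffs))
    return [xi + dt * xi * (fi - avg) for xi, fi in zip(x, payoffs)]

# Two user strategies with illustrative constant payoffs.
x = [0.5, 0.5]
for _ in range(5000):
    x = replicator_step(x, payoffs=[1.0, 0.5])
# x converges toward the higher-payoff strategy: x[0] -> 1.
```

    Coupling the payoffs to resource and infrastructure state variables turns this into the four-component R-U-PI-PIP system the abstract describes.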

  8. A traceability framework for diagnostics of global land models

    NASA Astrophysics Data System (ADS)

    Luo, Yiqi; Xia, Jianyang; Liang, Junyi; Jiang, Lifen; Shi, Zheng; KC, Manoj; Hararuk, Oleksandra; Rafique, Rashid; Wang, Ying-Ping

    2015-04-01

    The biggest impediment to model diagnostics and improvement at present is model intractability. The more processes incorporated, the more difficult it becomes to understand or evaluate model behavior. As a result, uncertainty in predictions among global land models cannot be easily diagnosed and attributed to its sources. We have recently developed an approach to analytically decompose a complex land model into traceable components based on mutually independent properties of modeled core biogeochemical processes. As all global land carbon models share those common properties, this traceability framework is applicable to all of them to improve their tractability. Indeed, we have applied the traceability framework to improve model diagnostics in several respects. First, we applied the framework to the Australian Community Atmosphere Biosphere Land Exchange (CABLE) model and the Community Land Model version 3.5 (CLM3.5) to identify the sources of differences between these models. The major causes of their different predictions were found to be parameter settings related to carbon input and baseline residence times. Second, we used the framework to diagnose the impacts of adding nitrogen processes to CABLE on its carbon simulation. Adding nitrogen processes not only reduces net primary production but also shortens residence times in the CABLE model. Third, the framework helps isolate components of CLM3.5 for data assimilation. Data assimilation with global land models has been computationally extremely difficult. By isolating traceable components, we have improved the parameterization of CLM3.5 related to soil organic decomposition, microbial kinetics and carbon use efficiency, and litter decomposition. Further, we are currently developing the traceability framework to analyze transient simulations of models that were involved in the Coupled Model Intercomparison Project Phase 5 (CMIP5) to improve our understanding of the parameter space of global carbon models.
This
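
    The decomposition underlying the traceability framework can be illustrated with a one-line steady-state relation: ecosystem carbon storage equals carbon input times baseline residence time, so an inter-model difference can be attributed to one factor or the other. The numbers below are illustrative, not taken from CABLE or CLM.

```python
def steady_state_carbon(npp, residence_time):
    """Traceable decomposition: X_ss = carbon input * residence time."""
    return npp * residence_time

# Two hypothetical models with different input/residence-time settings.
model_a = steady_state_carbon(npp=60.0, residence_time=25.0)  # 1500 units
model_b = steady_state_carbon(npp=55.0, residence_time=30.0)  # 1650 units
# Model B stores more carbon despite lower input, because its longer
# residence time dominates: the difference is traceable to one factor.
```

    The actual framework decomposes residence time further into baseline values and environmental scalars, but the attribution logic is the same.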

  9. A framework for modeling human evolution.

    PubMed

    Gintis, Herbert

    2016-01-01

    Culture-led gene-culture coevolution is a framework within which substantive explanations of human evolution must be located. It is not itself an explanation. Explanations depend on concrete historical evolutionary factors such as the control of fire, collective child-rearing, lethal weapon technology, altruistic cooperation and punishment, and the mastery of complex collaboration protocols leading to an effective division of social labor. PMID:27561218

  10. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  11. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    NASA Astrophysics Data System (ADS)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~ 50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as: additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines; increased spatial resolution; and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration to a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.

  12. A Computational Framework for Realistic Retina Modeling.

    PubMed

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas. PMID:27354192

  13. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  14. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model that includes interactions among the financial assets, reflecting the market structure. Our study takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of a European call option with short-time memory noise.
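For orientation on the case study above, a plain Monte Carlo pricer for a European call under standard geometric Brownian motion can serve as a baseline. This is a textbook sketch, not the paper's mean-field model with short-time memory noise:

```python
import math
import random

def mc_european_call(s0, strike, rate, sigma, maturity, n_paths=100_000, seed=42):
    """Monte Carlo price of a European call under geometric Brownian motion.
    A standard baseline sketch -- not the paper's mean-field dynamics."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * sigma**2) * maturity
    vol = sigma * math.sqrt(maturity)
    total = 0.0
    for _ in range(n_paths):
        # Terminal asset price for one simulated path
        st = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(st - strike, 0.0)
    # Discount the average payoff back to today
    return math.exp(-rate * maturity) * total / n_paths

price = mc_european_call(s0=100.0, strike=100.0, rate=0.05, sigma=0.2, maturity=1.0)
print(round(price, 2))  # close to the Black-Scholes value of about 10.45
```

A mean-field or memory-noise model would replace the independent per-path dynamics with interacting ones; the discounting and payoff averaging stay the same.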

  15. Towards a consistent modeling framework across scales

    NASA Astrophysics Data System (ADS)

    Jagers, B.

    2013-12-01

The morphodynamic evolution of river-delta-coastal systems may be studied in detail to predict local, short-term changes or at a more aggregated level to indicate the net large-scale, long-term effect. The whole spectrum of spatial and temporal scales needs to be considered for environmental impact studies. Usually this implies setting up a number of different models for different scales. Since the various models often use codes that have been independently developed by different researchers and include different formulations, it may be difficult to arrive at a consistent set of modeling results. This is one of the reasons why Deltares has undertaken an effort to develop a consistent suite of model components that can be applied over a wide range of scales. The heart of this suite is formed by a flexible mesh flow component that supports mixed 1D-2D-3D domains, an equally flexible transport component with an expandable library of water quality and ecological processes, and a library of sediment transport and morphology routines that can be linked directly to the flow component or used as part of the process library. We will present the latest developments with a focus on the status of the sediment transport and morphology component for running consistent 1D, 2D and 3D models.

  16. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework using an entity-centric abstraction of transportation is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts from a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results that quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  17. Evolutionary Framework for Lepidoptera Model Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

“Model systems” are specific organisms upon which detailed studies have been conducted examining a fundamental biological question. If the studies are robust, their results can be extrapolated among an array of organisms that possess features in common with the subject organism. The true power of...

  18. Theoretical Tinnitus Framework: A Neurofunctional Model

    PubMed Central

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C. B.; Sani, Siamak S.; Ekhtiari, Hamed; Sanchez, Tanit G.

    2016-01-01

Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory and prefrontal cortices. Functionally, we assume the model includes the presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the “sourceless” sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  19. Theoretical Tinnitus Framework: A Neurofunctional Model.

    PubMed

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory and prefrontal cortices. Functionally, we assume the model includes the presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  20. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
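The weighted aggregation at the heart of many MCDA approaches can be sketched as follows. The criteria, weights, and scores here are illustrative placeholders, not the SEQUAL criteria or the specific MCDA method of the paper:

```python
# Toy additive MCDA scoring of candidate process modelling languages.
# All criteria names, weights, and scores are invented for illustration.
weights = {"expressiveness": 0.4, "comprehensibility": 0.35, "tool_support": 0.25}
scores = {
    "BPMN":      {"expressiveness": 0.9, "comprehensibility": 0.7, "tool_support": 0.9},
    "EPC":       {"expressiveness": 0.6, "comprehensibility": 0.8, "tool_support": 0.6},
    "Petri net": {"expressiveness": 0.8, "comprehensibility": 0.5, "tool_support": 0.5},
}

def weighted_score(lang):
    # Weighted sum of the per-criterion scores for one candidate language
    return sum(weights[c] * scores[lang][c] for c in weights)

best = max(scores, key=weighted_score)
print(best)  # BPMN under these illustrative weights
```

Real MCDA methods differ in how they elicit weights and aggregate scores; the point of the sketch is only that the selection reduces to ranking candidates against explicit, weighted criteria.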

  1. Mathematical models for biodegradation of chlorinated solvents. 1: Model framework

    SciTech Connect

    Zhang, X.; Banerji, S.; Bajpai, R.

    1996-12-31

Complete mineralization of chlorinated solvents by microbial action has been demonstrated under aerobic as well as anaerobic conditions. In most cases, it is believed that the biodegradation is initiated by broad-specificity enzymes involved in metabolism of a primary substrate. Under aerobic conditions, some of the primary carbon and energy substrates are methane, propane, toluene, phenol, and ammonia; under anaerobic conditions, glucose, sucrose, acetate, propionate, isopropanol, methanol, and even natural organics act as the carbon source. Published biochemical studies suggest that the limiting step is often the initial part of the biodegradation pathway within the microbial system. For aerobic systems, the limiting step is thought to be the reaction catalyzed by mono- and dioxygenases, which are induced by most primary substrates, although some constitutive strains have been reported. Other critical features of the biodegradative pathway include: (1) activity losses of critical enzyme(s) through the action of metabolic byproducts, (2) energetic needs of contaminant biodegradation which must be met by catabolism of the primary substrates, (3) changes in metabolic patterns in mixed cultures found in nature depending on the availability of electron acceptors, and (4) the associated accumulation and disappearance of metabolic intermediates. Often, the contaminant pool itself consists of several chlorinated solvents with separate and interactive biochemical needs. The existing models address some of these issues; however, their ability to predict the biological fate of chlorinated solvents in nature remains severely limited. A comprehensive model framework will be presented that addresses the limiting step(s), inactivation of critical enzymes, recovery action, energetics, and multiple degradative pathways. 91 refs.

  2. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.

  3. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation, and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  4. A Model Framework for Course Materials Construction. Third Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model framework for course materials construction is presented as an aid to Coast Guard course writers and coordinators, curriculum developers, and instructors who must modify a course or draft a new one. The model assumes that the instructor or other designated person has: (1) completed a task analysis which identifies the competencies, skills…

  5. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  6. Characteristics and Conceptual Framework of the Easy-Play Model

    ERIC Educational Resources Information Center

    Lu, Chunlei; Steele, Kyle

    2014-01-01

    The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…

  7. A National Modeling Framework for Water Management Decisions

    NASA Astrophysics Data System (ADS)

    Bales, J. D.; Cline, D. W.; Pietrowsky, R.

    2013-12-01

The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU), "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management," to collaborate in activities that support their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporating new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out its mission. These systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned that would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on scientific advances; modern technology for coupling model components and visualizing results; and a coupled atmospheric-hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for the organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of the North American Water Program, to contribute to the goals of the Program, and to be informed by the science and approaches developed as part of the Program.

  8. Design of single object model of software reuse framework

    NASA Astrophysics Data System (ADS)

    Yan, Liu

    2011-12-01

To fully realize the reuse potential of the software reuse framework, this paper analyzes in detail the single object model introduced in the article "The overall design of software reuse framework" and organizes it into three modes: an add/delete/modify mode, a check mode, and an integrated search/scroll/display mode. Each mode corresponds to its own interface design template, class design, and database design concept. This modeling approach helps developers clarify their design and speeds up development; even newcomers can complete development tasks easily.

  9. A software engineering perspective on environmental modeling framework design: The object modeling system

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  10. A Type-Theoretic Framework for Certified Model Transformations

    NASA Astrophysics Data System (ADS)

    Calegari, Daniel; Luna, Carlos; Szasz, Nora; Tasistro, Álvaro

We present a framework based on the Calculus of Inductive Constructions (CIC) and its associated tool, the Coq proof assistant, to allow certification of model transformations in the context of Model-Driven Engineering (MDE). The approach is based on a semi-automatic translation process from metamodels, models and transformations of the MDE technical space into types, propositions and functions of the CIC technical space. We describe this translation and illustrate its use in a standard case study.

  11. 3-D HYDRODYNAMIC MODELING IN A GEOSPATIAL FRAMEWORK

    SciTech Connect

    Bollinger, J; Alfred Garrett, A; Larry Koffman, L; David Hayes, D

    2006-08-24

    3-D hydrodynamic models are used by the Savannah River National Laboratory (SRNL) to simulate the transport of thermal and radionuclide discharges in coastal estuary systems. Development of such models requires accurate bathymetry, coastline, and boundary condition data in conjunction with the ability to rapidly discretize model domains and interpolate the required geospatial data onto the domain. To facilitate rapid and accurate hydrodynamic model development, SRNL has developed a pre- and post-processor application in a geospatial framework to automate the creation of models using existing data. This automated capability allows development of very detailed models to maximize exploitation of available surface water radionuclide sample data and thermal imagery.

  12. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    PubMed

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery. PMID:26715210

  13. A computational framework for a database of terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

Most terrestrial biosphere models consist of a set of coupled first-order ordinary differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models, using as an example the process of soil organic matter decomposition. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models have to be fed into a database consisting of simple text files with a common structure. Then they are read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, and internal and external fluxes. SymPy, a Python library for symbolic mathematics, is used to calculate the Jacobian matrix at given steady states, where available, and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots if appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
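The symbolic analysis described above can be illustrated with a minimal SymPy sketch for a hypothetical two-pool soil carbon model; the pool structure, symbols, and rates are invented for illustration, not entries from the database:

```python
import sympy as sp

# Hypothetical two-pool soil carbon model (illustrative only):
#   dC1/dt = u - k1*C1
#   dC2/dt = a21*k1*C1 - k2*C2
C1, C2, u, k1, k2, a21 = sp.symbols('C1 C2 u k1 k2 a21', positive=True)
f = sp.Matrix([u - k1*C1, a21*k1*C1 - k2*C2])

# Jacobian of the flux vector with respect to the state variables
J = f.jacobian(sp.Matrix([C1, C2]))

# Steady state: solve f = 0 for the pool contents
steady = sp.solve(list(f), [C1, C2], dict=True)[0]

# For a linear model the Jacobian is constant; its eigenvalues are the
# negative cycling rates, so the steady state is stable for k1, k2 > 0.
print(steady[C1], sp.simplify(steady[C2]))      # u/k1 and a21*u/k2
print(sorted(J.eigenvals().keys(), key=str))    # eigenvalues -k1 and -k2
```

For nonlinear models the Jacobian must be evaluated at each steady state before its eigenvalues say anything about stability, which is why the framework couples the symbolic step with numerical simulation in R.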

  14. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    USGS Publications Warehouse

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.
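The combinatorial construction FUSE performs, building unique model structures by mixing components of parent models, can be illustrated schematically. The decision points and option names below are hypothetical stand-ins, not FUSE's actual components:

```python
from itertools import product

# Hypothetical architectural decision points and options (illustrative;
# FUSE's real decision points and option names differ).
options = {
    "upper_soil_layer": ["single_state", "tension_storage"],
    "lower_soil_layer": ["fixed_size", "unlimited"],
    "percolation":      ["field_capacity", "saturated_fraction"],
    "surface_runoff":   ["arno_vic", "topmodel", "prms_variant"],
}

# Every combination of one option per decision point is a model structure
structures = [dict(zip(options, combo)) for combo in product(*options.values())]
print(len(structures))  # 2 * 2 * 2 * 3 = 24 unique model structures
```

The count grows multiplicatively with the options at each decision point, which is how a handful of parent models yields the 79 structures explored in the paper.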

  15. S-factor in the framework of the "shadow" model

    SciTech Connect

Scalia, A.

    1995-02-05

The S(E) factor is obtained in the framework of the "shadow" model. The analytical expression of the S(E) function is not compatible with an S-factor that is a slowly varying function of the energy. © 1995 American Institute of Physics.

  16. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    ERIC Educational Resources Information Center

    Redish, Edward F.

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the…

  17. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 1.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for grade one language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print awareness;…

  18. Language Arts Curriculum Framework: Sample Curriculum Model, Grade K.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for kindergarten language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print…

  19. The BMW Model: A New Framework for Teaching Monetary Economics

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  20. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  1. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  2. Model framework for describing the dynamics of evolving networks

    NASA Astrophysics Data System (ADS)

    Tobochnik, Jan; Strandburg, Katherine; Csardi, Gabor; Erdi, Peter

    2007-03-01

    We present a model framework for describing the dynamics of evolving networks. In this framework the addition of edges is stochastically governed by some important intrinsic and structural properties of network vertices through an attractiveness function. We discuss the solution of the inverse problem: determining the attractiveness function from the network evolution data. We also present a number of example applications: the description of the US patent citation network using vertex degree, patent age and patent category variables, and we show how the time-dependent version of the method can be used to find and describe important changes in the internal dynamics. We also compare our results to scientific citation networks.
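    The attractiveness-function mechanism described above can be illustrated with a minimal sketch. The functional form of `attractiveness(degree, age)` and its parameters below are hypothetical stand-ins, not the paper's fitted model; the sketch only shows how edge addition can be stochastically governed by intrinsic (age) and structural (degree) vertex properties.

```python
import random

def attractiveness(degree, age, alpha=1.0, beta=0.5):
    # Hypothetical attractiveness combining a structural property (degree)
    # with an intrinsic one (age): high-degree vertices attract more,
    # older vertices attract less.
    return (degree + 1) ** alpha * (age + 1) ** (-beta)

def grow_network(n_vertices, seed=0):
    # Start from a single edge; each new vertex attaches to an existing
    # vertex chosen with probability proportional to its attractiveness.
    rng = random.Random(seed)
    edges = [(0, 1)]
    degrees = [1, 1]
    birth = [0, 0]
    for t in range(2, n_vertices):
        weights = [attractiveness(degrees[v], t - birth[v]) for v in range(t)]
        target = rng.choices(range(t), weights=weights, k=1)[0]
        edges.append((t, target))
        degrees.append(1)
        degrees[target] += 1
        birth.append(t)
    return edges, degrees

edges, degrees = grow_network(100)
```

    Given observed evolution data, the inverse problem the authors discuss amounts to recovering the attractiveness function from many such recorded attachment choices.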

  3. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    NASA Astrophysics Data System (ADS)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling of spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
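    The core construction, a GMRF spatial layer propagated by a state-space temporal model, can be sketched as below. The sketch builds the precision matrix of a first-order GMRF on a 1-D chain of sites and drives it with simple AR(1) dynamics; the coupling, nugget, and autoregression values are arbitrary, and a real application would use INLA for inference rather than direct Cholesky factorization.

```python
import numpy as np

def chain_precision(n, kappa=1.0, tau=0.1):
    # Precision matrix of a first-order GMRF on a chain of n sites:
    # each site interacts only with its neighbours (Markov property).
    # kappa scales the neighbour coupling; the tau nugget keeps Q
    # positive definite.
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i] += kappa
        Q[i + 1, i + 1] += kappa
        Q[i, i + 1] -= kappa
        Q[i + 1, i] -= kappa
    Q += tau * np.eye(n)
    return Q

def simulate_state_space(n_sites, n_steps, rho=0.8, seed=0):
    # Temporal dynamics x_t = rho * x_{t-1} + w_t with spatially
    # structured innovations w_t ~ N(0, Q^{-1}).
    rng = np.random.default_rng(seed)
    Q = chain_precision(n_sites)
    L = np.linalg.cholesky(Q)      # Q = L @ L.T
    x = np.zeros(n_sites)
    history = []
    for _ in range(n_steps):
        # Draw w ~ N(0, Q^{-1}) by solving L.T @ w = z with z ~ N(0, I).
        z = rng.standard_normal(n_sites)
        w = np.linalg.solve(L.T, z)
        x = rho * x + w
        history.append(x.copy())
    return np.array(history)

H = simulate_state_space(20, 50)
```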

  4. Compendium of models from a gauge U(1) framework

    NASA Astrophysics Data System (ADS)

    Ma, Ernest

    2016-06-01

    A gauge U(1) framework was established in 2002 to extend the supersymmetric Standard Model. It has many possible realizations. Whereas all have the necessary and sufficient ingredients to explain the possible 750 GeV diphoton excess, observed recently by the ATLAS and CMS Collaborations at the Large Hadron Collider (LHC), they differ in other essential aspects. A compendium of such models is discussed.

  5. A framework for modeling uncertainty in regional climate change (Invited)

    NASA Astrophysics Data System (ADS)

    Monier, E.; Gao, X.; Scott, J. R.; Sokolov, A. P.; Schlosser, C. A.

    2013-12-01

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework are the emissions projections (using different climate policies), the climate system response (represented by different values of climate sensitivity and net aerosol forcing), natural variability (by perturbing initial conditions) and structural uncertainty (using different climate models). The modeling framework revolves around the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model with an intermediate complexity earth system model (with a two-dimensional zonal-mean atmosphere). Regional climate change over the United States is obtained through a two-pronged approach. First, we use the IGSM-CAM framework which links the IGSM to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Second, we use a pattern-scaling method that extends the IGSM zonal mean based on climate change patterns from various climate models. Results show that uncertainty in temperature changes is mainly driven by policy choices and the range of climate sensitivity considered. Meanwhile, the four sources of uncertainty contribute more equally to precipitation changes, with natural variability having a large impact in the first part of the 21st century. Overall, the choice of policy is the largest driver of uncertainty in future projections of climate change over the United States. In light of these results, we recommend that when investigating climate change impacts over specific regions, studies consider all four sources of uncertainty analyzed in this paper.

  6. An enhanced BSIM modeling framework for self-heating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.

  7. Possibilities: A framework for modeling students' deductive reasoning in physics

    NASA Astrophysics Data System (ADS)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  8. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed data retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated in hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open source standard interface already adopted by key hydrological model providers. It defines a universal approach to interact with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
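    The error-minimization principle behind these DA building blocks can be shown with the textbook scalar analysis step, which blends a forecast and an observation by weighting with their error variances (the optimal gain in the linear-Gaussian case). This is an illustration of the general idea only, not OpenDA's API.

```python
def assimilate(x_forecast, y_obs, var_model, var_obs):
    # Gain: trust the observation more when model error dominates.
    k = var_model / (var_model + var_obs)
    x_analysis = x_forecast + k * (y_obs - x_forecast)
    var_analysis = (1 - k) * var_model  # analysis is never less certain
    return x_analysis, var_analysis

x, v = assimilate(10.0, 12.0, 1.0, 1.0)
```

    With equal variances the analysis falls halfway between forecast and observation; as the observation error grows, the update shrinks toward the forecast.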

  9. A new framework for an electrophotographic printer model

    NASA Astrophysics Data System (ADS)

    Colon-Lopez, Fermin A.

    Digital halftoning is a printing technology that creates the illusion of continuous tone images for printing devices such as electrophotographic printers that can only produce a limited number of tone levels. Digital halftoning works because the human visual system has limited spatial resolution, which blurs the printed dots of the halftone image, creating the gray sensation of a continuous tone image. Because the printing process is imperfect, it introduces distortions to the halftone image. The quality of the printed image depends, among other factors, on the complex interactions between the halftone image, the printer characteristics, the colorant, and the printing substrate. Printer models are used to assist in the development of new types of halftone algorithms that are designed to withstand the effects of printer distortions. For example, model-based halftone algorithms optimize the halftone image through an iterative process that integrates a printer model within the algorithm. The two main goals of a printer model are to provide accurate estimates of the tone and of the spatial characteristics of the printed halftone pattern. Various classes of printer models, from simple tone calibrations to complex mechanistic models, have been reported in the literature. Existing models have one or more of the following limiting factors: they only predict tone reproduction, they depend on the halftone pattern, they require complex calibrations or complex calculations, they are printer specific, they reproduce unrealistic dot structures, and they are unable to adapt responses to new data. The two research objectives of this dissertation are (1) to introduce a new framework for printer modeling and (2) to demonstrate the feasibility of such a framework in building an electrophotographic printer model. The proposed framework introduces the concept of modeling a printer as a texture transformation machine.
The basic premise is that modeling the texture differences between the
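    The halftoning principle described above (rendering gray with binary dots that the eye blurs) can be sketched with classic ordered dithering. This illustrates the kind of halftone pattern a printer model would take as input; it is not the dissertation's texture-transformation approach.

```python
import numpy as np

# Normalised 2x2 Bayer ordered-dither thresholds.
BAYER_2x2 = np.array([[0, 2],
                      [3, 1]]) / 4.0 + 1 / 8

def halftone(gray):
    # Compare each pixel of a [0, 1] grayscale image against a tiled
    # threshold matrix, producing the binary dot pattern a printer
    # model would then distort.
    h, w = gray.shape
    tiled = np.tile(BAYER_2x2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    return (gray > tiled).astype(np.uint8)

flat = np.full((4, 4), 0.5)   # mid-gray patch
dots = halftone(flat)          # half the dots turn on, preserving tone
```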

  10. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing methods for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI sources cooperative application environment to solve a target class of geospatial problems. Based on linked data technology, one of the core components of the semantic web, we can construct relationship links among geographic features distributed across diverse VGI platforms by using linked data modeling methods, then deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unique data representation model among different online social geographic data sources. We propose a mixed strategy, which combines spatial distance similarity and feature name attribute similarity as the measure standard, to compare and match different geographic features in various VGI data sets. Our work focuses on how to apply Markov logic networks to achieve interlinks of the same linked data in different VGI-based linked data sets. In our method, the automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a huge geographic linked data network across loosely-coupled VGI web sites. The experiments built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
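    The mixed matching strategy (spatial distance similarity blended with name similarity) can be sketched as below. The weighting, the exponential distance decay scale, and the use of `difflib` string similarity are illustrative assumptions, not the paper's calibrated measure.

```python
from difflib import SequenceMatcher
import math

def match_score(feat_a, feat_b, w_spatial=0.5, dist_scale=100.0):
    # Spatial proximity (distance in arbitrary units, decayed
    # exponentially) blended with name string similarity; both terms
    # and the result lie in [0, 1].
    dx = feat_a["x"] - feat_b["x"]
    dy = feat_a["y"] - feat_b["y"]
    spatial = math.exp(-math.hypot(dx, dy) / dist_scale)
    name = SequenceMatcher(None, feat_a["name"].lower(),
                           feat_b["name"].lower()).ratio()
    return w_spatial * spatial + (1 - w_spatial) * name

# Hypothetical features from two VGI sources.
a = {"name": "Central Park", "x": 0.0, "y": 0.0}
b = {"name": "central park", "x": 10.0, "y": 5.0}
c = {"name": "Harbour Bridge", "x": 900.0, "y": 900.0}
```

    Candidate pairs scoring above a tuned threshold would then feed the Markov-logic co-reference step the abstract describes.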

  11. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  12. An Intercomparison of 2-D Models Within a Common Framework

    NASA Technical Reports Server (NTRS)

    Weisenstein, Debra K.; Ko, Malcolm K. W.; Scott, Courtney J.; Jackman, Charles H.; Fleming, Eric L.; Considine, David B.; Kinnison, Douglas E.; Connell, Peter S.; Rotman, Douglas A.; Bhartia, P. K. (Technical Monitor)

    2002-01-01

    A model intercomparison among the Atmospheric and Environmental Research (AER) 2-D model, the Goddard Space Flight Center (GSFC) 2-D model, and the Lawrence Livermore National Laboratory 2-D model allows us to separate differences due to model transport from those due to the model's chemical formulation. This is accomplished by constructing two hybrid models incorporating the transport parameters of the GSFC and LLNL models within the AER model framework. By comparing the results from the native models (AER and e.g. GSFC) with those from the hybrid model (e.g. AER chemistry with GSFC transport), differences due to chemistry and transport can be identified. For the analysis, we examined an inert tracer whose emission pattern is based on emission from a High Speed Civil Transport (HSCT) fleet; distributions of trace species in the 2015 atmosphere; and the response of stratospheric ozone to an HSCT fleet. Differences in NO(y) in the upper stratosphere are found between models with identical transport, implying different model representations of atmospheric chemical processes. The response of O3 concentration to HSCT aircraft emissions differs in the models both from transport-dominated differences in the HSCT-induced perturbations of H2O and NO(y) and from differences in the model representations of O3 chemical processes. The model formulations of cold polar processes are found to be the most significant factor in creating large differences in the calculated ozone perturbations.

  13. A Structural Model Decomposition Framework for Systems Health Management

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
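    A toy analogue of the structural decomposition idea: given only the dependency structure of a global model, extract the minimal submodel needed to compute chosen outputs. The three-tank-style variable names below are invented for illustration and do not reproduce the paper's algorithms.

```python
def extract_submodel(deps, outputs):
    # deps maps each computed variable to the variables it depends on;
    # anything absent from deps is treated as an input. Returns the
    # minimal variable set (a local submodel) needed for the outputs.
    needed, stack = set(), list(outputs)
    while stack:
        v = stack.pop()
        if v not in needed:
            needed.add(v)
            stack.extend(deps.get(v, ()))
    return needed

# Invented three-tank structure: heights h1..h3, inter-tank flows q12, q23.
deps = {
    "h1": ["q_in", "q12"],
    "q12": ["h1_prev", "h2_prev"],
    "h2": ["q12", "q23"],
    "q23": ["h2_prev", "h3_prev"],
    "h3": ["q23"],
}

sub = extract_submodel(deps, ["h1"])  # local submodel for tank 1 only
```

    Operating on such submodels instead of the monolithic model is what yields the scalability gains the abstract describes for estimation, fault isolation, and EOL prediction.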

  14. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and does not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  15. A unifying framework for marine ecological model comparison

    NASA Astrophysics Data System (ADS)

    Fennel, Wolfgang; Osborn, Thomas

    2005-05-01

    The complex network of the marine food chain with all the details of the behavior of individuals and the interactions with physical processes cannot be included in one generic model. Modelling requires simplification and idealization. The reduction of complex problems to simpler but tractable ones is guided by the questions being addressed. Consequently, a variety of different models have been developed with different choices of state variables, process formulations, and different degrees of physical control. In the last decade, a multitude of studies were based on biogeochemical models, population models, and individual based models. There are now models available that cover the full range from individual based models, to population models, to biomass models, to combinations thereof. The biological model components are linked to physical models ranging from 1d water column models to full 3d general circulation models. This paper attempts to develop a unifying theoretical framework that elucidates the relationships among the different classes of models. The theory is based on state densities, which characterize individuals in an abstract phase space. Integration of the state densities over spatial or biological variables relates population densities, abundance or biomass to individuals.

  16. Common and Innovative Visuals: A sparsity modeling framework for video.

    PubMed

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-01

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework as CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model. PMID:24808407
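    A much-simplified stand-in for the common/innovative decomposition: here the common frame is just the per-pixel median of a segment (CIV instead estimates it jointly via sparse recovery), and the innovative frames are the residuals that carry the scene dynamics.

```python
import numpy as np

def decompose_segment(frames):
    # Toy decomposition of a video segment: common frame = per-pixel
    # median (a crude stand-in for CIV's sparse joint estimate),
    # innovative frames = residuals of each frame.
    stack = np.stack(frames)
    common = np.median(stack, axis=0)
    innovations = stack - common
    return common, innovations

# Three 4x4 "frames": a static (zero) background plus one moving bright pixel.
frames = [np.zeros((4, 4)) for _ in range(3)]
for t, f in enumerate(frames):
    f[t, t] = 1.0

common, innov = decompose_segment(frames)
```

    By construction the decomposition is exact (common + innovations reconstructs every frame), and the moving pixel lands entirely in the innovative frames.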

  17. A framework for modeling contaminant impacts on reservoir water quality

    NASA Astrophysics Data System (ADS)

    Jeznach, Lillian C.; Jones, Christina; Matthews, Thomas; Tobiason, John E.; Ahlfeld, David P.

    2016-06-01

    This study presents a framework for using hydrodynamic and water quality models to understand the fate and transport of potential contaminants in a reservoir and to develop appropriate emergency response and remedial actions. In the event of an emergency situation, prior detailed modeling efforts and scenario evaluations allow for an understanding of contaminant plume behavior, including maximum concentrations that could occur at the drinking water intake and contaminant travel time to the intake. A case study assessment of the Wachusett Reservoir, a major drinking water supply for metropolitan Boston, MA, provides an example of an application of the framework and how hydrodynamic and water quality models can be used to quantitatively and scientifically guide management in response to a variety of contaminant scenarios. The model CE-QUAL-W2 was used to investigate the water quality impacts of several hypothetical contaminant scenarios, including hypothetical fecal coliform input from a sewage overflow as well as an accidental railway spill of ammonium nitrate. Scenarios investigated the impacts of decay rates, season, and inter-reservoir transfers on contaminant arrival times and concentrations at the drinking water intake. The modeling study highlights the importance of a rapid operational response by managers to contain a contaminant spill in order to minimize the mass of contaminant that enters the water column, based on modeled reservoir hydrodynamics. The development and use of hydrodynamic and water quality models for surface drinking water sources subject to the potential for contaminant entry can provide valuable guidance for making decisions about emergency response and remediation actions.
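    The roles that decay rate and travel time play in such scenarios can be conveyed with a zeroth-order screening estimate. The exponential first-order decay relation below is a generic textbook model, not CE-QUAL-W2 itself, and all parameter values are invented for illustration.

```python
import math

def intake_concentration(c0, decay_per_day, travel_days, dilution=1.0):
    # Screening estimate for the concentration reaching the intake:
    # C = C0 * exp(-k * t) / dilution, with k the first-order decay
    # rate and t the plume travel time.
    return c0 * math.exp(-decay_per_day * travel_days) / dilution

# A conservative (non-decaying) contaminant arrives undiminished;
# a decaying one (e.g. fecal coliform die-off) is strongly attenuated.
conservative = intake_concentration(100.0, 0.0, 5.0)
decaying = intake_concentration(100.0, 0.2, 5.0)
```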

  18. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  19. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

    Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. Each element of this framework is modular so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of an FRTM are available to simulate the brightness temperature of snow; two models are available to model the physical evolution of the snowpack and underlying soil; and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models, physical or statistical, can be accommodated too. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.

  20. A constitutive model for magnetostriction based on thermodynamic framework

    NASA Astrophysics Data System (ADS)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.

  1. Hybrid automata as a unifying framework for modeling excitable cells.

    PubMed

    Ye, P; Entcheva, E; Smolka, S A; True, M R; Grosu, R

    2006-01-01

    We propose hybrid automata (HA) as a unifying framework for computational models of excitable cells. HA, which combine discrete transition graphs with continuous dynamics, can be naturally used to obtain a piecewise, possibly linear, approximation of a nonlinear excitable-cell model. We first show how HA can be used to efficiently capture the action-potential morphology--as well as reproduce typical excitable-cell characteristics such as refractoriness and restitution--of the dynamic Luo-Rudy model of a guinea-pig ventricular myocyte. We then recast two well-known computational models, Biktashev's and Fenton-Karma, as HA without any loss of expressiveness. Given that HA possess an intuitive graphical representation and are supported by a rich mathematical theory and numerous analysis tools, we argue that they are well positioned as a computational model for biological processes. PMID:17947070
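    A minimal sketch of the HA idea: three discrete modes, each with its own linear voltage flow, and guards on the voltage that trigger mode switches. All thresholds and rates below are invented for illustration and are far simpler than the Luo-Rudy, Biktashev, or Fenton-Karma reconstructions the paper works with.

```python
def simulate_ha(stim_times, t_end=500.0, dt=0.1):
    # Hybrid automaton with modes resting -> excited -> refractory -> resting.
    v, mode, t = 0.0, "resting", 0.0
    trace = []
    while t < t_end:
        stim = any(abs(t - s) < 1.0 for s in stim_times)  # 2-unit stimulus pulse
        # Discrete transitions (guards on the voltage v).
        if mode == "resting" and v > 0.5:
            mode = "excited"
        elif mode == "excited" and v > 0.95:
            mode = "refractory"
        elif mode == "refractory" and v < 0.05:
            mode = "resting"
        # Continuous flow: each mode has its own linear dynamics.
        if mode == "resting":
            dv = -0.1 * v + (2.0 if stim else 0.0)   # leak + stimulus current
        elif mode == "excited":
            dv = 5.0 * (1.0 - v)                     # fast linear upstroke
        else:
            dv = -0.2 * v                            # slow repolarisation
        v += dv * dt
        trace.append(v)
        t += dt
    return trace

trace = simulate_ha([10.0])
```

    A stimulus at t = 10 drives a full action-potential-like excursion; without a stimulus the cell stays at rest, and the refractory mode ignores further stimulation, capturing refractoriness in piecewise-linear form.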

  2. Velo: A Knowledge Management Framework for Modeling and Simulation

    SciTech Connect

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  3. Development of a distributed air pollutant dry deposition modeling framework.

    PubMed

    Hirabayashi, Satoshi; Kroll, Charles N; Nowak, David J

    2012-12-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI), and air pollutant concentrations can be estimated in a spatially distributed form, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement, and meteorological data, the developed system provides a framework for U.S. city managers to identify spatial patterns of urban forest and to locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. PMID:22858662

  4. The ontology model of FrontCRM framework

    NASA Astrophysics Data System (ADS)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational change is possible. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM was developed as a framework, based on the strategic planning approach, to guide the identification of business processes related to CRM. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  5. A General Framework for Multiphysics Modeling Based on Numerical Averaging

    NASA Astrophysics Data System (ADS)

    Lunati, I.; Tomin, P.

    2014-12-01

    In recent years, multiphysics (hybrid) modeling has attracted increasing attention as a tool to bridge the gap between pore-scale processes and a continuum description at the meter scale (laboratory scale). This approach is particularly appealing for complex nonlinear processes, such as multiphase flow, reactive transport, density-driven instabilities, and geomechanical coupling. We present a general framework that can be applied to all these classes of problems. The method is based on ideas from the Multiscale Finite-Volume method (MsFV), which was originally developed for Darcy-scale applications. Recently, we have reformulated MsFV starting with a local-global splitting, which allows us to retain the original degree of coupling for the local problems and to use spatiotemporal adaptive strategies. The new framework is based on the simple idea that different characteristic temporal scales are inherited from different spatial scales, and the global and the local problems are solved with different temporal resolutions. The global (coarse-scale) problem is constructed based on a numerical volume-averaging paradigm, and a continuum (Darcy-scale) description is obtained by introducing additional simplifications (e.g., by assuming that pressure is the only independent variable at the coarse scale, we recover an extended Darcy's law). We demonstrate that it is possible to adaptively and dynamically couple the Darcy-scale and the pore-scale descriptions of multiphase flow in a single conceptual and computational framework. Pore-scale problems are solved only in the active front region where the fluid distribution changes with time. In the rest of the domain, only a coarse description is employed. This framework can be applied to other important problems such as reactive transport and crack propagation. As it is based on a numerical upscaling paradigm, our method can be used to explore the limits of validity of macroscopic models and to illuminate the meaning of

  6. PyCatch: catchment modelling in the PCRaster framework

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Lana-Renault, Noemí; Schmitz, Oliver

    2015-04-01

    PCRaster is an open source software framework for the construction and execution of stochastic, spatio-temporal, forward models. It provides a large number of spatial operations on raster maps, with an emphasis on operations capable of transporting material (water, sediment) over a drainage network. These operations are written in C++ and are provided to the model builder as Python functions. Models are constructed by combining these functions in a Python script. To ease the implementation of models that use time steps and Monte Carlo iterations, the software comes with a Python framework providing control flow for temporal modelling and Monte Carlo simulation, including options for Bayesian data assimilation (Ensemble Kalman Filter, Particle Filter). A sophisticated visualization tool is provided, capable of visualizing, animating, and exploring stochastic, spatio-temporal input or model output data. PCRaster is used for the construction of, for instance, hydrological models (hillslope to global scale), land use change models, and geomorphological models. It is still being improved upon, for instance by adding under-the-hood functionality for executing models on multiple CPU cores, and by adding components for agent-based and network simulation. The software runs on MS Windows and Linux and is available at http://www.pcraster.eu. We provide an extensive set of online course materials (partly available free of charge). Using the PCRaster software framework, we recently developed the PyCatch model components for hydrological modelling and land degradation modelling at the catchment scale. The PyCatch components run at time steps of seconds to weeks and grid cell sizes of approximately 1-100 m, which can be selected depending on the case study for which PyCatch is used. Hydrological components currently implemented include classes for simulation of incoming solar radiation, evapotranspiration (Penman-Monteith), surface storage, infiltration (Green and Ampt
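The control-flow pattern described above (a framework that calls a model's setup once and its update once per time step) can be sketched in plain Python. This is a minimal stand-in, not PCRaster's actual API; the `DynamicFramework`/`BucketModel` names and the toy linear-reservoir model are illustrative only.

```python
# Sketch of a temporal-modelling control-flow framework: the framework
# calls initial() once, then dynamic() once per time step. The bucket
# storage model is a made-up example, not a PyCatch component.

class DynamicFramework:
    """Minimal stand-in for a temporal modelling framework."""
    def __init__(self, model, n_steps):
        self.model, self.n_steps = model, n_steps

    def run(self):
        self.model.initial()
        for step in range(1, self.n_steps + 1):
            self.model.dynamic(step)

class BucketModel:
    def initial(self):
        self.storage = 0.0                  # surface storage [mm]
        self.history = []

    def dynamic(self, step):
        rain = 2.0 if step <= 5 else 0.0    # simple rainfall forcing
        runoff = 0.1 * self.storage         # linear-reservoir outflow
        self.storage += rain - runoff
        self.history.append(self.storage)

model = BucketModel()
DynamicFramework(model, 10).run()
```

In PCRaster itself the state variables would be raster maps and the arithmetic would be map-algebra operations over a drainage network; the scalar version above only shows the framework's calling convention.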

  7. Modeling air pollution in the Tracking and Analysis Framework (TAF)

    SciTech Connect

    Shannon, J.D.

    1998-12-31

    The Tracking and Analysis Framework (TAF) is a set of interactive computer models for integrated assessment of the Acid Rain Provisions (Title IV) of the 1990 Clean Air Act Amendments. TAF is designed to execute in minutes on a personal computer, thereby making it feasible for a researcher or policy analyst to examine quickly the effects of alternate modeling assumptions or policy scenarios. Because the development of TAF involves researchers in many different disciplines, TAF has been given a modular structure. In most cases, the modules contain reduced-form models that are based on more complete models exercised off-line. The structure of TAF as of December 1996 is shown. The Atmospheric Pathways Module produces estimates for regional air pollution variables.

  8. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    NASA Astrophysics Data System (ADS)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices, with emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model intended to coordinate the development of application-oriented services and protocols in a consistent and modular way, enabling the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are, moreover, implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  9. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.

  10. Generalized framework for context-specific metabolic model extraction methods

    PubMed Central

    Robaina Estévez, Semidán; Nikoloski, Zoran

    2014-01-01

    Genome-scale metabolic models (GEMs) are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes, such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence indicates that only a subset of these reactions is active in a given context, such as a developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella, provides the means to better understand their functioning, highlights their similarities and differences, and helps users select the method most suitable for an application. PMID:25285097

  11. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    PubMed Central

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, as well as the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
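The core idea (generate candidate epidemic curves by perturbing historical seasons, then weight them by fit to the partially observed current season) can be sketched as follows. The toy curves, the uniform scale/shift priors, and the Gaussian pseudo-likelihood are illustrative assumptions, not the paper's actual priors or data.

```python
# Sketch of empirical-Bayes curve generation for in-season forecasting:
# candidates are rescaled, time-shifted historical curves; weights come
# from a Gaussian pseudo-likelihood of the weeks observed so far.
import math
import random

def candidates(past_seasons, n=200, rng=None):
    rng = rng or random.Random(0)
    out = []
    for _ in range(n):
        base = rng.choice(past_seasons)
        scale = rng.uniform(0.7, 1.3)      # peak-height variation
        shift = rng.randint(-2, 2)         # peak-timing variation
        curve = [scale * base[min(max(w - shift, 0), len(base) - 1)]
                 for w in range(len(base))]
        out.append(curve)
    return out

def weight(curve, observed, sigma=0.5):
    # pseudo-likelihood of the observed prefix of the season
    sq = sum((curve[w] - y) ** 2 for w, y in enumerate(observed))
    return math.exp(-sq / (2 * sigma ** 2))

past = [[1, 2, 5, 8, 5, 2, 1], [1, 3, 6, 7, 4, 2, 1]]   # toy seasons
obs = [1, 2.5, 5.5]                                      # weeks seen so far
cands = candidates(past)
weights = [weight(c, obs) for c in cands]
peak_weeks = [c.index(max(c)) for c in cands]
# posterior-weighted point forecast of the peak week
forecast = sum(w * p for w, p in zip(weights, peak_weeks)) / sum(weights)
```

Because the weighted candidate set is retained, the same machinery yields a full posterior distribution over curves (and hence over onset, duration, and peak targets), not just the point forecast computed at the end.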

  12. A visual interface for the SUPERFLEX hydrological modelling framework

    NASA Astrophysics Data System (ADS)

    Gao, H.; Fenicia, F.; Kavetski, D.; Savenije, H. H. G.

    2012-04-01

    The SUPERFLEX framework is a modular modelling system for conceptual hydrological modelling at the catchment scale. This work reports the development of a visual interface for the SUPERFLEX model, which aims to enhance communication between hydrologic experimentalists and modelers, further bridging the gap between soft field data and the modeler's knowledge. In collaboration with field experimentalists, modelers can visually and intuitively hypothesize different model architectures and combinations of reservoirs, select from a library of constitutive functions describing the relationship between reservoir storage and discharge, specify the shape of lag functions and, finally, set parameter values. The software helps hydrologists take advantage of existing insights into the study site, translate them into a conceptual hydrological model, and implement that model within a computationally robust algorithm. This tool also helps challenge and contrast competing paradigms such as "uniqueness of place" vs. "one model fits all". Using this interface, hydrologists can test different hypotheses and model representations, and progressively build a deeper understanding of the watershed of interest.

  13. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

    Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce the construction costs of EHR systems. Methods The openEHR project has developed a clinical model-driven architecture for future-proof, interoperable EHR systems. The project provides specifications that standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel; C# and Java implementations have also been developed as references. Although scripting languages have become more popular in recent years because of their higher efficiency and faster development cycles, they had not been used in the openEHR implementations. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model for providing semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  14. Modeling QCD for Hadron Physics

    NASA Astrophysics Data System (ADS)

    Tandy, P. C.

    2011-10-01

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  15. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  16. LQCD workflow execution framework: Models, provenance and fault-tolerance

    NASA Astrophysics Data System (ADS)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, especially when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking, and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution, and post-execution. Monitoring information for each participant is propagated upwards through the reflex-and-healing architecture, which consists of a hierarchical network of decentralized fault management entities called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation actions upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework, using formal rules and actions specified within a structure of first-order predicate logic, to enable a dynamic management design that reduces the manual administrative workload and increases cluster productivity.
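A reflex engine of the kind described above is, at its simplest, a small state machine that consumes periodic health readings for one participant and fires a mitigation action when a rule threshold is violated. The states, thresholds, and action names below are illustrative, not those of the LQCD framework.

```python
# Sketch of a reflex engine as a state machine over health readings:
# rule thresholds move it NORMAL -> DEGRADED -> FAILED, and each
# transition records a reflexive mitigation action. Illustrative only.

class ReflexEngine:
    def __init__(self, warn=0.8, fail=0.95):
        self.state = "NORMAL"
        self.warn, self.fail = warn, fail
        self.actions = []                    # mitigation log

    def observe(self, load):
        if self.state == "NORMAL" and load >= self.warn:
            self.state = "DEGRADED"
            self.actions.append("throttle")  # first reflexive action
        elif self.state == "DEGRADED":
            if load >= self.fail:
                self.state = "FAILED"
                self.actions.append("restart_participant")
            elif load < self.warn:
                self.state = "NORMAL"        # recovered
        return self.state

engine = ReflexEngine()
states = [engine.observe(x) for x in (0.5, 0.85, 0.9, 0.97, 0.2)]
# → ["NORMAL", "DEGRADED", "DEGRADED", "FAILED", "FAILED"]
```

In the hierarchical architecture the abstract describes, each engine's state changes would also be propagated upward to a parent fault-management entity; the sketch keeps a single level.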

  17. Service-Oriented Approach to Coupling Earth System Models and Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Saint, K. D.; Ercan, M. B.; Briley, L. J.; Murphy, S.; You, H.; DeLuca, C.; Rood, R. B.

    2012-12-01

    Modeling water systems often requires coupling models across traditional Earth science disciplinary boundaries. While there has been significant effort within various Earth science disciplines (e.g., atmospheric science, hydrology, and Earth surface dynamics) to create models and, more recently, modeling frameworks, there has been less work on methods for coupling across discipline-specific models and modeling frameworks. We present work investigating one possible method for coupling across discipline-specific Earth system models and modeling frameworks: service-oriented architectures. In a service-oriented architecture, models act as distinct units or components within a system and are designed to pass well-defined messages to consumers of the service. While the approach offers the potential to couple heterogeneous computational models by allowing a high degree of autonomy across models of the Earth system, there are significant scientific and technical challenges to be addressed when coupling models designed for different communities and built for different modeling frameworks. We have addressed some of these challenges through a case study in which we coupled a hydrologic model compliant with the OpenMI standard with an atmospheric model compliant with the ESMF standard. In this case study, the two models were coupled through data exchanges of boundary conditions enabled by exposing the atmospheric model as a web service. A discussion of the technical and scientific challenges, some that we have addressed and others that remain open, will be presented, including differences in computer architectures, data semantics, and spatial scales between the coupled models.

  18. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  19. A Graph Based Framework to Model Virus Integration Sites.

    PubMed

    Fronza, Raffaele; Vasciaveo, Alessandro; Benso, Alfredo; Schmidt, Manfred

    2016-01-01

    With next-generation sequencing, thousands of virus and viral vector integration genome targets are now under investigation, both to uncover specific integration preferences and to define clusters of integration, termed common integration sites (CIS), that may allow assessment of gene therapy safety or detection of disease-related genomic features such as oncogenes. Here, we address the challenges of: 1) defining the notion of CIS on graph models; 2) demonstrating that the structure of CIS falls into the category of scale-free networks; and 3) showing that our network approach analyzes CIS dynamically in an integrated systems biology framework, using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset. PMID:27257470
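The graph notion of a CIS can be sketched concretely: integration positions are nodes, an edge joins two positions closer than a window d, and each connected component of the resulting graph is one CIS. For positions on a single chromosome this interval graph's components reduce to maximal runs of neighbours within d, which is what the sketch below computes; the window size and positions are illustrative, not the paper's parameters.

```python
# Sketch of CIS detection as connected components of a proximity graph:
# sorted positions on one chromosome, edge iff gap <= d, components are
# maximal runs of close neighbours. Window d and sites are made up.

def common_integration_sites(positions, d=10):
    nodes = sorted(positions)
    clusters, current = [], [nodes[0]]
    for p in nodes[1:]:
        if p - current[-1] <= d:    # edge to previous node: same component
            current.append(p)
        else:                       # gap too large: start a new CIS
            clusters.append(current)
            current = [p]
    clusters.append(current)
    return clusters

sites = [5, 12, 14, 100, 104, 300]
cis = common_integration_sites(sites)
# → [[5, 12, 14], [100, 104], [300]]
```

A genome-wide version would build one such graph per chromosome and could then study the degree distribution of the components, which is where the scale-free-network claim of the abstract enters.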

  20. A Framework and Model for Evaluating Clinical Decision Support Architectures

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

    In this paper, we develop a four-phase model for evaluating architectures for clinical decision support: defining a set of desirable features for a decision support architecture; building a proof-of-concept prototype; demonstrating that the architecture is useful by showing that it can be integrated with existing decision support systems; and comparing its coverage to that of other architectures. We apply this framework to several well-known decision support architectures, including Arden Syntax, GLIF, SEBASTIAN, and SAGE. PMID:18462999

  1. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  2. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  3. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    PubMed

    Alford, Rebecca F; Koehler Leman, Julia; Weitzner, Brian D; Duran, Amanda M; Tilley, Drew C; Elazar, Assaf; Gray, Jeffrey J

    2015-09-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  4. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving, and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
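The Monte Carlo lattice update at the heart of the CPM can be sketched sequentially: propose copying a neighbour's cell id into a site and accept with the Metropolis rule on the energy change. The sketch below uses adhesion energy only (no volume constraint, no PDE fields, no MPI/OpenMP parallelization) and illustrative parameters; it shows the update the paper parallelizes, not the paper's implementation.

```python
# Minimal sequential sketch of the CPM Metropolis lattice update with
# adhesion (boundary) energy only, on a small periodic 2D lattice.
import math
import random

def boundary_energy(lattice, J=1.0):
    n = len(lattice)
    e = 0.0
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):      # count each pair once
                ni, nj = (i + di) % n, (j + dj) % n
                if lattice[i][j] != lattice[ni][nj]:
                    e += J                        # penalty at cell borders
    return e

def metropolis_sweep(lattice, T=1.0, rng=None):
    rng = rng or random.Random(0)
    n = len(lattice)
    for _ in range(n * n):
        i, j = rng.randrange(n), rng.randrange(n)
        di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        src = lattice[(i + di) % n][(j + dj) % n]
        if src == lattice[i][j]:
            continue                              # nothing to copy
        old = lattice[i][j]
        e0 = boundary_energy(lattice)
        lattice[i][j] = src                       # trial copy
        dE = boundary_energy(lattice) - e0
        if dE > 0 and rng.random() >= math.exp(-dE / T):
            lattice[i][j] = old                   # Metropolis reject
    return lattice

# two "cells" occupying the left and right halves of a small lattice
lat = [[0] * 3 + [1] * 3 for _ in range(6)]
metropolis_sweep(lat)
```

A production code would track the energy incrementally instead of recomputing it per attempt, and, as in the paper, would partition the lattice across OpenMP threads while MPI ranks handle the field solves.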

  5. A unifying modeling framework for highly multivariate disease mapping.

    PubMed

    Botella-Rocamora, P; Martinez-Beneito, M A; Banerjee, S

    2015-04-30

    Multivariate disease mapping refers to the joint mapping of multiple diseases from regionally aggregated data and continues to be the subject of considerable attention for biostatisticians and spatial epidemiologists. The key issue is to map multiple diseases while accounting for any correlations among them. Recently, Martinez-Beneito (2013) provided a unifying framework for multivariate disease mapping. While attractive in that it colligates a variety of existing statistical models for mapping multiple diseases, this and other existing approaches are computationally burdensome and preclude the multivariate analysis of moderate to large numbers of diseases. Here, we propose an alternative reformulation that accrues substantial computational benefits, enabling the joint mapping of tens of diseases. Furthermore, the approach subsumes almost all existing classes of multivariate disease mapping models and offers substantial insight into the properties of statistical disease mapping models. PMID:25645551

  6. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    SciTech Connect

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  7. The Application of Architecture Frameworks to Modelling Exploration Operations Costs

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning or apply to a DoDAF-inspired operations cost model, but this paper describes how such DoDAF concepts as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific application to the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.

  8. Using the Mead model as a framework for nursing care.

    PubMed

    Edwards, S L

    1992-12-01

    A model of nursing has no valid purpose unless it serves nurses to help make their nursing better (Fawcett, 1989). In the study presented here, the Mead model formed the basis for the nursing care of Jason, a young patient who sustained a head injury, a puncture wound and lacerations to his face. Examination of the Mead model of nursing is followed by an account of why this model was used in preference to others as a framework for Jason's care. Three components of his nursing care--wound care, communication, involvement of relatives--are discussed in relation to both the model and current knowledge. It was concluded that as a structured way of planning and giving care, the Mead model lacks adequate guidelines. A less experienced nurse using the Mead model may overlook certain aspects of care, whereas an experienced nurse may use his/her knowledge to give a high standard of care using research-based information. However, models need to be tested so that they may be rejected or modified as guidelines for care, in this case in the United Kingdom, within a welfare-orientated society. PMID:1483020

  9. Sensor management using a new framework for observation modeling

    NASA Astrophysics Data System (ADS)

    Kolba, Mark P.; Collins, Leslie M.

    2009-05-01

    In previous work, a sensor management framework has been developed that manages a suite of sensors in a search for static targets within a grid of cells. This framework has been studied for binary, non-binary, and correlated sensor observations, and the sensor manager was found to outperform a direct search technique with each of these different types of observations. Uncertainty modeling for both binary and non-binary observations has also been studied. In this paper, a new observation model is introduced that is motivated by the physics of static target detection problems such as landmine detection and unexploded ordnance (UXO) discrimination. The new observation model naturally accommodates correlated sensor observations and models both the correlation that occurs between observations made by different sensors and the correlation that occurs between observations made by the same sensor. Uncertainty modeling is also implicitly incorporated into the observation model because the underlying parameters of the target and clutter cells are allowed to vary and are not assumed to be constant across target cells and across clutter cells. Sensor management is then performed by maximizing the expected information gain that is made with each new sensor observation. The performance of the sensor manager is examined through performance evaluation with real data from the UXO discrimination application. It is demonstrated that the sensor manager is able to provide comparable detection performance to a direct search strategy using fewer sensor observations than direct search. It is also demonstrated that the sensor manager is able to ignore features that are uninformative to the discrimination problem.
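
The "maximize expected information gain" rule mentioned above can be illustrated with a minimal sketch for a single binary sensor observing cells with known target priors. The detection and false-alarm rates below are invented for illustration, not taken from the paper's UXO data:

```python
import numpy as np

def entropy(p):
    """Binary entropy in bits, clipped for numerical safety."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_info_gain(prior, pd=0.9, pfa=0.1):
    """Expected entropy reduction from one more binary observation.
    pd / pfa (detection / false-alarm rates) are illustrative values."""
    p_pos = pd * prior + pfa * (1 - prior)        # P(observation = 1)
    post_pos = pd * prior / p_pos                 # posterior if obs = 1
    post_neg = (1 - pd) * prior / (1 - p_pos)     # posterior if obs = 0
    h_post = p_pos * entropy(post_pos) + (1 - p_pos) * entropy(post_neg)
    return entropy(prior) - h_post

priors = np.array([0.05, 0.5, 0.95])              # three cells
gains = expected_info_gain(priors)
# The manager observes the cell whose observation is most informative
print(int(np.argmax(gains)))                      # 1
```

With a symmetric sensor the most uncertain cell (prior 0.5) is the most informative to observe next, which is the intuition a direct search strategy lacks.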

  10. DEVELOP MULTI-STRESSOR, OPEN ARCHITECTURE MODELING FRAMEWORK FOR ECOLOGICAL EXPOSURE FROM SITE TO WATERSHED SCALE

    EPA Science Inventory

    A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...

  11. Receptor modeling application framework for particle source apportionment.

    PubMed

    Watson, John G; Zhu, Tan; Chow, Judith C; Engelbrecht, Johann; Fujita, Eric M; Wilson, William E

    2002-12-01

    Receptor models infer contributions from particulate matter (PM) source types using multivariate measurements of particle chemical and physical properties. Receptor models complement source models that estimate concentrations from emissions inventories and transport meteorology. Enrichment factor, chemical mass balance, multiple linear regression, eigenvector, edge detection, neural network, aerosol evolution, and aerosol equilibrium models have all been used to solve particulate air quality problems, and more than 500 citations of their theory and application document these uses. While elements, ions, and carbons were often used to apportion TSP, PM10, and PM2.5 among many source types, many of these components have been reduced in source emissions such that more complex measurements of carbon fractions, specific organic compounds, single particle characteristics, and isotopic abundances now need to be measured in source and receptor samples. Compliance monitoring networks are not usually designed to obtain data for the observables, locations, and time periods that allow receptor models to be applied. Measurements from existing networks can be used to form conceptual models that allow the needed monitoring network to be optimized. The framework for using receptor models to solve air quality problems consists of: (1) formulating a conceptual model; (2) identifying potential sources; (3) characterizing source emissions; (4) obtaining and analyzing ambient PM samples for major components and source markers; (5) confirming source types with multivariate receptor models; (6) quantifying source contributions with the chemical mass balance; (7) estimating profile changes and the limiting precursor gases for secondary aerosols; and (8) reconciling receptor modeling results with source models, emissions inventories, and receptor data analyses. PMID:12492167
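
Step (6), quantifying source contributions with the chemical mass balance, amounts to solving a linear system: ambient species concentrations equal source profiles times source contributions. A minimal sketch with invented numbers follows; real applications weight the fit by measurement uncertainty (the effective-variance solution) rather than using plain least squares:

```python
import numpy as np

# Columns: hypothetical source profiles (mass fraction of each chemical
# species per unit mass of PM from that source). Values are illustrative.
profiles = np.array([[0.30, 0.02],    # species 1 (e.g. elemental carbon)
                     [0.05, 0.25],    # species 2 (e.g. sulfate)
                     [0.10, 0.08]])   # species 3
true_contrib = np.array([8.0, 5.0])   # ug/m3 contributed by each source

ambient = profiles @ true_contrib     # what the receptor site "measures"

# Chemical mass balance: solve profiles @ s = ambient for s
s_hat, *_ = np.linalg.lstsq(profiles, ambient, rcond=None)
print(np.round(s_hat, 6))             # [8. 5.]
```

Here the system is noise-free, so least squares recovers the contributions exactly; with real measurements the residuals and uncertainty weighting carry the diagnostic information.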

  12. Modeling of Active Transmembrane Transport in a Mixture Theory Framework

    PubMed Central

    Ateshian, Gerard A.; Morrison, Barclay; Hung, Clark T.

    2010-01-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature. PMID:20213212

  13. Modeling the spectral solar irradiance in the SOTERIA Project Framework

    NASA Astrophysics Data System (ADS)

    Vieira, Luis Eduardo; Dudok de Wit, Thierry; Kretzschmar, Matthieu; Cessateur, Gaël

    The evolution of the radiative energy input is a key element in understanding the variability of the Earth's neutral and ionized atmospheric components. However, reliable observations are limited to the last decades, when observations from above the Earth's atmosphere became possible. These observations have provided insights about the variability of the spectral solar irradiance on time scales from days to years, but there are still large uncertainties on the evolution on time scales from decades to centuries. Here we discuss the physics-based modeling of the ultraviolet solar irradiance under development in the Solar-Terrestrial Investigations and Archives (SOTERIA) project framework. In addition, we compare the modeled solar emission with the variability observed by the LYRA instrument on board the Proba2 spacecraft.

  14. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using optimization ideas based on qualitative and quantitative features. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results about the selection of catalyst descriptors that match a target performance and the use of HTE data for refining fundamentals-based models will be discussed.

  15. An epidemiological framework for modelling fungicide dynamics and control.

    PubMed

    Castle, Matthew D; Gilligan, Christopher A

    2012-01-01

    Defining appropriate policies for controlling the spread of fungal disease in agricultural landscapes requires appropriate theoretical models. Most existing models for the fungicidal control of plant diseases do not explicitly include the dynamics of the fungicide itself, nor do they consider the impact of infection occurring during the host growth phase. We introduce a modelling framework for fungicide application that allows us to consider how "explicit" modelling of fungicide dynamics affects the invasion and persistence of plant pathogens. Specifically, we show that "explicit" models exhibit bistability zones for values of the basic reproductive number (R0) less than one within which the invasion and persistence threshold depends on the initial infection levels. This is in contrast to classical models where invasion and persistence thresholds are solely dependent on R0. In addition if initial infection occurs during the growth phase then an additional "invasion zone" can exist for even smaller values of R0. Within this region the system will experience an epidemic that is not able to persist. We further show that ideal fungicides with high levels of effectiveness, low rates of application and low rates of decay lead to the existence of these bistability zones. The results are robust to the inclusion of demographic stochasticity. PMID:22899992
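
A toy caricature conveys what "explicit" fungicide dynamics means: the dose decays over time and scales down transmission, so invasion can be delayed rather than prevented. The parameters below are invented for illustration and the model is far simpler than the one in the paper (no host growth phase, no bistability analysis):

```python
# Toy SIR-like model with an explicitly decaying fungicide dose F that
# suppresses the transmission rate. All parameters are illustrative.
beta, mu, delta, eps = 0.8, 0.1, 0.05, 0.9   # transmission, recovery, decay, efficacy
H, I, F = 0.99, 0.01, 1.0                    # healthy, infected, dose
dt, steps = 0.1, 2000
I_path = []
for _ in range(steps):
    infection = beta * (1.0 - eps * F) * H * I   # fungicide damps transmission
    H += dt * (-infection)
    I += dt * (infection - mu * I)
    F += dt * (-delta * F)                       # explicit fungicide decay
    I_path.append(I)

# Infection initially declines while F is active, but as F decays the
# effective transmission rate recovers and the epidemic still invades.
print(max(I_path) > 0.1)   # True
```

Even this crude sketch shows why models that ignore the fungicide's own dynamics can misjudge invasion thresholds: the outcome depends on the race between dose decay and epidemic growth.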

  16. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. PMID:21905066
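
The net-benefit idea can be illustrated with the classic event-count version (the paper's contribution is to express net benefit in life years and handle multi-study estimation, which is not reproduced here). The patients and risks below are hypothetical:

```python
import numpy as np

def net_benefit(risk, event, pt):
    """Classic decision-curve net benefit per person at risk threshold pt:
    NB = TP/n - FP/n * pt/(1-pt), treating everyone with risk >= pt."""
    treat = risk >= pt
    n = len(risk)
    tp = np.sum(treat & (event == 1)) / n
    fp = np.sum(treat & (event == 0)) / n
    return tp - fp * pt / (1 - pt)

# Ten hypothetical patients: model-predicted risks and observed outcomes
risk = np.array([0.05, 0.1, 0.3, 0.6, 0.9, 0.8, 0.2, 0.7, 0.15, 0.4])
event = np.array([0, 0, 0, 1, 1, 1, 0, 1, 0, 0])

print(net_benefit(risk, event, 0.5))                 # 0.4
print(net_benefit(np.ones_like(risk), event, 0.5))   # -0.2 (treat everyone)
```

At a 50% treatment threshold the hypothetical model beats the treat-all policy; the framework in the abstract converts this kind of comparison into life years so it can feed health economic evaluation.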

  17. A Categorical Framework for Model Classification in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation have been among the first and most successful examples of triggering computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology lead to a rather mixed reputation of models in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study. We thus focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seem highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  18. Proposed framework for thermomechanical life modeling of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. Some experimental data exist for assessing the plausibility of the proposed

  19. Quasi-3D Multi-scale Modeling Framework Development

    NASA Astrophysics Data System (ADS)

    Arakawa, A.; Jung, J.

    2008-12-01

    When models are truncated in or near an energetically active range of the spectrum, model physics must be changed as the resolution changes. The model physics of GCMs and that of CRMs are, however, quite different from each other and at present there is no unified formulation of model physics that automatically provides transition between these model physics. The Quasi-3D (Q3D) Multi-scale Modeling Framework (MMF) is an attempt to bridge this gap. Like the recently proposed Heterogeneous Multiscale Method (HMM) (E and Engquist 2003), MMF combines a macroscopic model, GCM, and a microscopic model, CRM. Unlike the traditional multiscale methods such as the multi-grid and adapted mesh refinement techniques, HMM and MMF are for solving multi-physics problems. They share the common objective "to design combined macroscopic-microscopic computational methods that are much more efficient than solving the full microscopic model and at the same time give the information we need" (E et al. 2008). The question is then how to meet this objective in practice, which can be highly problem dependent. In HMM, the efficiency is gained typically by localization of the microscale problem. Following the pioneering work by Grabowski and Smolarkiewicz (1999) and Grabowski (2001), MMF takes advantage of the fact that 2D CRMs are reasonably successful in simulating deep clouds. In this approach, the efficiency is gained by sacrificing the three-dimensionality of cloud-scale motion. It also "localizes" the algorithm through embedding a CRM in each GCM grid box using cyclic boundary conditions. The Q3D MMF is an attempt to reduce the expense due to these constraints by partially including the cloud-scale 3D effects and extending the CRM beyond individual GCM grid boxes. As currently formulated, the Q3D MMF is a 4D estimation/prediction framework that combines a GCM with a 3D anelastic cloud-resolving vector vorticity equation model (VVM) applied to a network of horizontal grids. The network

  20. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  1. Assessment of solution uncertainties in single-column modeling frameworks

    SciTech Connect

    Hack, J.J.; Pedretti, J.A.

    2000-01-15

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.
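
The sensitivity to nonlinearity that motivates the ensemble recommendation can be caricatured with a toy perturbed-ensemble experiment; the logistic map below merely stands in for nonlinear parameterized physics and has nothing to do with the CCM3 package itself:

```python
import numpy as np

# Sixteen "ensemble members" differing only by tiny initial perturbations;
# a chaotic logistic map stands in for nonlinear parameterized physics.
rng = np.random.default_rng(0)
members = 0.5 + 1e-6 * rng.standard_normal(16)   # perturbed initial states
r = 3.9                                          # chaotic regime
for _ in range(100):
    members = r * members * (1 - members)

spread = members.std()
print(spread > 1e-3)   # True: micro-perturbations grow into macro-spread
```

A single member of such a system tells you little about the solution; the ensemble spread is the quantity that characterizes the inherent uncertainty, which is the manuscript's recommendation for SCM studies.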

  2. Modelling grain growth in the framework of Rational Extended Thermodynamics

    NASA Astrophysics Data System (ADS)

    Kertsch, Lukas; Helm, Dirk

    2016-05-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena.
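
A minimal Hillert-type mean-field sketch (grains grow or shrink according to their size relative to a critical radius) conveys the kind of model being extended; the mobility, radii, and removal threshold below are arbitrary, and this caricature does not conserve volume the way careful formulations do:

```python
import numpy as np

# Mean-field grain growth: dR/dt = M*gamma*(1/R_cr - 1/R), with the
# critical radius taken here as the current mean radius. Grains shrinking
# below a cutoff are removed, so the surviving population coarsens.
rng = np.random.default_rng(0)
M_gamma = 1.0                          # illustrative mobility * energy
R = rng.uniform(0.5, 1.5, 500)         # initial grain radii (arbitrary units)
mean0 = R.mean()
dt = 1e-3
for _ in range(2000):
    R_cr = R.mean()
    R = R + dt * M_gamma * (1.0 / R_cr - 1.0 / R)
    R = R[R > 0.05]                    # sub-cutoff grains disappear

print(R.size, round(R.mean(), 3))      # fewer, larger grains than initially
```

Small grains vanish while large ones grow, so the mean radius increases over time, which is the coarsening behavior the thermodynamically consistent model in the abstract reproduces against copper data.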

  3. Sensor models and a framework for sensor management

    NASA Astrophysics Data System (ADS)

    Gaskell, Alex P.; Probert, Penelope J.

    1993-08-01

    We describe the use of Bayesian belief networks and decision theoretic principles for sensor management in multi-sensor systems. This framework provides a way of representing sensory data and choosing actions under uncertainty. The work considers how to distribute functionality between sensors and the controller. Use is made of logical sensors based on complementary physical sensors to provide information at the task level of abstraction represented within the network. We are applying these methods in the area of low level planning in mobile robotics. A key feature of the work is the development of quantified models to represent diverse sensors, in particular the sonar array and infra-red triangulation sensors we use on our AGV. We need to develop a model which can handle these very different sensors but provides a common interface to the sensor management process. We do this by quantifying the uncertainty through probabilistic models of the sensors, taking into account their physical characteristics and interaction with the expected environment. Modelling the sensor characteristics to an appropriate level of detail has the advantage of giving more accurate and robust mapping between the physical and logical sensor, as well as a better understanding of environmental dependency and its limitations. We describe a model of a sonar array, which explicitly takes into account features such as beam-width and ranging errors, and its integration into the sensor management process.

  4. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithms for fertilization and P leaching, contributed the largest output uncertainties. In comparison, the initiation of inorganic P in the soil layer and the transformation algorithm between P pools are less sensitive for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.
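
The coefficient-of-variation summary used above to characterize structure-induced uncertainty is straightforward to compute; the loads below are invented placeholders, not the study's SWAT output:

```python
import numpy as np

# Hypothetical NPS-P loads (t/yr) simulated with alternative structural
# choices for the P cycle. Values are illustrative only.
loads = np.array([12.1, 12.9, 11.8, 12.5, 13.0])

cv = loads.std(ddof=1) / loads.mean()   # coefficient of variation
print(round(cv, 3))                     # 0.041
```

A CV of a few percent across structural variants, as in the reported 0.028-0.086 range, signals that structural choices move the prediction far less than input and parameter uncertainty do.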

  5. Extreme Precipitation in a Multi-Scale Modeling Framework

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Denning, S.; Arabi, M.

    2015-12-01

    Extreme precipitation events are characterized by infrequent but large-magnitude accumulations that generally occur on scales below that resolved by the typical Global Climate Model. The Multi-scale Modeling Framework allows information about precipitation on these scales to be simulated for long periods of time without the large computational resources required for the use of a full cloud-permitting model. The Community Earth System Model was run for 30 years in both its MMF and GCM modes, and the annual maximum series of 24 hour precipitation accumulations were used to estimate the parameters of statistical distributions. The distributions generated from model output were then fit to a General Extreme Value distribution and evaluated against observations. These results indicate that the MMF produces extreme precipitation with a statistical distribution that closely resembles that of observations, shows an improvement over the traditional GCM, and motivates the continued use of the MMF for the analysis of extreme precipitation. The improvement in statistical distributions of annual maxima is greatest in regions that are dominated by convective precipitation, where the small-scale information provided by the MMF heavily influences precipitation processes.
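
Fitting annual maxima to a Generalized Extreme Value (GEV) distribution, as described above, is a short exercise with scipy; the synthetic "annual maxima" below stand in for the MMF/GCM output, and all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# Stand-in for 30 annual maxima of 24 h precipitation (mm); real input
# would come from the MMF and GCM runs described in the abstract.
annual_max = genextreme.rvs(c=-0.1, loc=50.0, scale=10.0, size=30,
                            random_state=rng)

# Maximum-likelihood GEV fit (scipy's shape c is the negative of the
# conventional xi)
shape, loc, scale = genextreme.fit(annual_max)

# 100-year return level: the quantile exceeded with probability 1/100
rl_100 = genextreme.ppf(1 - 1.0 / 100, shape, loc, scale)
print(rl_100 > loc)   # True: the return level lies in the upper tail
```

Comparing return levels (or the fitted parameters themselves) between the MMF, the GCM, and observations is the kind of evaluation the abstract describes.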

  6. A Data Driven Framework for Integrating Regional Climate Models

    NASA Astrophysics Data System (ADS)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate sensitive issues of concern to decision-makers and policy planners at a regional level. Deciding how to allocate scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analyses of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high

  7. LAMMPS framework for dynamic bonding and an application modeling DNA

    NASA Astrophysics Data System (ADS)

    Svaneborg, Carsten

    2012-08-01

    We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework.
    Catalogue identifier: AEME_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEME_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public Licence
    No. of lines in distributed program, including test data, etc.: 2 243 491
    No. of bytes in distributed program, including test data, etc.: 771
    Distribution format: tar.gz
    Programming language: C++
    Computer: Single and multiple core servers
    Operating system: Linux/Unix/Windows
    Has the code been vectorized or parallelized?: Yes. The code has been parallelized by the use of MPI directives.
    RAM: 1 Gb
    Classification: 16.11, 16.12
    Nature of problem: Simulating coarse-grain models capable of chemistry, e.g. DNA hybridization dynamics.
    Solution method: Extending LAMMPS to handle dynamic bonding and directional bonds.
    Unusual features: Allows bonds to be created and broken while angular and dihedral interactions are kept consistent.
    Additional comments: The distribution file for this program is approximately 36 Mbytes and therefore is not delivered directly
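
The bond-dynamics bookkeeping (stochastic formation, breakage, and a per-bead functionality cap) can be sketched without LAMMPS; the probabilities below are arbitrary, and this toy ignores geometry, bond types, and the angular/dihedral updates the framework handles:

```python
import random

# Toy stochastic bond dynamics: random bead pairs form a bond with
# probability p_form per step, existing bonds break with probability
# p_break, and each bead's functionality is capped at max_bonds.
random.seed(2)
n_beads, p_form, p_break, max_bonds = 10, 0.3, 0.05, 2
bonds = set()                 # bonds stored as (low_id, high_id) pairs
degree = [0] * n_beads        # current number of bonds per bead

for step in range(200):
    # attempt bond formation between one random pair
    i, j = random.sample(range(n_beads), 2)
    pair = (min(i, j), max(i, j))
    if (pair not in bonds and degree[i] < max_bonds
            and degree[j] < max_bonds and random.random() < p_form):
        bonds.add(pair)
        degree[i] += 1
        degree[j] += 1
    # attempt breakage of one existing bond
    if bonds and random.random() < p_break:
        a, b = bonds.pop()
        degree[a] -= 1
        degree[b] -= 1

print(max(degree) <= max_bonds)   # True: the functionality cap is respected
```

In the actual LAMMPS extension the formation step additionally checks distances and bead/bond types, and the topology update inserts or removes the matching angular and dihedral interactions.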

  8. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  9. Legitimising data-driven models: exemplification of a new data-driven mechanistic modelling framework

    NASA Astrophysics Data System (ADS)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-07-01

    In this paper the difficult problem of how to legitimise data-driven hydrological models is addressed using an example of a simple artificial neural network modelling problem. Many data-driven models in hydrology have been criticised for their black-box characteristics, which prohibit adequate understanding of their mechanistic behaviour and restrict their wider heuristic value. In response, we present a new generic data-driven mechanistic modelling framework. The framework is significant because it incorporates an evaluation of the legitimacy of a data-driven model's internal modelling mechanism as a core element of the modelling process. The framework's value is demonstrated by two simple artificial neural network river forecasting scenarios. We develop a novel adaptation of first-order, partial-derivative relative sensitivity analysis to enable each model's mechanistic legitimacy to be evaluated within the framework. The results demonstrate the limitations of standard goodness-of-fit validation procedures by highlighting how the internal mechanisms of complex models that produce the best fit scores can have lower mechanistic legitimacy than simpler counterparts whose scores are only slightly inferior. Thus, our study directly tackles one of the key debates in data-driven hydrological modelling: is it acceptable for our ends (i.e. model fit) to justify our means (i.e. the numerical basis by which that fit is achieved)?
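    The first-order relative sensitivity idea can be sketched generically: the sensitivity of output f to input x_j is (∂f/∂x_j)·x_j/f(x). The finite-difference implementation below is an illustration of the general concept, not the paper's specific adaptation.

```python
def relative_sensitivity(f, x, j, h=1e-6):
    """First-order relative sensitivity of f to input j at point x:
    (df/dx_j) * x_j / f(x), via central finite differences."""
    xp, xm = list(x), list(x)
    xp[j] += h
    xm[j] -= h
    dfdx = (f(xp) - f(xm)) / (2 * h)
    return dfdx * x[j] / f(x)

# For a linear "model" f(x) = 2*x0 + 5*x1, the relative sensitivity to
# x_j is w_j * x_j / f(x), so the result can be checked by hand.
f = lambda x: 2 * x[0] + 5 * x[1]
x = [1.0, 2.0]
s0 = relative_sensitivity(f, x, 0)   # 2*1/12
s1 = relative_sensitivity(f, x, 1)   # 5*2/12
assert abs(s0 - 2 / 12) < 1e-6 and abs(s1 - 10 / 12) < 1e-6
```

    Applied to a trained neural network, the same scheme profiles how strongly each input drives the forecast, which is what the mechanistic-legitimacy evaluation interrogates.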

  10. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.
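    The quoted scaling figures can be sanity-checked with a back-of-envelope calculation: an 80-fold speedup when moving from 30 to 3,335 cores corresponds to roughly 72% parallel efficiency relative to ideal linear scaling.

```python
# Parallel efficiency implied by the figures in the abstract:
# speedup relative to ideal (linear) scaling over the same core range.
cores_base, cores_big = 30, 3335
speedup = 80.0
ideal = cores_big / cores_base      # ideal linear speedup, ~111x
efficiency = speedup / ideal        # ~0.72
assert 0.70 < efficiency < 0.74
```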

  11. A hierarchical modeling framework for multiple observer transect surveys.

    PubMed

    Conn, Paul B; Laake, Jeffrey L; Johnson, Devin S

    2012-01-01

    Ecologists often use multiple observer transect surveys to census animal populations. In addition to animal counts, these surveys produce sequences of detections and non-detections for each observer. When combined with additional data (e.g. covariates such as distance from the transect line), these sequences provide the additional information needed to estimate absolute abundance when detectability on the transect line is less than one. Although existing analysis approaches for such data have proven extremely useful, they have some limitations. For instance, it is difficult to extrapolate from observed areas to unobserved areas unless a rigorous sampling design is adhered to; it is also difficult to share information across spatial and temporal domains or to accommodate habitat-abundance relationships. In this paper, we introduce a hierarchical modeling framework for multiple observer line transects that removes these limitations. In particular, abundance intensities can be modeled as a function of habitat covariates, making it easier to extrapolate to unsampled areas. Our approach relies on a complete data representation of the state space, where unobserved animals and their covariates are modeled using a reversible jump Markov chain Monte Carlo algorithm. Observer detections are modeled via a bivariate normal distribution on the probit scale, with dependence induced by a distance-dependent correlation parameter. We illustrate performance of our approach with simulated data and on a known population of golf tees. In both cases, we show that our hierarchical modeling approach yields accurate inference about abundance and related parameters. In addition, we obtain accurate inference about population-level covariates (e.g. group size). We recommend that ecologists consider using hierarchical models when analyzing multiple-observer transect data, especially when it is difficult to rigorously follow pre-specified sampling designs. We provide a new R package, hierarchical
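    The bivariate-probit detection component can be illustrated in isolation. The sketch below (a Monte Carlo stand-in, not the paper's RJMCMC machinery; all numbers are invented) shows how a correlation on the probit scale changes the probability that both observers detect an animal.

```python
import math
import random
from statistics import NormalDist

def joint_detection_prob(p1, p2, rho, n=200_000, seed=1):
    """Monte Carlo estimate of P(both observers detect) under a bivariate
    normal model on the probit scale with correlation rho; p1 and p2 are
    the marginal detection probabilities setting the probit thresholds."""
    rng = random.Random(seed)
    c1 = NormalDist().inv_cdf(p1)
    c2 = NormalDist().inv_cdf(p2)
    hits = 0
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        # Correlated second latent variable.
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
        if z1 < c1 and z2 < c2:
            hits += 1
    return hits / n

# With rho = 0 the observers are independent: P(both) = p1 * p2 = 0.56.
p_indep = joint_detection_prob(0.8, 0.7, rho=0.0)
assert abs(p_indep - 0.56) < 0.01
# A positive, e.g. distance-induced, correlation raises the joint probability.
assert joint_detection_prob(0.8, 0.7, rho=0.6) > p_indep
```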

  12. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, exposing their metadata through BMI functions. Once a BMI-enabled model is exposed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, as well as providing a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. Using the revised EMELI, an example will be presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. 
(2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
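    The BMI pattern the abstract relies on can be sketched with a toy model. This is a simplified, hypothetical illustration of the interface style (initialize/update plus introspection functions a coupler can harvest); the real CSDMS BMI specification defines many more methods and exact signatures.

```python
class LinearReservoirBMI:
    """Toy storage model dS/dt = inflow - k*S, stepped with forward Euler,
    wrapped in a minimal BMI-like interface."""

    def initialize(self, k=0.1, dt=1.0, storage=100.0, inflow=2.0):
        self.k, self.dt = k, dt
        self.state = {"storage": storage, "inflow": inflow}
        self.time = 0.0

    # Introspection: the self-describing metadata a framework would query.
    def get_output_var_names(self):
        return ("storage",)

    def get_var_units(self, name):
        return {"storage": "m3", "inflow": "m3 s-1"}[name]

    def get_value(self, name):
        return self.state[name]

    def update(self):
        s = self.state["storage"]
        self.state["storage"] = s + self.dt * (self.state["inflow"] - self.k * s)
        self.time += self.dt

model = LinearReservoirBMI()
model.initialize()
for _ in range(10):
    model.update()
# Storage relaxes toward the equilibrium inflow/k = 20.
assert model.get_value("storage") < 100.0
```

    A web-service wrapper would expose exactly these calls over HTTP, which is what lets EMELI drive remote models without knowing their internals.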

  13. The Pretzelosity Distribution Function and Intrinsic Motion of the Constituents in Nucleon

    SciTech Connect

    Efremov, A. V.; Teryaev, O. V.; Schweitzer, P.; Zavada, P.

    2009-08-04

    The pretzelosity distribution function h_{1T}^⊥ is studied in a covariant quark-parton model which describes the structure of the nucleon in terms of 3D quark intrinsic motion. This relativistic model framework supports the relation between helicity, transversity and pretzelosity observed in other relativistic models without assuming SU(6) spin-flavor symmetry. Numerical results and predictions for SIDIS experiments are presented.
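    The relation referred to is commonly quoted in this class of relativistic models (stated here from the broader literature, not extracted from this record): the transverse moment of pretzelosity measures the difference between helicity and transversity,

```latex
g_1^a(x) - h_1^a(x) = h_{1T}^{\perp(1)a}(x),
\qquad
h_{1T}^{\perp(1)a}(x) \equiv \int \mathrm{d}^2\mathbf{p}_T\,
\frac{\mathbf{p}_T^{\,2}}{2M^2}\, h_{1T}^{\perp a}\!\left(x,\mathbf{p}_T^{\,2}\right).
```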

  14. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the forthcoming years, with an extended need for a well-educated workforce; ongoing education and training therefore play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards are gaining significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions within this initiative, together with the growth and maturity of geospatial Open Source software, prompted the idea of developing a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification is already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, and ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). These focus on different fields of expertise and have different levels and modes of examination, offered for a wide range of fees. The development of the certification framework presented here is based on an analysis of diverse body-of-knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  15. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research instead extracts, from a group of n customers, the historical data for the sum insured s_i of the i-th customer together with the amount paid y_ij and the amount a_ij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (y_i,j+1, a_i,j+1) as dependent on the present-year value (y_ij, a_ij) and the sum insured s_i via a conditional distribution derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to have good ability to cover the observed aggregate claim liabilities.
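    The last step — reading a prediction interval for the aggregate liability off a simulated distribution — can be sketched generically. The lognormal stand-in below is purely illustrative; the paper's conditional distribution comes from a multivariate power-normal mixture, and all numbers here are invented.

```python
import random

def simulate_aggregate(n_customers=50, n_sims=5000, seed=42):
    """Monte Carlo of aggregate claim liabilities with a toy per-customer
    claim model; returns an empirical 95% prediction interval and mean."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        total = 0.0
        for i in range(n_customers):
            s_i = 10.0 + (i % 5)              # sum insured (stand-in values)
            y = rng.lognormvariate(0.0, 0.5)  # paid-amount factor (stand-in)
            total += s_i * y
        totals.append(total)
    totals.sort()
    lo = totals[int(0.025 * n_sims)]          # 2.5th percentile
    hi = totals[int(0.975 * n_sims)]          # 97.5th percentile
    return lo, hi, sum(totals) / n_sims

lo, hi, mean = simulate_aggregate()
assert lo < mean < hi
```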

  16. Investigating GPDs in the framework of the double distribution model

    NASA Astrophysics Data System (ADS)

    Nazari, F.; Mirjalili, A.

    2016-06-01

    In this paper, we construct the generalized parton distribution (GPD) in terms of the kinematical variables x, ξ, t, using the double distribution model. By employing these functions, we can extract quantities which make it possible to gain a three-dimensional insight into the nucleon structure at the parton level. The main objective of GPDs is to combine and generalize the concepts of ordinary parton distributions and form factors. They also provide an exclusive framework to describe the nucleon in terms of quarks and gluons. Here, we first calculate, in the double distribution model, the GPD based on the usual parton distributions arising from the GRV and CTEQ phenomenological models. Obtaining the quark and gluon angular momenta from the GPD, we are able to calculate scattering observables related to spin asymmetries of the produced quarkonium. These quantities are denoted A_N and A_LS. We also calculate the Pauli and Dirac form factors in deeply virtual Compton scattering. Finally, in order to compare our results with the existing experimental data, we use the difference of the polarized cross-sections for an initial longitudinal leptonic beam and unpolarized target particles (Δσ_LU). In all cases, our results are in good agreement with the available experimental data.
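    For orientation, the double distribution construction referred to is usually written in the standard (Radyushkin) form, in which the GPD is projected out of a double distribution F^q(β, α, t); conventions vary and the D-term addendum is omitted here:

```latex
H^q(x,\xi,t) \;=\; \int_{-1}^{1}\!\mathrm{d}\beta
\int_{-1+|\beta|}^{1-|\beta|}\!\mathrm{d}\alpha\;
\delta\!\left(x-\beta-\alpha\xi\right)\, F^q(\beta,\alpha,t).
```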

  17. Evolution of Climate Science Modelling Language within international standards frameworks

    NASA Astrophysics Data System (ADS)

    Lowe, Dominic; Woolf, Andrew

    2010-05-01

    The Climate Science Modelling Language (CSML) was originally developed as part of the NERC Data Grid (NDG) project in the UK. It was one of the first Geography Markup Language (GML) application schemas describing complex feature types for the metocean domain. CSML feature types can be used to describe typical climate products such as model runs or atmospheric profiles. CSML has been successfully used within NDG to provide harmonised access to a number of different data sources. For example, meteorological observations held in heterogeneous databases by the British Atmospheric Data Centre (BADC) and Centre for Ecology and Hydrology (CEH) were served uniformly as CSML features via Web Feature Service. CSML has now been substantially revised to harmonise it with the latest developments in OGC and ISO conceptual modelling for geographic information. In particular, CSML is now aligned with the near-final ISO 19156 Observations & Measurements (O&M) standard. CSML combines the O&M concept of 'sampling features' together with an observation result based on the coverage model (ISO 19123). This general pattern is specialised for particular data types of interest, classified on the basis of sampling geometry and topology. In parallel work, the OGC Met Ocean Domain Working Group has established a conceptual modelling activity. This is a cross-organisational effort aimed at reaching consensus on a common core data model that could be re-used in a number of met-related application areas: operational meteorology, aviation meteorology, climate studies, and the research community. It is significant to note that this group has also identified sampling geometry and topology as a key classification axis for data types. Using the Model Driven Architecture (MDA) approach as adopted by INSPIRE we demonstrate how the CSML application schema is derived from a formal UML conceptual model based on the ISO TC211 framework. 
By employing MDA tools which map consistently between UML and GML we

  18. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    SciTech Connect

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO2 capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO2 separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO2 capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO2 capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO2 capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO2 capture systems have been integrated into the IECM-cs, along with models to estimate CO2 transport and storage costs. The CO2 control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO2 control. 
The integrated model is applied to

  19. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  20. Models of Recognition, Repetition Priming, and Fluency : Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  1. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost-effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands-off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
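    A "hands-off flow plus rising block" licence of the kind described can be sketched as a simple rule: no abstraction below a hands-off flow (HOF) threshold, with additional abstraction entitlement unlocked in blocks as river flow rises. All thresholds and block sizes below are invented for illustration.

```python
def permitted_abstraction(flow, hof=1.0,
                          blocks=((2.0, 0.2), (4.0, 0.5), (8.0, 1.0))):
    """Permitted abstraction rate (same units as flow) under a
    hands-off-flow + rising-block rule: zero below the HOF, then each
    (threshold, extra) block adds entitlement once flow exceeds it."""
    if flow <= hof:
        return 0.0
    rate = 0.0
    for threshold, extra in blocks:
        if flow > threshold:
            rate += extra
    return rate

assert permitted_abstraction(0.8) == 0.0            # below HOF: hands off
assert abs(permitted_abstraction(3.0) - 0.2) < 1e-9  # first block unlocked
assert abs(permitted_abstraction(10.0) - 1.7) < 1e-9 # all blocks unlocked
```

    Coupled to a rainfall-runoff model's simulated flow series, a rule like this is what gets stress-tested against e-flow targets under the climate scenarios.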

  2. A modeling framework for potential induced degradation in PV modules

    NASA Astrophysics Data System (ADS)

    Bermel, Peter; Asadpour, Reza; Zhou, Chao; Alam, Muhammad A.

    2015-09-01

    Major sources of performance degradation and failure in glass-encapsulated PV modules include moisture-induced gridline corrosion, potential-induced degradation (PID) of the cell, and stress-induced busbar delamination. Recent studies have shown that PV modules operating in damp heat at -600 V are vulnerable to large amounts of degradation, potentially up to 90% of the original power output within 200 hours. To improve module reliability and restore power production in the presence of PID and other failure mechanisms, a fundamental rethinking of accelerated testing is needed. This in turn will require an improved understanding of technology choices made early in development that impact failures later. In this work, we present an integrated approach of modeling, characterization, and validation to address these problems. A hierarchical modeling framework allows us to clarify the mechanisms of corrosion, PID, and delamination. We employ a physics-based compact model of the cell, topology of the electrode interconnection, geometry of the packaging stack, and environmental operating conditions to predict the current, voltage, temperature, and stress distributions in PV modules correlated with the acceleration of specific degradation modes. A self-consistent solution captures the essential complexity of the technology-specific acceleration of PID and other degradation mechanisms as a function of illumination, ambient temperature, and relative humidity. Initial results from our model include specific lifetime predictions suitable for direct comparison with indoor and outdoor experiments, which are qualitatively validated by prior work. This approach could play a significant role in developing novel accelerated lifetime tests.

  3. A trajectory generation framework for modeling spacecraft entry in MDAO

    NASA Astrophysics Data System (ADS)

    D'Souza, Sarah N.; Sarigul-Klijn, Nesrin

    2016-04-01

    In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.

  4. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show improved simulation speed compared to previously published results of EN treatment simulation while obtaining similar porosity-reduction results. We mainly focused on readily available, commercial particle sizes of 2 nm and 20 nm, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected have a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054, giving a 95% confidence interval (alpha = 0.05) of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles in pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. 
This may be due to particles

  5. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient

  6. Digital Moon: A three-dimensional framework for lunar modeling

    NASA Astrophysics Data System (ADS)

    Paige, D. A.; Elphic, R. C.; Foote, E. J.; Meeker, S. R.; Siegler, M. A.; Vasavada, A. R.

    2009-12-01

    The Moon has a complex three-dimensional shape with significant large-scale and small-scale topographic relief. The Moon’s topography largely controls the distribution of incident solar radiation, as well as the scattered solar and infrared radiation fields. Topography also affects the Moon’s interaction with the space environment, its magnetic field, and the propagation of seismic waves. As more extensive and detailed lunar datasets become available, there is an increasing need to interpret and compare them with the results of physical models in a fully three-dimensional context. We have developed a three-dimensional framework for lunar modeling we call the Digital Moon. The goal of this work is to enable high fidelity physical modeling and visualization of the Moon in a parallel computing environment. The surface of the Moon is described by a continuous triangular mesh of arbitrary shape and spatial scale. For regions of limited geographic extent, it is convenient to employ meshes on a rectilinear grid. However for global-scale modeling, we employ a continuous geodesic gridding scheme (Teanby, 2008). Each element in the mesh surface is allowed to have a unique set of physical properties. Photon and particle interactions between mesh elements are modeled using efficient ray tracing algorithms. Heat, mass, photon and particle transfer within each mesh element are modeled in one dimension. Each compute node is assigned a portion of the mesh and collective interactions between elements are handled through network interfaces. We have used the model to calculate lunar surface and subsurface temperatures that can be compared directly with radiometric temperatures measured by the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter. The model includes realistic surface photometric functions based on goniometric measurements of lunar soil samples (Foote and Paige, 2009), and one-dimensional thermal models based on lunar remote sensing and Apollo
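    One elementary ingredient of such a framework — the direct solar flux on a single mesh facet from its outward normal and the sun direction — can be sketched as follows. This is an illustrative simplification that ignores self-shadowing, scattered radiation, and albedo; the function name and parameters are invented.

```python
import math

def direct_flux(normal, sun_dir, solar_constant=1361.0):
    """Direct solar flux (W/m^2) incident on a facet with the given
    outward normal; zero when the sun is below the facet's horizon."""
    dot = sum(n * s for n, s in zip(normal, sun_dir))
    norm = (math.sqrt(sum(n * n for n in normal))
            * math.sqrt(sum(s * s for s in sun_dir)))
    return solar_constant * max(0.0, dot / norm)

# Facet facing straight up, sun 60 degrees from zenith: flux = S0 * cos(60).
flux = direct_flux((0.0, 0.0, 1.0),
                   (math.sin(math.radians(60)), 0.0, math.cos(math.radians(60))))
assert abs(flux - 1361.0 * 0.5) < 1e-6
```

    In the full model this per-facet calculation is combined with ray-traced visibility between mesh elements and a one-dimensional subsurface thermal model per element.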

  7. Modelling competition and dispersal in a statistical phylogeographic framework.

    PubMed

    Ranjard, Louis; Welch, David; Paturel, Marie; Guindon, Stéphane

    2014-09-01

    Competition between organisms influences the processes governing the colonization of new habitats. As a consequence, species or populations arriving first at a suitable location may prevent secondary colonization. Although adaptation to environmental variables (e.g., temperature, altitude, etc.) is essential, the presence or absence of certain species at a particular location often depends on whether or not competing species co-occur. For example, competition is thought to play an important role in structuring the assembly of mammalian communities. It can also explain spatial patterns of low genetic diversity following rapid colonization events or the "progression rule" displayed by phylogenies of species found on archipelagos. Despite the potential of competition to maintain populations in isolation, past quantitative analyses have largely ignored it because of the difficulty in designing adequate methods for assessing its impact. We present here a new model that integrates competition and dispersal into a Bayesian phylogeographic framework. Extensive simulations and analysis of real data show that our approach clearly outperforms the traditional Mantel test for detecting correlation between genetic and geographic distances. But most importantly, we demonstrate that competition can be detected with high sensitivity and specificity from the phylogenetic analysis of genetic variation in space. PMID:24929898

  8. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    USGS Publications Warehouse

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution models (SDMs) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type I and type II errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  9. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    The Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates. - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates. - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights. Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  10. 3D Geological Framework Models as a Teaching Aid for Geoscience

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Ward, E.; Geological ModelsTeaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts, as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three-dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience, including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of Geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions, which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or to add to experienced students’ body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey, or students and tutors can select certain areas of the model

  11. Simulation-optimization framework for multi-season hybrid stochastic models

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K. P.

    2011-07-01

    A novel simulation-optimization framework is proposed that enables the automation of the hybrid stochastic modeling process for synthetic generation of multi-season streamflows. This framework aims to minimize the drudgery, judgment and subjectivity involved in the selection of the most appropriate hybrid stochastic model. It consists of a multi-objective optimization model as the driver and the hybrid multi-season stochastic streamflow generation model, hybrid matched block bootstrap (HMABB), as the simulation engine. For the estimation of the hybrid model parameters, the proposed framework employs objective functions that aim to minimize the overall errors in the preservation of storage capacities at various demand levels, unlike the traditional approaches that are simulation based. Moreover, this framework yields a number of competent hybrid stochastic models in a single run of the simulation-optimization framework. The efficacy of the proposed simulation-optimization framework is brought out through application to two monthly streamflow data sets from the USA of varying sample sizes that exhibit multi-modality and a complex dependence structure. The results show that the hybrid models obtained from the proposed framework are able to preserve the statistical characteristics as well as the storage characteristics better than the simulation-based HMABB model, while minimizing the manual effort and the subjectivity involved in the modeling process. The proposed framework can be easily extended to model multi-site multi-season streamflow data.
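    The generic idea underlying matched-block bootstrap schemes such as HMABB is resampling contiguous blocks of the observed series so that short-range dependence is preserved. A plain moving-block bootstrap can be sketched as follows; this is an illustration of the family of methods, not the HMABB algorithm itself, and the monthly series is invented.

```python
import numpy as np

def block_bootstrap(series, block_len, n_out, seed=0):
    """Moving-block bootstrap: resample contiguous blocks of an observed
    series to preserve short-range dependence in the synthetic series.
    A generic sketch, not the matched-block HMABB scheme itself."""
    rng = np.random.default_rng(seed)
    n_blocks = int(np.ceil(n_out / block_len))
    starts = rng.integers(0, len(series) - block_len + 1, size=n_blocks)
    blocks = [series[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n_out]

flows = np.sin(np.linspace(0, 20, 240)) + 1.5   # toy 20-year monthly series
synth = block_bootstrap(flows, block_len=12, n_out=240)
```

    Because every synthetic value is drawn from an observed block, marginal statistics and within-block (e.g. seasonal) dependence are inherited from the data; the paper's contribution is wrapping such a generator in a multi-objective optimization that also targets storage characteristics.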

  12. Legitimising neural network river forecasting models: a new data-driven mechanistic modelling framework

    NASA Astrophysics Data System (ADS)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-01-01

    In this paper we address the difficult problem of gaining an internal, mechanistic understanding of a neural network river forecasting (NNRF) model. Neural network models in hydrology have long been criticised for their black-box character, which prohibits adequate understanding of their modelling mechanisms and has limited their broad acceptance by hydrologists. In response, we here present a new, data-driven mechanistic modelling (DDMM) framework that incorporates an evaluation of the legitimacy of a neural network's internal modelling mechanism as a core element in the model development process. The framework is exemplified for two NNRF modelling scenarios, and uses a novel adaptation of first-order, partial derivative, relative sensitivity analysis methods as the means by which each model's mechanistic legitimacy is explored. The results demonstrate the limitations of standard, goodness-of-fit validation procedures applied by NNRF modellers, by highlighting how the internal mechanisms of complex models that produce the best fit scores can have much lower legitimacy than simpler counterparts whose scores are only slightly inferior. The study emphasises the urgent need for better mechanistic understanding of neural network-based hydrological models and the further development of methods for elucidating their mechanisms.
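    First-order relative sensitivity analysis of the kind referenced here can be sketched generically with central differences; the one-neuron "network" below is an invented stand-in for a trained NNRF model, not the paper's models.

```python
import numpy as np

def relative_sensitivity(f, x, eps=1e-6):
    """First-order relative sensitivity S_i = (dy/dx_i) * (x_i / y) of a
    scalar model output y = f(x), estimated with central differences.
    A generic sketch of this class of analysis."""
    y = f(x)
    S = np.empty_like(x)
    for i in range(len(x)):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        S[i] = (f(xp) - f(xm)) / (2.0 * eps) * (x[i] / y)
    return S

w = np.array([0.8, 0.1, -0.4])       # fixed toy weights
model = lambda v: np.tanh(w @ v)     # invented one-neuron 'network'
S = relative_sensitivity(model, np.array([1.0, 1.0, 1.0]))
```

    Comparing such sensitivity profiles against hydrological expectations (e.g. whether recent rainfall inputs dominate the forecast) is the kind of mechanistic-legitimacy check the DDMM framework formalizes.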

  13. Devising a New Model-Driven Framework for Developing GUI for Enterprise Applications

    NASA Astrophysics Data System (ADS)

    Akiki, Pierre

    The main goal of this chapter is to demonstrate the design and development of a GUI framework that is model driven and is not directly linked to one presentation technology or any specific presentation subsystem of a certain programming language. This framework will allow us to create graphical user interfaces that are not only dynamically customizable but also multilingual. To demonstrate this concept, we design in this chapter a new framework called Customizable Enterprise Data Administrator (CEDAR). Additionally, we build a prototype of this framework and a technology-dependent engine that transforms the output of our framework into a known presentation technology.

  14. Subsurface and Surface Characterization using an Information Framework Model

    NASA Astrophysics Data System (ADS)

    Samuel-Ojo, Olusola

    Groundwater plays a critical dual role as a reservoir of fresh water for human consumption and as a cause of the most severe problems when dealing with construction works below the water table. This is why it is critical to monitor groundwater recharge, distribution, and discharge on a continuous basis. The conventional method of monitoring groundwater employs a network of sparsely distributed monitoring wells and it is laborious, expensive, and intrusive. The problem of sparse data and undersampling reduces the accuracy of sampled survey data giving rise to poor interpretation. This dissertation addresses this problem by investigating groundwater-deformation response in order to augment the conventional method. A blend of three research methods was employed, namely design science research, geological methods, and geophysical methods, to examine whether persistent scatterer interferometry, a remote sensing technique, might augment conventional groundwater monitoring. Observation data (including phase information for displacement deformation from permanent scatterer interferometric synthetic aperture radar and depth to groundwater data) was obtained from the Water District, Santa Clara Valley, California. An information framework model was built and applied, and then evaluated. Data was preprocessed and decomposed into five components or parts: trend, seasonality, low frequency, high frequency and octave bandwidth. Digital elevation models of observed and predicted hydraulic head were produced, illustrating the piezometric or potentiometric surface. The potentiometric surface characterizes the regional aquifer of the valley showing areal variation of rate of percolation, velocity and permeability, and completely defines flow direction, advising characteristics and design levels. The findings show a geologic forcing phenomenon which explains in part the long-term deformation behavior of the valley, characterized by poroelastic, viscoelastic, elastoplastic and
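    The decomposition of the deformation and groundwater series into components (trend, seasonality, frequency bands) can be illustrated with a naive additive decomposition; this is an invented stand-in for the dissertation's actual five-component method, and the displacement series below is synthetic.

```python
import numpy as np

def decompose(series, period):
    """Naive additive decomposition of a time series into trend
    (centered moving average), seasonality (per-phase means of the
    detrended series), and residual. An illustrative stand-in for the
    five-component decomposition described above."""
    n = len(series)
    trend = np.convolve(series, np.ones(period) / period, mode="same")
    detrended = series - trend
    phases = np.arange(n) % period
    cycle = np.array([detrended[phases == p].mean() for p in range(period)])
    seasonal = cycle[phases]           # repeat the mean cycle over the record
    residual = detrended - seasonal
    return trend, seasonal, residual

# toy series: steady uplift plus an annual (period-12) deformation signal
t = np.arange(120)
series = 0.05 * t + np.sin(2 * np.pi * t / 12)
trend, seasonal, residual = decompose(series, period=12)
```

    By construction the three components sum back to the input; in the study's setting, the trend component is the part compared against long-term poroelastic deformation while the seasonal component tracks annual recharge cycles.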

  15. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard

  16. Linking Tectonics and Surface Processes through SNAC-CHILD Coupling: Preliminary Results Towards Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Choi, E.; Kelbert, A.; Peckham, S. D.

    2014-12-01

    We demonstrate that code coupling can be an efficient and flexible method for modeling complicated two-way interactions between tectonic and surface processes with SNAC-CHILD coupling as an example. SNAC is a deep earth process model (a geodynamic/tectonics model), built upon a scientific software framework called StGermain and also compatible with a model coupling framework called Pyre. CHILD is a popular surface process model (a landscape evolution model), interfaced to the CSDMS (Community Surface Dynamics Modeling System) modeling framework. We first present proof-of-concept but non-trivial results from a simplistic coupling scheme. We then report progress towards augmenting SNAC with a Basic Model Interface (BMI), a framework-agnostic standard interface developed by CSDMS that uses the CSDMS Standard Names as controlled vocabulary for model communication across domains. Newly interfaced to BMI, SNAC will be easily coupled with CHILD as well as other BMI-compatible models. In broader context, this work will test BMI as a general and easy-to-implement mechanism for sharing models between modeling frameworks and is a part of the NSF-funded EarthCube Building Blocks project, "Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks."

  17. A Framework of Operating Models for Interdisciplinary Research Programs in Clinical Service Organizations

    ERIC Educational Resources Information Center

    King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette

    2008-01-01

    A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…

  18. The Foundations of Learning Framework: A Model for School Readiness

    ERIC Educational Resources Information Center

    Sorrels, Barbara

    2012-01-01

    Since the National Education Goals Panel was convened in 1991, school readiness for all children has remained a high priority across our nation. The Foundations of Learning Framework is a tool to understand what it means for a child to be "ready." Preparation for educational success requires two key ingredients--relationships and play. In the…

  19. GULF OF MEXICO HYPOXIA MONITORING AND MODELING FRAMEWORK

    EPA Science Inventory

    The USEPA ORD in partnership with the Gulf of Mexico Program Office, the Office of Water and Regions 4 and 6 have developed and implemented plans for a framework that will help guide the science needed to address the hypoxia problem in the Gulf of Mexico. ORD's Gulf Ecology Divis...

  20. A unified framework for modeling landscape evolution by discrete flows

    NASA Astrophysics Data System (ADS)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts those in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.
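    The role of the friction-like coefficient can be illustrated with a hypothetical Heim-ratio-style runout rule on a one-dimensional profile: a flow keeps moving only while its total drop divided by travel length exceeds the resistance coefficient. This is an invented sketch of the kind of runout formulation the framework postulates, not the paper's actual equations.

```python
import numpy as np

def runout_length(z, dx, mu):
    """Downslope distance a discrete flow travels along a 1-D elevation
    profile before halting, using a Heim-ratio-style stopping rule:
    the flow reaches a node only while (total drop / travel length) >= mu,
    where mu is a friction-like resistance coefficient. Hypothetical sketch."""
    drops = z[0] - z[1:]                  # cumulative drop to each node
    lengths = dx * np.arange(1, len(z))   # cumulative travel distance
    reach = drops / lengths >= mu
    if reach.all():
        return lengths[-1]                # flow exits the profile
    stop = np.argmin(reach)               # first node the flow cannot reach
    return lengths[stop - 1] if stop > 0 else 0.0

z = 100.0 * np.exp(-np.linspace(0, 5, 51))      # concave-up toy profile
L_mobile = runout_length(z, dx=10.0, mu=0.1)     # low resistance: runs far
L_resistant = runout_length(z, dx=10.0, mu=0.5)  # high resistance: stops early
```

    Under such a rule, low-mu flows traverse the whole profile (promoting channelization) while high-mu flows stop near the divide, which is the qualitative contrast between channel networks and undissected hillslopes that the framework explores.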

  1. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    ERIC Educational Resources Information Center

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  2. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    PubMed

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of any particular web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the validated requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department. PMID:23565356

  3. System modeling with the DISC framework: evidence from safety-critical domains.

    PubMed

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice. PMID:22317179

  4. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  5. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its time-stepping scheme. If the caller is a modeling framework, it can use the self-description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
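    The split between control functions and self-description functions can be sketched in a few lines; this is a simplified illustration in the spirit of CSDMS's Basic Model Interface, not the full BMI specification, and the toy heat "model" and its parameter values are invented.

```python
class HeatModelBMI:
    """Minimal sketch of a standardized model interface in the spirit of
    the CSDMS Basic Model Interface: control functions (initialize,
    update, finalize) plus self-description functions. Illustrative only."""

    def initialize(self, config=None):
        self.time, self.dt = 0.0, 1.0
        self.temperature = 273.0

    def update(self):
        """Advance the model's state variables by one time step."""
        self.temperature += 0.1 * self.dt   # toy physics
        self.time += self.dt

    def finalize(self):
        """Release resources (nothing to do for this toy model)."""

    # --- self-description: lets a framework query the model ---
    def get_output_var_names(self):
        return ("land_surface__temperature",)

    def get_value(self, name):
        assert name == "land_surface__temperature"
        return self.temperature

    def get_current_time(self):
        return self.time

# a caller (e.g. a coupling framework) drives the model generically
m = HeatModelBMI()
m.initialize()
while m.get_current_time() < 10.0:
    m.update()
m.finalize()
```

    Because the caller touches the model only through this interface, the same driver loop works for any component that exposes it, which is what makes plug-and-play coupling possible.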

  6. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    SciTech Connect

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  7. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    SciTech Connect

    David Muth

    2012-07-01

    Agricultural residues are the largest near-term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention, and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However, the current integrated model does not quantitatively assess the soil carbon and long-term crop yield impacts of residue removal. Furthermore, the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily nitrogen-gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with

  8. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    ERIC Educational Resources Information Center

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  9. Model Curriculum Standards, Program Framework, and Process Guide for Industrial and Technology Education in California.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Div. of Career-Vocational Education.

    This three-section document contains the model curriculum standards, program framework, and process guide that will assist schools in California in providing career-vocational education programs that are responsive to a world marketplace characterized by constantly changing technology. The standards and frameworks can be implemented to provide a…

  10. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  11. Linear models of coregionalization for multivariate lattice data: a general framework for coregionalized multivariate CAR models.

    PubMed

    MacNab, Ying C

    2016-09-20

    We present a general coregionalization framework for developing coregionalized multivariate Gaussian conditional autoregressive (cMCAR) models for Bayesian analysis of multivariate lattice data in general and multivariate disease mapping data in particular. This framework is inclusive of cMCARs that facilitate flexible modelling of spatially structured symmetric or asymmetric cross-variable local interactions, allowing a wide range of separable or non-separable covariance structures, and symmetric or asymmetric cross-covariances, to be modelled. We present a brief overview of established univariate Gaussian conditional autoregressive (CAR) models for univariate lattice data and develop coregionalized multivariate extensions. Classes of cMCARs are presented by formulating precision structures. The resulting conditional properties of the multivariate spatial models are established, which cast new light on cMCARs with richly structured covariances and cross-covariances of different spatial ranges. The related methods are illustrated via an in-depth Bayesian analysis of a Minnesota county-level cancer data set. We also bring a new dimension to the traditional enterprise of Bayesian disease mapping: estimating and mapping covariances and cross-covariances of the underlying disease risks. Maps of covariances and cross-covariances bring to light spatial characterizations of the cMCARs and inform on spatial risk associations between areas and diseases. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27091685
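    The univariate building block whose precision structure the cMCAR classes generalize can be sketched directly. The form Q = tau * (D - rho * W) below is the standard proper-CAR precision, not the paper's coregionalized construction, and the four-area adjacency graph is a toy example.

```python
import numpy as np

def car_precision(W, rho, tau=1.0):
    """Precision matrix of a proper univariate Gaussian CAR model,
    Q = tau * (D - rho * W), with W a symmetric 0/1 adjacency matrix
    and D the diagonal matrix of neighbor counts. A sketch of the
    univariate building block that coregionalized extensions combine."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

# four areas on a line (1-2-3-4 adjacency); |rho| < 1 keeps Q positive definite
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Q = car_precision(W, rho=0.9)
cov = np.linalg.inv(Q)   # implied spatial covariance between areas
```

    Inverting the precision exposes the covariances between areas, which is exactly the quantity the paper proposes to estimate and map in the multivariate (cross-disease) setting.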

  12. Assessing Students' Understandings of Biological Models and their Use in Science to Evaluate a Theoretical Framework

    NASA Astrophysics Data System (ADS)

    Grünkorn, Juliane; Belzen, Annette Upmeier zu; Krüger, Dirk

    2014-07-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation). Therefore, the purpose of this article is to present the results of an empirical evaluation of a conjoint theoretical framework. The theoretical framework integrates relevant research findings and comprises five aspects which are subdivided into three levels each: nature of models, multiple models, purpose of models, testing, and changing models. The study was conducted with a sample of 1,177 seventh to tenth graders (aged 11-19 years) using open-ended items. The data were analysed by identifying students' understandings of models (nature of models and multiple models) and their use in science (purpose of models, testing, and changing models), and comparing as well as assigning them to the content of the theoretical framework. A comprehensive category system of students' understandings was thus developed. Regarding the empirical evaluation, the students' understandings of the nature and the purpose of models were sufficiently described by the theoretical framework. Concerning the understandings of multiple, testing, and changing models, additional initial understandings (only one model possible, no testing of models, and no change of models) need to be considered. This conjoint and now empirically tested framework for students' understandings can provide a common basis for future science education research. Furthermore, evidence-based indications can be provided for teachers and their instructional practice.

  13. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a
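    The core idea of a framework-agnostic Basic Model Interface (BMI) described above is a small set of control and query functions that every component exposes, so a coupler never touches model internals. A minimal sketch in the spirit of the CSDMS BMI (the toy decay model and the `run_coupled` driver are illustrative, not part of the official specification):

```python
from abc import ABC, abstractmethod

class BasicModelInterface(ABC):
    """Framework-agnostic control/query interface (in the spirit of CSDMS BMI)."""

    @abstractmethod
    def initialize(self, config: dict) -> None: ...
    @abstractmethod
    def update(self) -> None: ...          # advance the model one time step
    @abstractmethod
    def finalize(self) -> None: ...
    @abstractmethod
    def get_value(self, name: str) -> float: ...
    @abstractmethod
    def set_value(self, name: str, value: float) -> None: ...

class DecayModel(BasicModelInterface):
    """Toy component (exponential decay), standing in for a real geoscience model."""
    def initialize(self, config):
        self.state = config.get("initial", 1.0)
        self.rate = config.get("rate", 0.5)
    def update(self):
        self.state *= (1.0 - self.rate)
    def finalize(self):
        pass
    def get_value(self, name):
        return self.state
    def set_value(self, name, value):
        self.state = value

def run_coupled(model: BasicModelInterface, steps: int) -> float:
    """A framework or adapter needs only the interface, never the internals."""
    for _ in range(steps):
        model.update()
    return model.get_value("state")

model = DecayModel()
model.initialize({"initial": 8.0, "rate": 0.5})
print(run_coupled(model, 3))  # 8 * 0.5^3 = 1.0
model.finalize()
```

    Because every component answers the same five calls, a universal adapter from BMI to a given framework can be written once and reused for any conforming model.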

  14. Temporo-spatial model construction using the MML and software framework.

    PubMed

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

    Development of complex temporo-spatial biological computational models can be a time-consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries that may require considerable investment in time to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework is a modular XML/HDF5-based specification and set of toolkits that aims to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au. PMID:21947514

  15. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

    A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation, or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios of the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  16. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  17. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    SciTech Connect

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). 
For the site-scale SZ flow model, the HFM

  18. A Framework for Multifaceted Evaluation of Student Models

    ERIC Educational Resources Information Center

    Huang, Yun; González-Brenes, José P.; Kumar, Rohit; Brusilovsky, Peter

    2015-01-01

    Latent variable models, such as the popular Knowledge Tracing method, are often used to enable adaptive tutoring systems to personalize education. However, finding optimal model parameters is usually a difficult non-convex optimization problem when considering latent variable models. Prior work has reported that latent variable models obtained…

  19. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model changes from one to another, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. It takes too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of a model is changed. PMID:25302314

  20. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
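    The separation ROSE describes, a library of execution processes applied to any model through a fixed API, can be sketched as follows. The class and function names here are illustrative stand-ins, not the actual ROSE API:

```python
from itertools import product

class Model:
    """Any model exposing set-input/execute/get-output can be driven by a process."""
    def __init__(self, fn, inputs):
        self.fn = fn
        self.inputs = dict(inputs)
    def set_input(self, name, value):
        self.inputs[name] = value
    def execute(self):
        self.output = self.fn(**self.inputs)
    def get_output(self):
        return self.output

def parameter_study(model, sweeps):
    """Reusable execution process: full-factorial sweep over named inputs."""
    results = []
    names = list(sweeps)
    for combo in product(*(sweeps[n] for n in names)):
        for name, value in zip(names, combo):
            model.set_input(name, value)
        model.execute()
        results.append((dict(zip(names, combo)), model.get_output()))
    return results

# The same process runs unchanged against any conforming model:
thrust = Model(lambda mdot, ve: mdot * ve, {"mdot": 1.0, "ve": 300.0})
table = parameter_study(thrust, {"mdot": [1.0, 2.0], "ve": [300.0, 400.0]})
for inputs, out in table:
    print(inputs, out)
```

    A Design of Experiments or optimizer process would slot in exactly like `parameter_study`: it drives the model only through the small API, which is what lets one process library serve every model.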

  1. Deep inelastic phenomena

    SciTech Connect

    Prescott, C.Y.

    1980-10-01

    Nucleon structure as seen in the context of deep inelastic scattering is discussed. The lectures begin with consideration of the quark-parton model. The model forms the basis of understanding lepton-nucleon inelastic scattering. As improved data in lepton-nucleon scattering at high energies became available, the quark-parton model failed to explain some crucial features of these data. At approximately the same time a candidate theory of strong interactions based on an SU(3) gauge theory of color was being discussed in the literature, and new ideas on the explanation of inelastic scattering data became popular. A new theory of strong interactions, now called quantum chromodynamics, provides a new framework for understanding the data, with a much stronger theoretical foundation, and seems to explain well the features of the data. The lectures conclude with a look at some recent experiments which provide new data at very high energies. These lectures are concerned primarily with charged lepton inelastic scattering and to a lesser extent with neutrino results. Furthermore, due to time and space limitations, topics such as final state hadron studies and multi-muon production are omitted here. The lectures concentrate on the more central issues: the quark-parton model and concepts of scaling, scale breaking and the ideas of quantum chromodynamics, the Q² dependence of structure functions, moments, and the important parameter R.
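    For orientation, the scaling concepts and the parameter R mentioned here have compact, standard quark-parton-model statements:

```latex
% Bjorken scaling: in the naive quark-parton model F_2 depends on x alone;
% QCD adds only logarithmic Q^2 dependence (scale breaking).
\[
  F_2(x) = x \sum_q e_q^2 \left[ q(x) + \bar{q}(x) \right],
  \qquad
  2\,x\,F_1(x) = F_2(x) \quad \text{(Callan--Gross)}
\]
% The parameter R is the longitudinal-to-transverse cross-section ratio;
% it vanishes at leading order for spin-1/2 partons.
\[
  R = \frac{\sigma_L}{\sigma_T}
    = \frac{F_2 \left( 1 + 4 M^2 x^2 / Q^2 \right) - 2\,x\,F_1}{2\,x\,F_1}
\]
```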

  2. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  3. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  4. Landscape - Soilscape Modelling: Proposed framework for a model comparison benchmarking exercise, who wants to join?

    NASA Astrophysics Data System (ADS)

    Schoorl, Jeroen M.; Jetten, Victor G.; Coulthard, Thomas J.; Hancock, Greg R.; Renschler, Chris S.; Irvine, Brian J.; Cerdan, Olivier; Kirkby, Mike J.; (A) Veldkamp, Tom

    2014-05-01

    Current landscape - soilscape modelling frameworks are developed under a wide range of spatial and temporal resolutions and extents, from the so-called event-based models and soil erosion models to the landscape evolution models. In addition, these models are based on different assumptions, include variable and different process descriptions, and produce different outcomes. Consequently, the models often need specific input data, and their development and calibration is best linked to a specific area and local conditions. Model validation is often limited and restricted to the shorter time scales and single events. A first workshop on catchment-based modelling (six event-based models were challenged then) was organised in the late 1990s, and the results led to some excellent discussions on predictive modelling, equifinality and a special issue in Catena. It is time for a similar exercise: new models have been made, older models have been updated, and judging from the literature there is a lot more experience in calibration/validation and reflection on processes observed in the field and how these should be simulated. In addition there are new data sources, such as high-resolution remote sensing (including DEMs), new pattern analysis and comparison techniques, and continuous developments and results in dating sediment archives and erosion rates. The main goal of this renewed exercise will be to come up with a benchmarking methodology for comparing and judging model behaviour, including the issues of upscaling and downscaling of results. Model comparison may raise new research questions and lead to a firmer understanding of how different models perform under different circumstances.

  5. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  6. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and a semantics-supported matching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  7. A flexible and efficient multi-model framework in support of water management

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Tran Quoc, Quan; Willems, Patrick

    2016-05-01

    Flexible, fast and accurate water quantity models are essential tools in support of water management. Adjustable levels of model detail and the ability to handle varying spatial and temporal resolutions are requisite model characteristics to ensure that such models can be employed efficiently in various applications. This paper uses a newly developed flexible modelling framework that aims to generate such models. The framework incorporates several approaches to model catchment hydrology, rivers and floodplains, and the urban drainage system by lumping processes on different levels. To illustrate this framework, a case study of integrated hydrological-hydraulic modelling is elaborated for the Grote Nete catchment in Belgium. Three conceptual rainfall-runoff models (NAM, PDM and VHM) were implemented in a generalized model structure, allowing flexibility in the spatial resolution by means of an innovative disaggregation/aggregation procedure. They were linked to conceptual hydraulic models of the rivers in the catchment, which were developed by means of an advanced model structure identification and calibration procedure. The conceptual models manage to emulate the simulation results of a detailed full hydrodynamic model accurately. The models configured using the approaches of this framework are well-suited for many applications in water management due to their very short calculation time, interfacing possibilities and adjustable level of detail.

  8. CONCEPTUAL FRAMEWORK FOR REGRESSION MODELING OF GROUND-WATER FLOW.

    USGS Publications Warehouse

    Cooley, Richard L.

    1985-01-01

    The author examines the uses of ground-water flow models and which classes of use require treatment of stochastic components. He then compares traditional and stochastic procedures for modeling actual (as distinguished from hypothetical) systems. Finally, he examines the conceptual basis and characteristics of the regression approach to modeling ground-water flow.

  9. An Interactive Reference Framework for Modeling a Dynamic Immune System

    PubMed Central

    Spitzer, Matthew H.; Gherardini, Pier Federico; Fragiadakis, Gabriela K.; Bhattacharya, Nupur; Yuan, Robert T.; Hotson, Andrew N.; Finck, Rachel; Carmi, Yaron; Zunder, Eli R.; Fantl, Wendy J.; Bendall, Sean C.; Engleman, Edgar G.; Nolan, Garry P.

    2015-01-01

    Immune cells function in an interacting hierarchy that coordinates activities of various cell types according to genetic and environmental contexts. We developed graphical approaches to construct an extensible immune reference map from mass cytometry data of cells from different organs, incorporating landmark cell populations as flags on the map to compare cells from distinct samples. The maps recapitulated canonical cellular phenotypes and revealed reproducible, tissue-specific deviations. The approach revealed influences of genetic variation and circadian rhythms on immune system structure, enabled direct comparisons of murine and human blood cell phenotypes, and even enabled archival fluorescence-based flow cytometry data to be mapped onto the reference framework. This foundational reference map provides a working definition of systemic immune organization to which new data can be integrated to reveal deviations driven by genetics, environment, or pathology. PMID:26160952

  10. BioASF: a framework for automatically generating executable pathway models specified in BioPAX

    PubMed Central

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K. Anton; Abeln, Sanne; Heringa, Jaap

    2016-01-01

    Motivation: Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. Results: To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. Availability and Implementation: The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl PMID:27307645

  11. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models: parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (e.g., high explosive, rocket propellant, ...) is then presented using a directed graph model.
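    Of the three approaches, the response-curve option is the easiest to sketch: fit a cheap curve to a handful of runs of the expensive deterministic code, then sample that surrogate inside the Monte Carlo loop of the probabilistic model. The physics function, the fitted form, and the threshold below are toy stand-ins chosen only to illustrate the pattern:

```python
import random

def deterministic_model(drop_height):
    """Stand-in for an expensive physics code: shock energy vs. drop height."""
    return 0.8 * drop_height ** 1.5

def fit_response_curve(samples):
    """Least-squares fit of y = a * x^1.5 through precomputed model runs."""
    num = sum(y * x ** 1.5 for x, y in samples)
    den = sum(x ** 3 for x, _ in samples)
    a = num / den
    return lambda x: a * x ** 1.5

# Precompute a few expensive runs, then use the cheap surrogate in the PRA loop.
runs = [(h, deterministic_model(h)) for h in (0.5, 1.0, 2.0, 4.0)]
curve = fit_response_curve(runs)

rng = random.Random(0)
THRESHOLD = 4.0   # assumed energy threshold for initiation (illustrative)
trials = 10_000
hits = sum(curve(rng.uniform(0.0, 5.0)) > THRESHOLD for _ in range(trials))
print(f"P(initiation) ~ {hits / trials:.3f}")
```

    Parameter range binning would discretize `drop_height` into a few bins with one deterministic run each, while the integral approach would call `deterministic_model` directly inside the loop; the surrogate trades a small interpolation error for a large reduction in run count.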

  12. Multiple-species analysis of point count data: A more parsimonious modelling framework

    USGS Publications Warehouse

    Alldredge, M.W.; Pollock, K.H.; Simons, T.R.; Shriner, S.A.

    2007-01-01

    1. Although population surveys often provide information on multiple species, these data are rarely analysed within a multiple-species framework despite the potential for more efficient estimation of population parameters. 2. We have developed a multiple-species modelling framework that uses similarities in capture/detection processes among species to model multiple species data more parsimoniously. We present examples of this approach applied to distance, time of detection and multiple observer sampling for avian point count data. 3. Models that included species as a covariate and individual species effects were generally selected as the best models for distance sampling, but group models without species effects performed best for the time of detection and multiple observer methods. Population estimates were more precise for no-species-effect models than for species-effect models, demonstrating the benefits of exploiting species' similarities when modelling multiple species data. Partial species-effect models and additive models were also useful because they modelled similarities among species while allowing for species differences. 4. Synthesis and applications. We recommend the adoption of multiple-species modelling because of its potential for improved population estimates. This framework will be particularly beneficial for modelling count data from rare species because information on the detection process can be 'borrowed' from more common species. The multiple-species modelling framework presented here is applicable to a wide range of sampling techniques and taxa. © 2007 The Authors.

  13. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

    Recently a smallholder socio-hydrological modelling framework was proposed and deployed to understand the underlying dynamics of the Agrarian Crisis in the Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further expands the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as paddy, jowar and soyabean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork, 50 smallholders will be interviewed, and the socio-hydrological assumptions on the hydrology and capital equations and their corresponding closure relationships, as incorporated in the current model, will be put to the test. Besides testing the assumptions, the questionnaires will be used to better understand the hydrological reality of the farm holders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step in guiding further research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.

  14. Comparing droplet activation parameterisations against adiabatic parcel models using a novel inverse modelling framework

    NASA Astrophysics Data System (ADS)

    Partridge, Daniel; Morales, Ricardo; Stier, Philip

    2015-04-01

    Many previous studies have compared droplet activation parameterisations against adiabatic parcel models (e.g. Ghan et al., 2001). However, these have often involved comparisons for a limited number of parameter combinations based upon certain aerosol regimes. Recent studies (Morales et al., 2014) have used wider ranges when evaluating their parameterisations; however, no study has explored the full multi-dimensional parameter space that would be experienced by droplet activation within a global climate model (GCM). It is important to be able to efficiently highlight regions of the entire multi-dimensional parameter space in which we can expect the largest discrepancy between parameterisation and cloud parcel models, in order to ascertain which regions simulated by a GCM can be expected to be a less accurate representation of the process of cloud droplet activation. This study provides a new, efficient, inverse modelling framework for comparing droplet activation parameterisations to more complex cloud parcel models. To achieve this we couple a Markov Chain Monte Carlo algorithm (Partridge et al., 2012) to two independent adiabatic cloud parcel models and four droplet activation parameterisations. This framework is computationally faster than employing a brute-force Monte Carlo simulation, and allows us to transparently highlight which parameterisation provides the closest representation across all aerosol physicochemical and meteorological environments. The parameterisations are demonstrated to perform well for a large proportion of possible parameter combinations; however, for certain key parameters, most notably the vertical velocity and accumulation-mode aerosol concentration, large discrepancies are highlighted. These discrepancies correspond to parameter combinations that result in very high/low simulated values of maximum supersaturation.
By identifying parameter interactions or regimes within the multi-dimensional parameter space we hope to guide
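    The core loop of such an inverse framework is a Metropolis-style sampler steered by the discrepancy between the two models rather than by a data likelihood. A one-parameter sketch with toy stand-ins for both models (neither function below is an actual activation scheme or parcel model):

```python
import math
import random

def parcel_model(w):
    """Toy 'truth': activated fraction vs. updraft velocity w (m/s)."""
    return 1.0 - math.exp(-1.2 * w)

def parameterisation(w):
    """Toy scheme that degrades at high updrafts (saturates at 1)."""
    return min(1.0, 1.1 * w)

def discrepancy(w):
    return abs(parcel_model(w) - parameterisation(w))

def mcmc_explore(n_steps=5000, w_lo=0.01, w_hi=3.0, seed=0):
    """Metropolis walk that preferentially samples high-discrepancy regions."""
    rng = random.Random(seed)
    w = 0.5
    worst_w, worst_d = w, discrepancy(w)
    for _ in range(n_steps):
        w_new = min(w_hi, max(w_lo, w + rng.gauss(0.0, 0.2)))
        # Accept with probability min(1, d_new / d_old), Metropolis-style.
        if discrepancy(w_new) >= rng.random() * discrepancy(w):
            w = w_new
        if discrepancy(w) > worst_d:
            worst_w, worst_d = w, discrepancy(w)
    return worst_w, worst_d

w_star, d_star = mcmc_explore()
print(f"largest discrepancy {d_star:.3f} near w = {w_star:.2f} m/s")
```

    Because the chain concentrates its samples where the two models disagree most, far fewer model evaluations are needed than with a brute-force sweep; the real framework does the same over the full multi-dimensional aerosol and meteorological parameter space.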

  15. Integrated Bayesian network framework for modeling complex ecological issues.

    PubMed

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, and that embraces iterative development

  16. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    NASA Astrophysics Data System (ADS)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
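
    A sketch of the kind of Lucene query MERGANSER's front end might generate from a place and a species. The field names (species_name, coords) are hypothetical, since the abstract does not publish the actual schema, but the {!geofilt} syntax is standard Solr spatial search:

```python
def build_niche_query(species, lat, lon, radius_km):
    """Build a Lucene query for occurrence records near a point, in the
    spirit of MERGANSER's auto-generated queries. Field names
    (species_name, coords) are hypothetical; {!geofilt} is standard
    Solr spatial-search syntax."""
    return (
        f'species_name:"{species}" AND '
        f'{{!geofilt sfield=coords pt={lat},{lon} d={radius_km}}}'
    )

query = build_niche_query("Anas platyrhynchos", 42.36, -71.06, 50)
```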

  17. Towards a Framework for Modeling Space Systems Architectures

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Skipper, Joseph

    2006-01-01

    Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.

  18. Integration of the Radiation Belt Environment Model Into the Space Weather Modeling Framework

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Toth, G.; Fok, M.; Gombosi, T.; Liemohn, M.

    2009-01-01

    We have integrated the Fok radiation belt environment (RBE) model into the space weather modeling framework (SWMF). RBE is coupled to the global magnetohydrodynamics component (represented by the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme, BATS-R-US, code) and the Ionosphere Electrodynamics component of the SWMF, following initial results using the Weimer empirical model for the ionospheric potential. The radiation belt (RB) model solves the convection-diffusion equation of the plasma in the energy range of 10 keV to a few MeV. In stand-alone mode RBE uses Tsyganenko's empirical models for the magnetic field, and Weimer's empirical model for the ionospheric potential. In the SWMF the BATS-R-US model provides the time dependent magnetic field by efficiently tracing the closed magnetic field-lines and passing the geometrical and field strength information to RBE at a regular cadence. The ionosphere electrodynamics component uses a two-dimensional vertical potential solver to provide new potential maps to the RBE model at regular intervals. We discuss the coupling algorithm and show some preliminary results with the coupled code. We run our newly coupled model for periods of steady solar wind conditions and compare our results to the RB model using an empirical magnetic field and potential model. We also simulate the RB for an active time period and find that there are substantial differences in the RB model results when changing either the magnetic field or the electric field, including the creation of an outer belt enhancement via rapid inward transport on the time scale of tens of minutes.

  19. A general framework for modeling growth and division of mammalian cells

    PubMed Central

    2011-01-01

    Background Modeling the cell-division cycle has been practiced for many years. As time has progressed, this work has gone from understanding the basic principles to addressing distinct biological problems, e.g., the nature of the restriction point, how checkpoints operate, the nonlinear dynamics of the cell cycle, the effect of localization, etc. Most models consist of coupled ordinary differential equations developed by the researchers, restricted to deal with the interactions of a limited number of molecules. In the future, cell-cycle modeling--and indeed all modeling of complex biologic processes--will increase in scope and detail. Results A framework for modeling complex cell-biologic processes is proposed here. The framework is based on two constructs: one describing the entire lifecycle of a molecule and the second describing the basic cellular machinery. Use of these constructs allows complex models to be built in a straightforward manner that fosters rigor and completeness. To demonstrate the framework, an example model of the mammalian cell cycle is presented that consists of several hundred differential equations of simple mass action kinetics. The model calculates energy usage, amino acid and nucleotide usage, membrane transport, RNA synthesis and destruction, and protein synthesis and destruction for 33 proteins to give an in-depth look at the cell cycle. Conclusions The framework presented here addresses how to develop increasingly descriptive models of complex cell-biologic processes. The example model of cellular growth and division constructed with the framework demonstrates that large structured models can be created with the framework, and these models can generate non-trivial descriptions of cellular processes. Predictions from the example model include those at both the molecular level--e.g., Wee1 spontaneously reactivates--and at the system level--e.g., pathways for timing-critical processes must shut down redundant pathways. 
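
    A single molecule lifecycle of the kind the framework composes can be sketched as one mass-action balance of synthesis and destruction; the rate constants are illustrative, not the paper's:

```python
def simulate_protein(k_syn=2.0, k_deg=0.1, p0=0.0, dt=0.01, steps=10000):
    """Forward-Euler integration of a minimal molecule lifecycle:
    dP/dt = k_syn - k_deg * P (zeroth-order synthesis, first-order
    destruction). The analytic steady state is k_syn / k_deg."""
    p = p0
    for _ in range(steps):
        p += dt * (k_syn - k_deg * p)
    return p

p_final = simulate_protein()   # approaches 2.0 / 0.1 = 20.0
```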

  20. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters. PMID:11152205
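
    A single-neuron sketch of the first model type, CSA with decaying self-coupling, illustrates the transient chaos the paper analyzes; parameter values are illustrative, not taken from the experiments:

```python
import math

def csa_neuron(steps=200, k=0.9, I0=0.65, z0=0.08, beta=0.01, eps=0.004):
    """Single-neuron sketch of Chen and Aihara's CSA with decaying
    self-coupling: the chaotic term z(t) shrinks geometrically, moving
    the dynamics from chaotic search toward convergent, Hopfield-like
    behaviour. Parameter values are illustrative, not the paper's."""
    y, z, xs = 0.5, z0, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))   # neuron output
        y = k * y - z * (x - I0)               # internal state update
        z *= (1.0 - beta)                      # anneal the chaotic term
        xs.append(x)
    return xs, z

outputs, z_final = csa_neuron()
```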

  1. A Model Framework for Course Materials Construction (Second Edition).

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    Designed for use by Coast Guard course writers, curriculum developers, course coordinators, and instructors as a decision-support system, this publication presents a model that translates the Intraservices Procedures for Instructional Systems Development curriculum design model into materials usable by classroom teachers and students. Although…

  2. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  3. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike the existing methods, the proposed solution employs a modular approach to modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and its easy extension and connection with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. PMID:26004999
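
    The first-in-first-served servicing and rand()-style uncertain durations translate directly out of the spreadsheet; a minimal Python analogue follows, where the duration range and single-server assumption are illustrative:

```python
import random
from collections import deque

def simulate_activity(arrivals, min_dur=5.0, max_dur=15.0, seed=42):
    """First-in-first-served sketch of a single activity: patients are
    served strictly in arrival order, each for an uncertain duration,
    mirroring the spreadsheet's rand() cells. Durations and the
    single-server assumption are illustrative."""
    rng = random.Random(seed)
    queue = deque(arrivals)                      # FIFS: earliest first
    clock, log = 0.0, []
    while queue:
        patient = queue.popleft()
        clock += rng.uniform(min_dur, max_dur)   # rand()-style duration
        log.append((patient, round(clock, 1)))   # (patient, finish time)
    return log

schedule = simulate_activity(["P1", "P2", "P3"])
```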

  4. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  5. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    SciTech Connect

    JIANG, YI

    2007-01-16

    Cancer remains one of the leading causes of death from disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulation, cellular-level dynamics and intercellular interactions, and extracellular-level chemical dynamics. The intracellular protein regulation and signaling pathways are described by Boolean networks. The cellular-level growth and division dynamics, cellular adhesion, and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data in both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
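
    The intracellular layer's Boolean-network update can be sketched in a few lines; the two-gene rules below are illustrative placeholders, not the model's actual signaling pathways:

```python
def step(state, rules):
    """Synchronous Boolean-network update: each node's next value is a
    Boolean function of the whole current state."""
    return {node: rule(state) for node, rule in rules.items()}

# Illustrative two-gene toggle, not the model's actual pathways.
rules = {
    "signal":    lambda s: s["signal"],          # input node, held fixed
    "growth":    lambda s: s["signal"] and not s["inhibitor"],
    "inhibitor": lambda s: s["growth"],          # delayed feedback
}
state = {"signal": True, "growth": False, "inhibitor": False}
state = step(state, rules)   # growth switches on
state = step(state, rules)   # inhibitor follows one step later
```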

  6. Developing an Interdisciplinary Curriculum Framework for Aquatic-Ecosystem Modeling

    ERIC Educational Resources Information Center

    Saito, Laurel; Segale, Heather M.; DeAngelis, Donald L.; Jenkins, Stephen H.

    2007-01-01

    This paper presents results from a July 2005 workshop and course aimed at developing an interdisciplinary course on modeling aquatic ecosystems that will provide the next generation of practitioners with critical skills for which formal training is presently lacking. Five different course models were evaluated: (1) fundamentals/general principles…

  7. An integrated hydrologic modeling framework for coupling SWAT with MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT), MODFLOW, and Energy Balance based Evapotranspiration (EB_ET) models are extensively used to estimate different components of the hydrological cycle. Surface and subsurface hydrological processes are modeled in SWAT but limited to the extent of shallow aquif...

  8. KINEROS2 and the AGWA Modeling Framework 2013

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Kinematic Runoff and Erosion Model, KINEROS2, is a distributed, physically-based, event model describing the processes of interception, dynamic infiltration, surface runoff, and erosion from watersheds characterized by predominantly overland flow. The watershed is conceptualized as a cascade of...

  9. The Relational-Cultural Model: A Framework for Group Process

    ERIC Educational Resources Information Center

    Comstock, Dana L.; Duffey, Thelma; St. George, Holly

    2002-01-01

    The relational-cultural model of psychotherapy has been evolving for the past 20 years. Within this model, difficult group dynamics are conceptualized as the playing out of the central relational paradox. This paradox recognizes that an individual may yearn for connection but, out of a sense of fear, simultaneously employ strategies that restrict…

  10. A framework for multi-criteria assessment of model enhancements

    NASA Astrophysics Data System (ADS)

    Francke, Till; Foerster, Saskia; Brosinsky, Arlena; Delgado, José; Güntner, Andreas; López-Tarazón, José A.; Bronstert, Axel

    2016-04-01

    Modellers are often faced with unsatisfactory model performance for a specific setup of a hydrological model. In these cases, the modeller may try to improve the setup by addressing selected causes of the model errors (i.e., data errors, structural errors). This leads to adding certain "model enhancements" (MEs), e.g., climate data based on more monitoring stations, improved calibration data, or modifications in process formulations. However, deciding on which MEs to implement remains a matter of expert knowledge, guided by some sensitivity analysis at best. When multiple MEs have been implemented, a resulting improvement in model performance is not easily attributed, especially when considering different aspects of this improvement (e.g., better performance dynamics vs. reduced bias). In this study we present an approach for comparing the effect of multiple MEs in the face of multiple improvement aspects. A stepwise selection approach and structured plots help in addressing the multidimensionality of the problem. The approach is applied to a case study, which employs the meso-scale hydrosedimentological model WASA-SED for a sub-humid catchment. The results suggest that the effects of the MEs are quite diverse: some MEs (e.g., augmented rainfall data) cause improvements in almost all aspects, while the effect of others is restricted to a few aspects or even deteriorates some. These specific results may not be generalizable. However, we suggest that studies like this one may help identify the most promising MEs to implement.
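
    The comparison of MEs across improvement aspects can be sketched as a per-aspect difference against the baseline plus one greedy step of stepwise selection; metric names and numbers are invented for illustration:

```python
def improvement(baseline, enhanced):
    """Per-aspect effect of adding a model enhancement (ME). All aspects
    here are error measures, so a positive difference means the ME
    improved that aspect."""
    return {a: baseline[a] - enhanced[a] for a in baseline}

def best_me(baseline, candidates):
    """One greedy step of stepwise selection: pick the ME with the
    largest improvement summed over all aspects."""
    return max(candidates,
               key=lambda n: sum(improvement(baseline, candidates[n]).values()))

# Invented numbers for illustration only.
baseline = {"rmse": 1.40, "abs_bias": 0.30}
candidates = {
    "augmented_rainfall": {"rmse": 1.10, "abs_bias": 0.10},  # helps everywhere
    "new_soil_map":       {"rmse": 1.35, "abs_bias": 0.45},  # mixed effect
}
chosen = best_me(baseline, candidates)
```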

  11. The Conceptual Framework of Factors Affecting Shared Mental Model

    ERIC Educational Resources Information Center

    Lee, Miyoung; Johnson, Tristan; Lee, Youngmin; O'Connor, Debra; Khalil, Mohammed

    2004-01-01

    Many researchers have paid attention to the potentiality and possibility of the shared mental model because it enables teammates to perform their job better by sharing team knowledge, skills, attitudes, dynamics and environments. Even though theoretical and experimental evidences provide a close relationship between the shared mental model and…

  12. a Spatio-Temporal Framework for Modeling Active Layer Thickness

    NASA Astrophysics Data System (ADS)

    Touyz, J.; Streletskiy, D. A.; Nelson, F. E.; Apanasovich, T. V.

    2015-07-01

    The Arctic is experiencing an unprecedented rate of environmental and climate change. The active layer (the uppermost layer of soil between the atmosphere and permafrost that freezes in winter and thaws in summer) is sensitive to both climatic and environmental changes, and plays an important role in the functioning, planning, and economic activities of Arctic human and natural ecosystems. This study develops a methodology for modeling and estimating spatial-temporal variations in active layer thickness (ALT) using data from several sites of the Circumpolar Active Layer Monitoring network, and demonstrates its use in spatial-temporal interpolation. The simplest model's stochastic component exhibits no spatial or spatio-temporal dependency and is referred to as the naïve model, against which we evaluate the performance of the other models, which assume that the stochastic component exhibits either spatial or spatio-temporal dependency. The methods used to fit the models are then discussed, along with point forecasting. We compare the predictive fit of the various models at key study sites located on the North Slope of Alaska and demonstrate the advantages of space-time models through a series of error statistics, including mean squared error, mean absolute error, and percent deviance from observed data. We find the difference in performance between the spatio-temporal and remaining models is significant for all three error statistics. The best stochastic spatio-temporal model increases predictive accuracy over the naïve model by 33.3%, 36.2%, and 32.5% on average across the three error metrics at the key sites for a one-year hold-out period.
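
    The three error statistics used in the comparison are straightforward to compute; the observations and predictions below are invented for illustration:

```python
def error_stats(observed, predicted):
    """Mean squared error, mean absolute error, and mean percent deviance
    from observations: the three statistics used to rank the models."""
    n = len(observed)
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    mae = sum(abs(o - p) for o, p in zip(observed, predicted)) / n
    mpd = 100.0 * sum(abs(o - p) / o for o, p in zip(observed, predicted)) / n
    return mse, mae, mpd

obs = [52.0, 61.0, 48.0]          # hypothetical ALT observations (cm)
naive_pred = [55.0, 55.0, 55.0]   # site mean: no spatio-temporal structure
st_pred = [53.0, 60.0, 49.0]      # illustrative spatio-temporal predictions
```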

  13. A general framework for application of prestrain to computational models of biological materials.

    PubMed

    Maas, Steve A; Erdemir, Ahmet; Halloran, Jason P; Weiss, Jeffrey A

    2016-08-01

    It is often important to include prestress in computational models of biological tissues. The prestress can represent residual stresses (stresses that exist after the tissue is excised from the body) or in situ stresses (stresses that exist in vivo, in the absence of loading). A prestressed reference configuration may also be needed when modeling the reference geometry of biological tissues in vivo. This research developed a general framework for representing prestress in finite element models of biological materials. It is assumed that the material is elastic, allowing the prestress to be represented via a prestrain. For prestrain fields that are not compatible with the reference geometry, the computational framework provides an iterative algorithm for updating the prestrain until equilibrium is satisfied. The iterative framework allows for enforcement of two different constraints: elimination of distortion in order to address the incompatibility issue, and enforcing a specified in situ fiber strain field while allowing for distortion. The framework was implemented as a plugin in FEBio (www.febio.org), making it easy to maintain the software and to extend the framework if needed. Several examples illustrate the application and effectiveness of the approach, including the application of in situ strains to ligaments in the Open Knee model (simtk.org/home/openknee). A novel method for recovering the stress-free configuration from the prestrain deformation gradient is also presented. This general purpose theoretical and computational framework for applying prestrain will allow analysts to overcome the challenges in modeling this important aspect of biological tissue mechanics. PMID:27131609
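
    The iterative prestrain update can be illustrated on a scalar stand-in for the FE solve: prescribe a prestretch, solve equilibrium, fold the result back into the prestretch, and repeat until the specified in situ stretch is matched. The spring system and constants below are illustrative, not FEBio's actual algorithm:

```python
def enforce_in_situ_stretch(target=1.2, E=1.0, k_support=0.5,
                            tol=1e-10, max_iter=200):
    """Fixed-point sketch of the iterative prestrain update. A fiber of
    stiffness E is prestretched by lam_pre and equilibrates against a
    linear support of stiffness k_support (a scalar stand-in for the FE
    solve). Each pass folds the residual into lam_pre until the total
    in situ stretch lam_pre * lam matches the specified target."""
    lam_pre = target                 # initial guess
    for _ in range(max_iter):
        # equilibrium: E*(lam_pre*lam - 1) + k_support*(lam - 1) = 0
        lam = (E + k_support) / (E * lam_pre + k_support)
        if abs(lam_pre * lam - target) < tol:
            break
        lam_pre = target / lam       # update the prescribed prestretch
    return lam_pre, lam_pre * lam

lam_pre, total_stretch = enforce_in_situ_stretch()
```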

  14. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    NASA Astrophysics Data System (ADS)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-05-01

    The mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  15. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGESBeta

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  16. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...

  17. The Community Earth System Model: A Framework for Collaborative Research

    SciTech Connect

    Hurrell, Jim; Holland, Marika M.; Gent, Peter R.; Ghan, Steven J.; Kay, Jennifer; Kushner, P.; Lamarque, J.-F.; Large, William G.; Lawrence, David M.; Lindsay, Keith; Lipscomb, William; Long , Matthew; Mahowald, N.; Marsh, D.; Neale, Richard; Rasch, Philip J.; Vavrus, Steven J.; Vertenstein, Mariana; Bader, David C.; Collins, William D.; Hack, James; Kiehl, J. T.; Marshall, Shawn

    2013-09-30

    The Community Earth System Model (CESM) is a flexible and extensible community tool used to investigate a diverse set of earth system interactions across multiple time and space scales. This global coupled model is a natural evolution from its predecessor, the Community Climate System Model, following the incorporation of new earth system capabilities. These include the ability to simulate biogeochemical cycles, atmospheric chemistry, ice sheets, and a high-top atmosphere. These and other new model capabilities are enabling investigations into a wide range of pressing scientific questions, providing new predictive capabilities and increasing our collective knowledge about the behavior and interactions of the earth system. Simulations with numerous configurations of the CESM have been provided to the Coupled Model Intercomparison Project Phase 5 (CMIP5) and are being analyzed by the broader community of scientists. Additionally, the model source code and associated documentation are freely available to the scientific community to use for earth system studies, making it a true community tool. Here we describe this earth modeling system, its various possible configurations, and illustrate its capabilities with a few science highlights.

  18. Brokering as a framework for hydrological model repeatability

    NASA Astrophysics Data System (ADS)

    Fuka, Daniel; Collick, Amy; MacAlister, Charlotte; Braeckel, Aaron; Wright, Dawn; Jodha Khalsa, Siri; Boldrini, Enrico; Easton, Zachary

    2015-04-01

    Data brokering aims to provide those in the sciences with quick and repeatable access to data representing physical, biological, and chemical characteristics, specifically to accelerate scientific discovery. Environmental models are useful tools for understanding the behavior of hydrological systems. Unfortunately, parameterization of these hydrological models requires many different data, from different sources and different disciplines (e.g., atmospheric science, geoscience, ecology). In basin-scale hydrological modeling, the traditional procedure for model initialization starts with obtaining elevation models, land-use characterizations, soils maps, and weather data. It is often the researcher's past experience with these datasets that determines which datasets will be used in a study, even though newer or more suitable data products may exist. An added complexity is that various science communities have differing data formats, storage protocols, and manipulation methods, which makes use by a non-native user exceedingly difficult and time consuming. We demonstrate data brokering as a means to address several of these challenges. We present two test-case scenarios in which researchers attempt to reproduce hydrological model results using 1) general internet-based data-gathering techniques, and 2) a scientific data brokering interface. We show that data brokering can increase the efficiency with which data are obtained, models are initialized, and results are analyzed. As an added benefit, it appears brokering can significantly increase the repeatability of a given study.

  19. Bayesian model selection framework for identifying growth patterns in filamentous fungi.

    PubMed

    Lin, Xiao; Terejanu, Gabriel; Shrestha, Sajan; Banerjee, Sourav; Chanda, Anindya

    2016-06-01

    This paper describes a rigorous methodology for quantification of model errors in fungal growth models. This is essential to choose the model that best describes the data and guide modeling efforts. Mathematical modeling of growth of filamentous fungi is necessary in fungal biology for gaining systems level understanding on hyphal and colony behaviors in different environments. A critical challenge in the development of these mathematical models arises from the indeterminate nature of their colony architecture, which is a result of processing diverse intracellular signals induced in response to a heterogeneous set of physical and nutritional factors. There exists a practical gap in connecting fungal growth models with measurement data. Here, we address this gap by introducing the first unified computational framework based on Bayesian inference that can quantify individual model errors and rank the statistical models based on their descriptive power against data. We show that this Bayesian model comparison is just a natural formalization of Occam's razor. The application of this framework is discussed in comparing three models in the context of synthetic data generated from a known true fungal growth model. This framework of model comparison achieves a trade-off between data fitness and model complexity and the quantified model error not only helps in calibrating and comparing the models, but also in making better predictions and guiding model refinements. PMID:27000772
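
    As a hedged illustration of evidence-based model ranking (using BIC as an asymptotic stand-in for the marginal likelihood; the paper's actual framework is more general), one can compare a constant-mean model with a linear model on synthetic data whose truth is linear:

```python
import math
import random

def bic(rss, n, k):
    """Schwarz criterion: an asymptotic approximation to the negative log
    model evidence, so a lower BIC means higher posterior support."""
    return n * math.log(rss / n) + k * math.log(n)

def fit_and_score(xs, ys):
    n = len(xs)
    # Model 1: constant mean (1 parameter).
    mean_y = sum(ys) / n
    rss1 = sum((y - mean_y) ** 2 for y in ys)
    # Model 2: straight line by ordinary least squares (2 parameters).
    mean_x = sum(xs) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    rss2 = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    return bic(rss1, n, 1), bic(rss2, n, 2)

rng = random.Random(0)
xs = [i / 10.0 for i in range(50)]
ys = [0.5 + 2.0 * x + rng.gauss(0.0, 0.3) for x in xs]  # truth is linear
bic_const, bic_line = fit_and_score(xs, ys)
```

Here the linear model attains the lower BIC, formalizing the Occam trade-off between data fitness and model complexity described in the abstract.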

  20. A Model Framework for Science and Other Course Materials Construction.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model is presented to provide guidance for Coast Guard writers, curriculum developers, course coordinators, and instructors who intend to update, or draft course materials. Detailed instructions are provided for developing instructor's guides and student's guides. (CS)

  1. A model integration framework for linking SWAT and MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hydrological response and transport phenomena are driven by atmospheric, surface and subsurface processes. These complex processes occur at different spatiotemporal scales requiring comprehensive modeling to assess the impact of anthropogenic activity on hydrology and fate and transport of chemical ...

  2. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  3. CONCEPTUAL MODEL DEVELOPMENT AND INFORMATION MANAGEMENT FRAMEWORK FOR DIAGNOSTICS RESEARCH

    EPA Science Inventory

    Conceptual model development will focus on the effects of habitat alteration, nutrients,suspended and bedded sediments, and toxic chemicals on appropriate endpoints (individuals, populations, communities, ecosystems) across spatial scales (habitats, water body, watershed, region)...

  4. Effective Thermal Conductivity Modeling of Sandstones: SVM Framework Analysis

    NASA Astrophysics Data System (ADS)

    Rostami, Alireza; Masoudi, Mohammad; Ghaderi-Ardakani, Alireza; Arabloo, Milad; Amani, Mahmood

    2016-06-01

    Among the most significant physical characteristics of porous media, the effective thermal conductivity (ETC) is used for estimating the efficiency of thermal enhanced oil recovery processes, hydrocarbon reservoir thermal design, and numerical simulation. This paper reports the implementation of an innovative least-squares support vector machine (LS-SVM) algorithm to develop an enhanced model capable of predicting the ETCs of dry sandstones. The validity of the presented model was evaluated by means of several statistical parameters. The predictions of the developed model for the ETCs of dry sandstones were in excellent agreement with the reported data, with a coefficient of determination (R²) of 0.983 and an average absolute relative deviation of 0.35%. Results from the present research show that the proposed LS-SVM model is robust, reliable, and efficient in calculating the ETCs of sandstones.
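
    The two headline statistics are straightforward to reproduce; the sketch below shows how R² and the average absolute relative deviation (AARD) are typically computed, using made-up measured/predicted ETC pairs rather than the paper's data:

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def aard_percent(y_true, y_pred):
    """Average absolute relative deviation, in percent."""
    return 100.0 * np.mean(np.abs((y_pred - y_true) / y_true))

# Hypothetical measured vs. predicted ETC values (W/m/K), for illustration only
measured = np.array([2.5, 3.1, 4.0, 5.2, 6.8])
predicted = np.array([2.45, 3.15, 3.95, 5.25, 6.75])
print(round(r_squared(measured, predicted), 4),
      round(aard_percent(measured, predicted), 2))  # 0.9989 1.31
```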

  5. Model-driven CDA Clinical Document Development Framework.

    PubMed

    Li, Jingdong; Lincoln, Michael J

    2007-01-01

    The Health Level 7 (HL7) Clinical Document Architecture, Release 2 (CDA R2) standardizes the structure and semantics of clinical documents in order to permit interchange. We have applied this standard to generate a platform-independent CDA model and wrote a toolset that permits model specialization, generates XML implementation artifacts, and provides an interface for clinical data managers. The resulting work was tested using US Department of Veterans Affairs Operative Note templates. PMID:18694129

  6. The Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP): project framework.

    PubMed

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-03-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  8. Multiscale Model of Colorectal Cancer Using the Cellular Potts Framework

    PubMed Central

    Osborne, James M

    2015-01-01

    Colorectal cancer (CRC) is one of the major causes of death in the developed world and forms a canonical example of tumorigenesis. CRC arises from a string of mutations of individual cells in the colorectal crypt, making it particularly suited for multiscale multicellular modeling, where mutations of individual cells can be clearly represented and their effects readily tracked. In this paper, we present a multicellular model of the onset of colorectal cancer, utilizing the cellular Potts model (CPM). We use the model to investigate how, through the modification of their mechanical properties, mutant cells colonize the crypt. Moreover, we study the influence of mutations on the shape of cells in the crypt, suggesting possible cell- and tissue-level indicators for identifying early-stage cancerous crypts. Crucially, we discuss the effect that the motility parameters of the model (key factors in the behavior of the CPM) have on the distribution of cells within a homeostatic crypt, resulting in an optimal parameter regime that accurately reflects biological assumptions. In summary, the key results of this paper are 1) how to couple the CPM with processes occurring on other spatial scales, using the example of the crypt to motivate suitable motility parameters; 2) modeling mutant cells with the CPM; and 3) investigating how mutations influence the shape of cells in the crypt. PMID:26461973
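
    A minimal sketch of the CPM machinery the paper builds on: cell indices on a lattice, a Hamiltonian with an adhesion term plus an area constraint, and Metropolis-style index-copy attempts whose acceptance temperature plays the role of a motility parameter. The grid size, energies, and temperature here are arbitrary illustrative choices, not the paper's crypt model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two cells (indices 1 and 2) on a small lattice; 0 is the surrounding medium
L = 20
grid = np.zeros((L, L), dtype=int)
grid[5:15, 2:10] = 1
grid[5:15, 10:18] = 2

TARGET_AREA = 80.0   # each cell starts at exactly 80 sites
LAMBDA_AREA = 1.0    # strength of the area (volume) constraint
J = 2.0              # adhesion penalty per unlike neighbor pair
TEMP = 5.0           # Metropolis temperature, a proxy for cell motility

NEIGHBORS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def energy(g):
    """CPM Hamiltonian: boundary adhesion plus quadratic area constraint."""
    e = J * (np.sum(g[1:, :] != g[:-1, :]) + np.sum(g[:, 1:] != g[:, :-1]))
    for cid in (1, 2):
        e += LAMBDA_AREA * (np.sum(g == cid) - TARGET_AREA) ** 2
    return e

for _ in range(20000):
    i, j = rng.integers(1, L - 1, size=2)
    di, dj = NEIGHBORS[rng.integers(4)]
    src = grid[i + di, j + dj]
    if src == grid[i, j]:
        continue
    old, e0 = grid[i, j], energy(grid)
    grid[i, j] = src                    # attempt to copy the neighbor's index
    d_e = energy(grid) - e0
    if d_e > 0 and rng.random() >= np.exp(-d_e / TEMP):
        grid[i, j] = old                # reject the unfavorable copy

areas = [int(np.sum(grid == c)) for c in (1, 2)]
print(areas)
```

    The area constraint keeps both cells near their target size while the boundary fluctuates; lowering TEMP freezes the interface, which is the kind of motility-parameter effect the paper investigates.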

  9. Model Adaptation for Prognostics in a Particle Filtering Framework

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the "curse of dimensionality", i.e., the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
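
    The core idea, augmenting the state vector with model parameters so the filter tunes the model while tracking, can be sketched in a few lines. The scalar decay system, noise levels, and parameter jitter below are illustrative assumptions, not the paper's battery model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Truth: x[k+1] = a * x[k] + process noise, with unknown model parameter a
A_TRUE, x = 0.95, 10.0
observations = []
for _ in range(100):
    x = A_TRUE * x + rng.normal(0.0, 0.05)
    observations.append(x + rng.normal(0.0, 0.2))  # noisy measurement

# Augmented particles: each carries (state estimate, parameter estimate)
N = 2000
px = rng.normal(10.0, 1.0, N)          # state particles
pa = rng.uniform(0.8, 1.0, N)          # parameter particles (prior over a)

for z in observations:
    pa = pa + rng.normal(0.0, 0.002, N)          # roughening keeps diversity
    px = pa * px + rng.normal(0.0, 0.05, N)      # propagate through the model
    w = np.exp(-0.5 * ((z - px) / 0.2) ** 2)     # measurement likelihood
    w /= w.sum()
    keep = rng.choice(N, N, p=w)                 # multinomial resampling
    px, pa = px[keep], pa[keep]

a_hat = float(pa.mean())
print(a_hat)  # should land close to the true value 0.95
```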

  10. A full annual cycle modeling framework for American black ducks

    USGS Publications Warehouse

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.; Brook, Rodney W.; Huang, Min; Jones, Malcom; McAuley, Daniel G.; Zimmerman, Guthrie

    2016-01-01

    American black ducks (Anas rubripes) are a harvested, international migratory waterfowl species in eastern North America. Despite an extended period of restrictive harvest regulations, the black duck population is still below the population goal identified in the North American Waterfowl Management Plan (NAWMP). It has been hypothesized that density-dependent factors restrict population growth in the black duck population and that habitat management (increases, improvements, etc.) may be a key component of growing black duck populations and reaching the prescribed NAWMP population goal. Using banding data from 1951 to 2011 and breeding population survey data from 1990 to 2014, we developed a full annual cycle population model for the American black duck. This model uses the seven management units as set by the Black Duck Joint Venture, allows movement into and out of each unit during each season, and models survival and fecundity for each region separately. We compare model population trajectories with observed population data and abundance estimates from the breeding season counts to show the accuracy of this full annual cycle model. With this model, we then show how to simulate the effects of habitat management on the continental black duck population.

  11. Design theoretic analysis of three system modeling frameworks.

    SciTech Connect

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Flying Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  12. Data-Model Comparisons of the October, 2002 Event Using the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Chappell, C. R.; Schunk, R. W.; Barakat, A. R.; Eccles, V.; Glocer, A.; Kistler, L. M.; Haaland, S.; Moore, T. E.

    2014-12-01

    The September 27 - October 4, 2002 time period has been selected by the Geospace Environment Modeling Ionospheric Outflow focus group for community collaborative study because of its high magnetospheric activity and extensive data coverage. The FAST, Polar, and Cluster missions, as well as others, all made key observations during this period, creating a prime event for data-model comparisons. The GEM community has come together to simulate this period using many different methods in order to evaluate models, compare results, and expand our knowledge of ionospheric outflow and its effects on global dynamics. This paper presents Space Weather Modeling Framework (SWMF) simulations of this important period compared against observations from the Polar TIDE, Cluster CODIF and EFW instruments. Density and velocity of oxygen and hydrogen throughout the lobes, plasmasheet, and inner magnetosphere will be the focus of these comparisons. For these simulations, the SWMF couples the multifluid version of BATS-R-US MHD to a variety of ionospheric outflow models of varying complexity. The simplest is outflow arising from constant MHD inner boundary conditions. Two first-principles-based models are also leveraged: the Polar Wind Outflow Model (PWOM), a fluid treatment of outflow dynamics, and the Generalized Polar Wind (GPW) model, which combines fluid and particle-in-cell approaches. Each model is capable of capturing a different set of energization mechanisms, yielding different outflow results. The data-model comparisons will illustrate how well each approach captures reality and which energization mechanisms are most important. This work will also assess our current capability to reproduce ionosphere-magnetosphere mass coupling.

  13. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  14. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    SciTech Connect

    Morrison, M.L.; Pollock, K.H.

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  15. Model Components of the Certification Framework for Geologic Carbon Sequestration Risk Assessment

    SciTech Connect

    Oldenburg, Curtis M.; Bryant, Steven L.; Nicot, Jean-Philippe; Kumar, Navanit; Zhang, Yingqi; Jordan, Preston; Pan, Lehua; Granvold, Patrick; Chow, Fotini K.

    2009-06-01

    We have developed a framework for assessing the leakage risk of geologic carbon sequestration sites. This framework, known as the Certification Framework (CF), emphasizes wells and faults as the primary potential leakage conduits. Vulnerable resources are grouped into compartments, and impacts due to leakage are quantified by the leakage flux or concentrations that could potentially occur in compartments under various scenarios. The CF utilizes several model components to simulate leakage scenarios. One model component is a catalog of results of reservoir simulations that can be queried to estimate plume travel distances and times, rather than requiring CF users to run new reservoir simulations for each case. Other model components developed for the CF and described here include fault characterization using fault-population statistics; fault connection probability using fuzzy rules; well-flow modeling with a drift-flux model implemented in TOUGH2; and atmospheric dense-gas dispersion using a mesoscale weather prediction code.

  16. A Reusable Framework for Regional Climate Model Evaluation

    NASA Astrophysics Data System (ADS)

    Hart, A. F.; Goodale, C. E.; Mattmann, C. A.; Lean, P.; Kim, J.; Zimdars, P.; Waliser, D. E.; Crichton, D. J.

    2011-12-01

    Climate observations are currently obtained through a diverse network of sensors and platforms that include space-based observatories, airborne and seaborne platforms, and distributed, networked, ground-based instruments. These global observational measurements are critical inputs to the efforts of the climate modeling community and can provide a corpus of data for use in analysis and validation of climate models. The Regional Climate Model Evaluation System (RCMES) is an effort currently being undertaken to address the challenges of integrating this vast array of observational climate data into a coherent resource suitable for performing model analysis at the regional level. Developed through a collaboration between the NASA Jet Propulsion Laboratory (JPL) and the UCLA Joint Institute for Regional Earth System Science and Engineering (JIFRESSE), the RCMES uses existing open source technologies (MySQL, Apache Hadoop, and Apache OODT), to construct a scalable, parametric, geospatial data store that incorporates decades of observational data from a variety of NASA Earth science missions, as well as other sources into a consistently annotated, highly available scientific resource. By eliminating arbitrary partitions in the data (individual file boundaries, differing file formats, etc), and instead treating each individual observational measurement as a unique, geospatially referenced data point, the RCMES is capable of transforming large, heterogeneous collections of disparate observational data into a unified resource suitable for comparison to climate model output. This facility is further enhanced by the availability of a model evaluation toolkit which consists of a set of Python libraries, a RESTful web service layer, and a browser-based graphical user interface that allows for orchestration of model-to-data comparisons by composing them visually through web forms. 
This combination of tools and interfaces dramatically simplifies the process of interacting with and

  17. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we began developing a new open source modeling environment, DeltaShell, which integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models and generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular design based on open standards that allows for easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  18. A Flexible Atmospheric Modeling Framework for the CESM

    SciTech Connect

    Randall, David; Heikes, Ross; Konor, Celal

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs in only hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize the land-surface processes.

  19. ORCHESTRA: an object-oriented framework for implementing chemical equilibrium models.

    PubMed

    Meeussen, Johannes C L

    2003-03-15

    This work presents a new object-oriented structure for chemical equilibrium calculations that is used in the modeling framework ORCHESTRA (Objects Representing CHEmical Speciation and TRAnsport). In contrast to standard chemical equilibrium algorithms, such as MINEQL, MINTEQ2A, PHREEQC, and ECOSAT, model equations are not hard-coded in the source code; instead, all equations are defined in text format and read by the ORCHESTRA calculation kernel at run time. This makes model definitions easily accessible and extendible by users. Furthermore, it results in a very compact and efficient calculation kernel that is easy to use as a submodel within mass transport or kinetic models. Finally, the object-oriented structure of the chemical model definitions enables a framework built from three basic object types (entities, reactions, and phases), which form the building blocks from which other chemical models are composed. The hierarchical approach ensures consistent and compact model definitions and is illustrated here by discussing the implementation of a number of commonly used chemical models, such as aqueous complexation, activity correction, precipitation, surface complexation, ion exchange, and several more sophisticated adsorption models including electrostatic interactions, NICA, and CD-MUSIC. The ORCHESTRA framework is electronically available from www.macaulay.ac.uk/ORCHESTRA. PMID:12680672
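
    The distinctive point, model equations supplied as text and interpreted at run time rather than hard-coded, can be caricatured in a few lines. The reaction string format and the single monoprotic-acid equilibrium below are hypothetical illustrations, not ORCHESTRA's actual input syntax:

```python
# A reaction definition parsed from text rather than hard-coded in the kernel
# (format and names are hypothetical, for illustration only)
reaction_text = "HA = H + A ; logK = -4.76"   # acetic-acid-like dissociation

stoich, k_part = reaction_text.split(";")      # a real parser would use stoich
K = 10.0 ** float(k_part.split("=")[1])

def solve_equilibrium(total_ha=0.1):
    """Solve x^2 / (total - x) = K for the dissociated amount x by bisection."""
    lo, hi = 0.0, total_ha
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if mid * mid / (total_ha - mid) > K:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

x = solve_equilibrium()
print(x)  # equilibrium [H+] = [A-]
```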

  20. Introducing a boreal wetland model within the Earth System model framework

    NASA Astrophysics Data System (ADS)

    Getzieh, R. J.; Brovkin, V.; Reick, C.; Kleinen, T.; Raddatz, T.; Raivonen, M.; Sevanto, S.

    2009-04-01

    Wetlands of the northern high latitudes, with their low temperatures and waterlogged conditions, are a prerequisite for peat accumulation. They store at least 25% of the global soil organic carbon and currently constitute the largest natural source of methane. These boreal and subarctic peat carbon pools are sensitive to climate change, since the ratio of carbon sequestration to emission depends closely on hydrology and temperature. Global biogeochemistry models used for simulations of CO2 dynamics in past and future climates usually ignore changes in peat storage. Our approach aims to evaluate the boreal wetland feedback to climate through the CO2 and CH4 fluxes on decadal to millennial time scales. A generic model of organic matter accumulation and decay in boreal wetlands is under development at the MPI for Meteorology in cooperation with the University of Helsinki. Our approach is to develop a wetland model consistent with the physical and biogeochemical components of the land surface module JSBACH, part of the Earth System model framework ECHAM5-MPIOM-JSBACH. As prototypes, we use the modelling approach of Frolking et al. (2001) for the peat dynamics and the wetland model of Wania (2007) for vegetation cover and plant productivity. The initial distribution of wetlands follows the GLWD-3 map of Lehner and Döll (2004). First results of the modelling approach will be presented. References: Frolking, S. E., N. T. Roulet, T. R. Moore, P. J. H. Richard, M. Lavoie and S. D. Muller (2001): Modeling Northern Peatland Decomposition and Peat Accumulation, Ecosystems, 4, 479-498. Lehner, B., Döll, P. (2004): Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology 296 (1-4), 1-22. Wania, R. (2007): Modelling northern peatland land surface processes, vegetation dynamics and methane emissions. PhD thesis, University of Bristol, 122 pp.

  1. Towards uncertainty quantification and parameter estimation for Earth system models in a component-based modeling framework

    NASA Astrophysics Data System (ADS)

    Peckham, Scott D.; Kelbert, Anna; Hill, Mary C.; Hutton, Eric W. H.

    2016-05-01

    Component-based modeling frameworks make it easier for users to access, configure, couple, run and test numerical models. However, they do not typically provide tools for uncertainty quantification or data-based model verification and calibration. To better address these important issues, modeling frameworks should be integrated with existing, general-purpose toolkits for optimization, parameter estimation and uncertainty quantification. This paper identifies and then examines the key issues that must be addressed in order to make a component-based modeling framework interoperable with general-purpose packages for model analysis. As a motivating example, one of these packages, DAKOTA, is applied to a representative but nontrivial surface process problem of comparing two models for the longitudinal elevation profile of a river to observational data. Results from a new mathematical analysis of the resulting nonlinear least squares problem are given and then compared to results from several different optimization algorithms in DAKOTA.
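
    The motivating example, fitting model parameters to an observed longitudinal river elevation profile by nonlinear least squares, can be sketched with a general-purpose optimizer. The exponential profile form, the synthetic data, and the use of SciPy (rather than DAKOTA) are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)

# Synthetic "longitudinal river profile": elevation decays exponentially downstream
x = np.linspace(0.0, 100.0, 40)                  # downstream distance (km)
z_true = 500.0 * np.exp(-0.03 * x)               # hypothetical true profile
z_obs = z_true + rng.normal(0.0, 5.0, x.size)    # noisy elevation observations

def residuals(p):
    """Misfit between the candidate profile model and the observations."""
    z0, k = p
    return z0 * np.exp(-k * x) - z_obs

fit = least_squares(residuals, x0=[300.0, 0.01])  # rough initial guess
z0_hat, k_hat = fit.x
print(z0_hat, k_hat)  # should recover roughly (500, 0.03)
```

    A toolkit like DAKOTA wraps exactly this kind of misfit function, but adds optimizer comparisons, sensitivity analysis, and uncertainty quantification around it.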

  2. A Positive Test for Fermi-Dirac Distributions of Quark-Partons

    NASA Astrophysics Data System (ADS)

    Buccella, Franco; Pisanti, Ofelia; Rosa, Luigi; Dorsner, Ilya; Santorelli, Pietro

    By describing a large class of deep inelastic processes with a standard parametrization for the different parton species, we check the characteristic relationship dictated by the Pauli principle: broader shapes for higher first moments. Indeed, the ratio between the second and first moments and the one between the third and second moments for the valence partons are increasing functions of the first moment and agree quantitatively with the values found with Fermi-Dirac distributions.
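
    The quoted relationship, broader shapes for higher first moments, is easy to probe numerically for a Fermi-Dirac-shaped x-distribution. The particular shape parameters below are arbitrary illustrative choices, not the paper's fitted parton distributions:

```python
import numpy as np

def moments(x0, w=0.1, n_grid=20000):
    """First three moments of a Fermi-Dirac-shaped x-distribution on [0, 1]."""
    x = np.linspace(1e-6, 1.0, n_grid)
    dx = x[1] - x[0]
    f = 1.0 / (np.exp((x - x0) / w) + 1.0)
    return [float(np.sum(x**k * f) * dx) for k in (1, 2, 3)]

m_narrow = moments(x0=0.1)   # lower "Fermi level": smaller first moment
m_broad = moments(x0=0.3)    # higher "Fermi level": larger first moment

# Pauli-principle signature: the higher first moment comes with the broader
# shape, i.e. larger moment ratios m2/m1 and m3/m2
print(m_narrow[0], m_broad[0])
print(m_narrow[1] / m_narrow[0], m_broad[1] / m_broad[0])
```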

  3. A framework for modelling kinematic measurements in gravity field applications

    NASA Technical Reports Server (NTRS)

    Schwarz, K. P.; Wei, M.

    1989-01-01

    To assess the resolution of the local gravity field from kinematic measurements, a state model for motion in the gravity field of the earth is formulated. The resulting set of equations can accommodate gravity gradients, specific force, acceleration, velocity and position as input data and can take into account approximation errors as well as sensor errors.

  4. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  5. Bilingual Education Program Models: A Framework for Understanding.

    ERIC Educational Resources Information Center

    Roberts, Cheryl A.

    1995-01-01

    Examines the goals, outcomes, and educational costs and benefits of various models of bilingual education: "submersion" (mainstreaming without support); pull-out classes for English as a Second Language; transitional bilingual education; maintenance bilingual education; enrichment, two-way, or developmental bilingual education; and the Canadian…

  6. COMPASS: A Framework for Automated Performance Modeling and Prediction

    SciTech Connect

    Lee, Seyong; Meredith, Jeremy S; Vetter, Jeffrey S

    2015-01-01

    Flexible, accurate performance predictions offer numerous benefits, such as gaining insight into and optimizing applications and architectures. However, the development and evaluation of such performance predictions has been a major research challenge due to architectural complexity. To address this challenge, we have designed and implemented a prototype system, named COMPASS, for automated performance model generation and prediction. COMPASS generates a structured performance model from the target application's source code using automated static analysis, and then it evaluates this model using various performance prediction techniques. As we demonstrate on several applications, the results of these predictions can be used for a variety of purposes, such as design space exploration, identifying performance tradeoffs for applications, and understanding the sensitivities of important parameters. COMPASS can generate these predictions for several types of applications, from traditional sequential CPU applications to GPU-based, heterogeneous, parallel applications. Our empirical evaluation demonstrates a maximum overhead of 4%, the flexibility to generate models for 9 applications, speed, ease of creation, and very low relative errors across a diverse set of architectures.

  7. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  8. Understanding cardiac alternans: A piecewise linear modeling framework

    NASA Astrophysics Data System (ADS)

    Thul, R.; Coombes, S.

    2010-12-01

    Cardiac alternans is a beat-to-beat alternation in action potential duration (APD) and intracellular calcium (Ca2+) cycling seen in cardiac myocytes under rapid pacing that is believed to be a precursor to fibrillation. The cellular mechanisms of these rhythms and the coupling between cellular Ca2+ and voltage dynamics have been extensively studied leading to the development of a class of physiologically detailed models. These have been shown numerically to reproduce many of the features of myocyte response to pacing, including alternans, and have been analyzed mathematically using various approximation techniques that allow for the formulation of a low dimensional map to describe the evolution of APDs. The seminal work by Shiferaw and Karma is of particular interest in this regard [Shiferaw, Y. and Karma, A., "Turing instability mediated by voltage and calcium diffusion in paced cardiac cells," Proc. Natl. Acad. Sci. U.S.A. 103, 5670-5675 (2006)]. Here, we establish that the key dynamical behaviors of the Shiferaw-Karma model are arranged around a set of switches. These are shown to be the main elements for organizing the nonlinear behavior of the model. Exploiting this observation, we show that a piecewise linear caricature of the Shiferaw-Karma model, with a set of appropriate switching manifolds, can be constructed that preserves the physiological interpretation of the original model while being amenable to a systematic mathematical analysis. In illustration of this point, we formulate the dynamics of Ca2+ cycling (in response to pacing) and compute the properties of periodic orbits in terms of a stroboscopic map that can be constructed without approximation. Using this, we show that alternans emerge via a period-doubling instability and track this bifurcation in terms of physiologically important parameters. We also show that when coupled to a spatially extended model for Ca2+ transport, the model supports spatially varying patterns of alternans. We analyze
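
    The route to alternans described above, a period-doubling instability of a beat-to-beat map, can be reproduced with the classic one-dimensional APD restitution caricature (not the Shiferaw-Karma model itself; the exponential restitution curve and constants below are generic textbook choices). Alternans appears once the restitution slope at the fixed point exceeds one:

```python
import numpy as np

def restitution(di, a_max=300.0, tau=60.0):
    """Generic exponential APD restitution curve (ms)."""
    return a_max * (1.0 - np.exp(-di / tau))

def iterate_apd(bcl, n=400):
    """Iterate APD[k+1] = f(BCL - APD[k]); return the last four beats."""
    apd, out = 200.0, []
    for _ in range(n):
        di = max(bcl - apd, 1.0)   # diastolic interval cannot go negative
        apd = restitution(di)
        out.append(apd)
    return out[-4:]

slow = iterate_apd(bcl=500.0)   # slow pacing: slope < 1, steady 1:1 rhythm
fast = iterate_apd(bcl=250.0)   # rapid pacing: slope > 1, beat-to-beat alternans
print(max(slow) - min(slow), max(fast) - min(fast))
```

    At the slow pacing rate the map settles on a fixed point (every beat identical); at the fast rate it settles on a period-2 orbit, the map-level signature of alternans.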

  9. An advanced model framework for solid electrolyte intercalation batteries.

    PubMed

    Landstorfer, Manuel; Funken, Stefan; Jacob, Timo

    2011-07-28

    Recent developments of solid electrolytes, especially lithium ion conductors, led to all solid state batteries for various applications. In addition, mathematical models have proliferated for different electrode materials and battery types, but are missing for solid electrolyte cells. We present a mathematical model for ion flux in solid electrolytes, based on non-equilibrium thermodynamics and functional derivatives. Intercalated ion diffusion within the electrodes is further considered, allowing the computation of the ion concentration at the electrode/electrolyte interface. A generalized Frumkin-Butler-Volmer equation describes the kinetics of (de-)intercalation reactions and is here extended to non-blocking electrodes. Using this approach, numerical simulations were carried out to investigate the space charge region at the interface. Finally, discharge simulations were performed to study different limitations of an all solid state battery cell. PMID:21681301
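    For context, the classical Butler-Volmer rate expression that the generalized Frumkin-Butler-Volmer equation extends can be sketched as follows (parameter values are illustrative; the paper's generalization additionally couples in local ion concentrations and the space-charge potential drop):

```python
import math

F, R, T = 96485.0, 8.314, 298.15  # Faraday const., gas const., temperature (K)

def butler_volmer(i0, eta, alpha_a=0.5, alpha_c=0.5):
    """Classical Butler-Volmer current density as a function of
    overpotential eta, with anodic/cathodic transfer coefficients."""
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))
```

At zero overpotential the anodic and cathodic branches cancel, and with symmetric transfer coefficients the curve is antisymmetric in eta.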

  10. An efficient framework for modeling clouds from Landsat8 images

    NASA Astrophysics Data System (ADS)

    Yuan, Chunqiang; Guo, Jing

    2015-03-01

    Cloud plays an important role in creating realistic outdoor scenes for video game and flight simulation applications. Classic methods have been proposed for cumulus cloud modeling. However, these methods are not flexible for modeling large cloud scenes with hundreds of clouds, because the user must model each cloud individually and adjust its various properties. This paper presents a meteorologically based method to reconstruct cumulus clouds from high resolution Landsat8 satellite images. From these input satellite images, the clouds are first segmented from the background. Then, the cloud top surface is estimated from the temperature of the infrared image. After that, under a mild assumption of flat base for cumulus cloud, the base height of each cloud is computed by averaging the top height for pixels on the cloud edge. Then, the extinction is generated from the visible image. Finally, we enrich the initial shapes of clouds using a fractal method and represent the recovered clouds as a particle system. The experimental results demonstrate that our method can yield realistic cloud scenes resembling those in the satellite images.
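    The flat-base step, averaging retrieved top heights over cloud-edge pixels, can be sketched as a minimal version on synthetic data (the paper operates on segmented Landsat8 rasters):

```python
import numpy as np

def cloud_base_height(top, mask):
    """Average the retrieved top height over in-cloud pixels that touch
    the cloud edge (i.e. have at least one out-of-cloud 4-neighbour)."""
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    edge = mask & ~interior
    return float(top[edge].mean())
```

For a 3x3 cloud patch whose eight border pixels all have top height 5.0, the estimated flat base is 5.0 regardless of the interior value.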

  11. Computational fluid dynamics framework for aerodynamic model assessment

    NASA Astrophysics Data System (ADS)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive some disagreement was seen, particularly during periods of large rates of change in attitudes. Finally, the Ranger 2000 model was used on a flight simulator.
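    The tabular-model idea, steady-state force and moment coefficients interpolated as functions of flight condition, reduces in one dimension to something like the sketch below (all numbers are purely illustrative, not data from the paper):

```python
import numpy as np

# Hypothetical table: steady-state lift coefficient vs angle of attack (deg)
alpha_tab = np.array([-4.0, 0.0, 4.0, 8.0, 12.0])
cl_tab    = np.array([-0.2, 0.1, 0.5, 0.9, 1.1])

def cl_lookup(alpha):
    """Linear interpolation, as in a table-driven flight-dynamics model."""
    return np.interp(alpha, alpha_tab, cl_tab)
```

Replaying a manoeuvre through such a table implicitly assumes quasi-steady aerodynamics, which is why agreement degrades during rapid attitude changes.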

  12. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-01-01

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a design-pattern-based methodology that eases the development of new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined. PMID:27323045

  13. Periodic model of LTA framework containing various non-tetrahedral cations.

    PubMed

    Koleżyński, A; Mikuła, A; Król, M

    2016-03-15

    A simplified periodic model of Linde Type A zeolite (LTA) structure with various selected mono- and di-valent extra-framework cations was formulated. Ab initio calculations (geometry optimization and vibrational spectra calculations) using the proposed model were carried out by means of Crystal09 program. The resulting structures and simulated spectra were analyzed in detail and compared with the experimental ones. The presented results show that in most cases the proposed model agrees well with experimental results. Individual bands were assigned to respective normal modes of vibration and the changes resulting from the selective substitution of extra framework cations were described and explained. PMID:26702792

  14. Periodic model of LTA framework containing various non-tetrahedral cations

    NASA Astrophysics Data System (ADS)

    Koleżyński, A.; Mikuła, A.; Król, M.

    2016-03-01

    A simplified periodic model of Linde Type A zeolite (LTA) structure with various selected mono- and di-valent extra-framework cations was formulated. Ab initio calculations (geometry optimization and vibrational spectra calculations) using the proposed model were carried out by means of Crystal09 program. The resulting structures and simulated spectra were analyzed in detail and compared with the experimental ones. The presented results show that in most cases the proposed model agrees well with experimental results. Individual bands were assigned to respective normal modes of vibration and the changes resulting from the selective substitution of extra framework cations were described and explained.

  15. A framework to assess the realism of model structures using hydrological signatures

    NASA Astrophysics Data System (ADS)

    Euser, T.; Winsemius, H. C.; Hrachowitz, M.; Fenicia, F.; Uhlenbrook, S.; Savenije, H. H. G.

    2013-05-01

    The use of flexible hydrological model structures for hypothesis testing requires an objective and diagnostic method to identify whether a rainfall-runoff model structure is suitable for a certain catchment. To determine if a model structure is realistic, i.e. if it captures the relevant runoff processes, both performance and consistency are important. We define performance as the ability of a model structure to mimic a specific part of the hydrological behaviour in a specific catchment. This can be assessed based on evaluation criteria, such as the goodness of fit of specific hydrological signatures obtained from hydrological data. Consistency is defined as the ability of a model structure to adequately reproduce several hydrological signatures simultaneously while using the same set of parameter values. In this paper we describe and demonstrate a new evaluation Framework for Assessing the Realism of Model structures (FARM). The evaluation framework tests for both performance and consistency using a principal component analysis on a range of evaluation criteria, all emphasizing different hydrological behaviour. The utility of this evaluation framework is demonstrated in a case study of two small headwater catchments (Maimai, New Zealand, and Wollefsbach, Luxembourg). Eight different hydrological signatures and eleven model structures have been used for this study. The results suggest that some model structures may reveal the same degree of performance for selected evaluation criteria while showing differences in consistency. The results also show that some model structures have a higher performance and consistency than others. The principal component analysis in combination with several hydrological signatures is shown to be useful to visualise the performance and consistency of a model structure for the study catchments. 
With this framework, performance and consistency are evaluated to identify which model structure suits a catchment better than other model structures.
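    The core of the consistency check, projecting the matrix of evaluation criteria (model structures by signatures) onto principal components, can be sketched with toy data (the study itself uses eleven structures and eight signatures):

```python
import numpy as np

def pca_scores(criteria):
    """Rows = model structures, columns = signature-based evaluation
    criteria. Returns principal-component scores and explained variances."""
    X = (criteria - criteria.mean(axis=0)) / criteria.std(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt.T, s**2 / (len(X) - 1)

# Hypothetical criteria values for four model structures, three signatures
criteria = np.array([[0.9, 0.8, 0.7],
                     [0.6, 0.5, 0.4],
                     [0.3, 0.4, 0.2],
                     [0.8, 0.7, 0.6]])
scores, variances = pca_scores(criteria)
```

Structures whose scores cluster tightly across components perform consistently over all signatures; spread along a component indicates criteria the structure trades off against each other.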

  16. A review of the quantification and communication of uncertainty associated with geological framework models

    NASA Astrophysics Data System (ADS)

    Mathers, Steve; Lark, Murray

    2015-04-01

    Digital Geological Framework Models show geology in three dimensions; they can most easily be thought of as 3D geological maps. The volume of the model is divided into distinct geological units using a suitable rock classification in the same way that geological maps are. Like geological maps the models are generic and many are intended to be fit for any geoscience purpose. Over the last decade many Geological Survey Organisations (GSOs) worldwide have begun to communicate their geological understanding of the subsurface through Geological Framework Models and themed derivatives, and the traditional printed geological map has been increasingly phased out. Building Geological Framework Models entails the assembly of all the known geospatial information into a single workspace for interpretation. The calculated models are commonly displayed as either a stack of geological surfaces or boundaries (unit tops, bases, unconformities) or as solid calculated blocks of 3D geology with the unit volumes infilled with colour or symbols. The studied volume, however, must be completely populated, so decisions on the subsurface distribution of units must be made even where considerable uncertainty exists. There is naturally uncertainty associated with any Geological Framework Model, and this is composed of two main components: the uncertainty in the geospatial data used to constrain the model, and the uncertainty related to the model construction, which includes factors such as choice of modeller(s), choice of software(s), and modelling workflow. Uncertainty is the inverse of confidence, reliability or certainty; other closely related terms include risk, commonly used in preference to uncertainty where financial or safety matters are presented, and probability, used as a statistical measure of uncertainty. We can consider uncertainty in geological framework models to be of two main types: Uncertainty in the geospatial data used to constrain the model; this differs with the distinct

  17. Using a guided inquiry and modeling instructional framework (EIMA) to support preservice K-8 science teaching

    NASA Astrophysics Data System (ADS)

    Schwarz, Christina V.; Gwekwerere, Yovita N.

    2007-01-01

    This paper presents results from a study aimed at helping preservice elementary and middle school teachers incorporate model-centered scientific inquiry into their science teaching practices. Specifically, the authors studied the effect of using a guided inquiry and modeling instructional framework (EIMA) and accompanying science methods instruction on preservice elementary teachers' science lesson design skills, scientific model use, and teaching orientations. Analysis of preservice teachers' pre-posttests, classroom artifacts, peer interviews, and lesson plans throughout the semester indicates that the framework successfully built on preservice teachers' prior instructional ideas, and that the majority of preservice teachers learned and used the framework in their lesson plans and teaching. Additionally, analysis of pre-posttest differences indicates an increase in posttest lesson plans that focused on engaging students in scientific inquiry using several kinds of models. Most importantly, the framework and accompanying instruction enabled two thirds of the class to move their teaching orientations away from discovery or didactic approaches toward reform-based approaches such as conceptual change, inquiry, and guided inquiry. Results from this study show that using instructional frameworks such as EIMA can enable preservice teachers to socially construct, synthesize, and apply their knowledge for enacting reform-oriented science teaching approaches such as model-centered scientific inquiry.

  18. Development of a framework for reporting health service models for managing rheumatoid arthritis.

    PubMed

    O'Donnell, Siobhan; Li, Linda C; King, Judy; Lauzon, Chantal; Finn, Heather; Vliet Vlieland, Theodora P M

    2010-02-01

    The purpose of this study was to develop a framework for reporting health service models for managing rheumatoid arthritis (RA). We conducted a search of the health sciences literature for primary studies that described interventions which aimed to improve the implementation of health services in adults with RA. Thereafter, a nominal group consensus process was used to synthesize the evidence for the development of the reporting framework. Of the 2,033 citations screened, 68 primary studies were included which described 93 health service models for RA. The origin and meaning of the labels given to these health service delivery models varied widely and, in general, the reporting of their components lacked detail or was absent. The six dimensions underlying the framework for reporting RA health service delivery models are: (1) Why was it founded? (2) Who was involved? (3) What were the roles of those participating? (4) When were the services provided? (5) Where were the services provided/received? (6) How were the services/interventions accessed and implemented, how long was the intervention, how did individuals involved communicate, and how was the model supported/sustained? The proposed framework has the potential to facilitate knowledge exchange among clinicians, researchers, and decision makers in the area of health service delivery. Future work includes the validation of the framework with national and international stakeholders such as clinicians, health care administrators, and health services researchers. PMID:19865842

  19. Evaluation of Hydrometeor Occurrence Profiles in the Multiscale Modeling Framework Climate Model using Atmospheric Classification

    SciTech Connect

    Marchand, Roger T.; Beagley, Nathaniel; Ackerman, Thomas P.

    2009-09-01

    Vertical profiles of hydrometeor occurrence from the Multiscale Modeling Framework (MMF) climate model are compared with profiles observed by a vertically pointing millimeter wavelength cloud-radar (located in the U.S. Southern Great Plains) as a function of the large-scale atmospheric state. The atmospheric state is determined by classifying (or clustering) the large-scale (synoptic) fields produced by the MMF and a numerical weather prediction model using a neural network approach. The comparison shows that for cold frontal and post-cold frontal conditions the MMF produces profiles of hydrometeor occurrence that compare favorably with radar observations, while for warm frontal conditions the model tends to produce hydrometeor fractions that are too large with too much cloud (non-precipitating hydrometeors) above 7 km and too much precipitating hydrometeor coverage below 7 km. We also find that the MMF has difficulty capturing the formation of low clouds and that for all atmospheric states that occur during June, July, and August, the MMF produces too much high and thin cloud, especially above 10 km.

  20. Parameter Estimation for Differential Equation Models Using a Framework of Measurement Error in Regression Models.

    PubMed

    Liang, Hua; Wu, Hulin

    2008-12-01

    Differential equation (DE) models are widely used in many scientific fields that include engineering, physics and biomedical sciences. The so-called "forward problem", the problem of simulations and predictions of state variables for given parameter values in the DE models, has been extensively studied by mathematicians, physicists, engineers and other scientists. However, the "inverse problem", the problem of parameter estimation based on the measurements of output variables, has not been well explored using modern statistical methods, although some least squares-based approaches have been proposed and studied. In this paper, we propose parameter estimation methods for ordinary differential equation models (ODE) based on the local smoothing approach and a pseudo-least squares (PsLS) principle under a framework of measurement error in regression models. The asymptotic properties of the proposed PsLS estimator are established. We also compare the PsLS method to the corresponding SIMEX method and evaluate their finite sample performances via simulation studies. We illustrate the proposed approach using an application example from an HIV dynamic study. PMID:19956350
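    The two-step idea, estimating the state derivative nonparametrically and then solving a least-squares problem in the ODE parameters, can be sketched for the scalar model dx/dt = -theta*x (central differences stand in here for the paper's local-polynomial smoother; all numbers are illustrative):

```python
import numpy as np

# Synthetic noisy observations of x(t) = exp(-0.5 t), true theta = 0.5
rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 0.01)
x_obs = np.exp(-0.5 * t) + rng.normal(0.0, 0.002, t.size)

# Pseudo-least-squares: plug the estimated derivative into the ODE and
# solve min_theta sum (dx/dt + theta*x)^2 in closed form.
dx = np.gradient(x_obs, t)
theta_hat = -np.sum(dx * x_obs) / np.sum(x_obs**2)
```

The recovered theta_hat lands near the true value 0.5 without ever solving the ODE numerically, which is the computational appeal of the approach.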

  1. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" between Physical Experiments and Virtual Models in Biology

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-01-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…

  2. A modeling framework for characterizing near-road air pollutant concentration at community scales

    EPA Science Inventory

    In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  3. A Theoretical Framework for Research in Algebra: Modification of Janvier's "Star" Model of Function Understanding.

    ERIC Educational Resources Information Center

    Bowman, Anita H.

    A pentagonal model, based on the star model of function understanding of C. Janvier (1987), is presented as a framework for the design and interpretation of research in the area of learning the concept of mathematical function. The five vertices of the pentagon correspond to five common representations of mathematical function: (1) graph; (2)…

  4. PARCC Model Content Frameworks: English Language Arts/Literacy--Grades 3-11

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    As part of its proposal to the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) committed to developing model content frameworks for English language arts/literacy (ELA/Literacy) to serve as a bridge between the Common Core State Standards and the PARCC assessments. The PARCC Model Content…

  5. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…
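    An odds ratio for distractor functioning compares, between reference and focal groups, the odds of selecting a given distractor; a minimal sketch with hypothetical counts:

```python
import math

def log_odds_ratio(n_ref_d, n_ref_o, n_foc_d, n_foc_o):
    """2x2 distractor table: counts choosing the distractor (d) versus
    any other response (o) in the reference and focal groups.
    Returns the log odds ratio and its large-sample standard error."""
    or_ = (n_ref_d * n_foc_o) / (n_ref_o * n_foc_d)
    se = math.sqrt(1/n_ref_d + 1/n_ref_o + 1/n_foc_d + 1/n_foc_o)
    return math.log(or_), se
```

A log odds ratio far from zero relative to its standard error flags the distractor as functioning differently across groups.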

  6. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  7. Modeling framework for exploring emission impacts of alternative future scenarios

    NASA Astrophysics Data System (ADS)

    Loughlin, D. H.; Benjey, W. G.; Nolte, C. G.

    2010-11-01

    This article presents an approach for creating anthropogenic emission scenarios that can be used to simulate future regional air quality. The approach focuses on energy production and use since these are principal sources of air pollution. We use the MARKAL model to characterize alternative realizations of the US energy system through 2050. Emission growth factors are calculated for major energy system categories using MARKAL, while growth factors from non-energy sectors are based on economic and population projections. The SMOKE model uses these factors to grow a base-year 2002 inventory to future years through 2050. The approach is demonstrated for two emission scenarios: Scenario 1 extends current air regulations through 2050, while Scenario 2 applies a hypothetical policy that limits carbon dioxide (CO2) emissions from the energy system. Although both scenarios show significant reductions in air pollutant emissions through time, these reductions are more pronounced in Scenario 2, where the CO2 policy results in the adoption of technologies with lower emissions of both CO2 and traditional air pollutants. The methodology is expected to play an important role in investigations of linkages among emission drivers, climate and air quality by the U.S. EPA and others.
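    Mechanically, growing a base-year inventory with sector growth factors is a per-category multiplication; a toy sketch (category names and values are invented for illustration, not MARKAL output):

```python
# Hypothetical base-year 2002 emissions by category (arbitrary units)
base_2002 = {"EGU_NOx": 4.2, "onroad_NOx": 6.8, "nonenergy_NH3": 1.1}

# Hypothetical 2050 growth factors: energy-system categories from MARKAL,
# non-energy categories from economic/population projections
growth_2050 = {"EGU_NOx": 0.35, "onroad_NOx": 0.20, "nonenergy_NH3": 1.30}

future_2050 = {cat: base_2002[cat] * growth_2050[cat] for cat in base_2002}
```

SMOKE applies factors of exactly this kind when projecting the 2002 inventory to future years for air quality simulation.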

  8. A Bayesian modelling framework for tornado occurrences in North America

    NASA Astrophysics Data System (ADS)

    Cheng, Vincent Y. S.; Arhonditsis, George B.; Sills, David M. L.; Gough, William A.; Auld, Heather

    2015-03-01

    Tornadoes represent one of nature’s most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year.

  9. HART: An Efficient Modeling Framework for Simulated Solar Imaging

    NASA Astrophysics Data System (ADS)

    Benkevitch, L. V.; Oberoi, D.; Benjamin, M. D.; Sokolov, I. V.

    2012-09-01

    The Haystack & AOSS Ray Tracer (HART) is a software tool for modeling propagation of electromagnetic radiation through a realistic description of the magnetized solar corona, along with the associated radiative transfer effects. Its primary outputs are solar brightness temperature (or flux density) images corresponding to a user-specified coronal description and radio frequency. HART is based on native high-efficiency algorithms coded in the C language, and provides convenient command-line (Python) and graphical user interfaces. HART is a necessary tool for enabling the extraction of solar physics from the images that will be produced by the new generation of low radio frequency arrays like the Murchison Widefield Array (MWA), Low Frequency Array (LOFAR) and Long Wavelength Array (LWA).

  10. A Bayesian modelling framework for tornado occurrences in North America.

    PubMed

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-01-01

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year. PMID:25807465

  11. The Modular Modeling System (MMS): A modeling framework for water- and environmental-resources management

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.

    2004-01-01

    The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for

  12. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing, the Klemeš Crash Tests (KCTs), the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, has yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions; ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building

  13. Spatiotemporal nonpoint source pollution water quality management framework using bi-directional model-GIS linkage

    SciTech Connect

    Faizullabhoy, M.S.; Yoon, J.

    1999-07-01

    A framework for water quality assessment and management purposes was developed. In this framework, a bilateral linkage was implemented between the distributed model, Agricultural Nonpoint Source Pollution Model (AGNPS), and the Geographic Information System (GIS) to investigate a spatiotemporal nonpoint source pollution problem from a 750-acre watershed in the NSGA (Naval Security Group Activity) Northwest base at the Virginia/North Carolina border. AGNPS is an event-based, distributed parameter model that simulates runoff and the transport of sediment and nutrients (nitrogen and phosphorus) from predominantly agricultural watersheds. In this study, rather than manually implementing AGNPS simulations, extracted data are integrated in an automated fashion through a direct bilateral linkage framework between the AGNPS model engine and the GIS. This bilateral linkage framework resulted in a powerful, up-to-date tool capable of monitoring and instantaneously visualizing the transport of any pollutant as well as effectively identifying critical areas of nonpoint source (NPS) pollution. The framework also allowed various "what-if" scenarios to be explored in support of the decision-making processes. Best Management Practices (BMP) for the watershed can be generated in a closed-loop iterative scheme, until predefined management objectives are achieved. Simulated results showed that the optimal BMP scenario achieved an average reduction of about 41% in soluble and sediment-attached nitrogen and about 62% reduction in soluble and sediment phosphorus from current NPS pollution levels.
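    The closed-loop BMP scheme can be caricatured as iterating over candidate practices until the reduction target is met (practice names, removal efficiencies, and the multiplicative-removal assumption below are all hypothetical):

```python
def select_bmps(base_load, candidates, target_reduction):
    """Greedy closed-loop sketch: apply candidate BMPs, most effective
    first, until the fractional load reduction meets the objective.
    candidates: list of (name, fraction_of_load_removed)."""
    load, applied = base_load, []
    for name, eff in sorted(candidates, key=lambda c: -c[1]):
        if 1 - load / base_load >= target_reduction:
            break
        load *= (1 - eff)   # assume removals compound multiplicatively
        applied.append(name)
    return applied, 1 - load / base_load
```

In the real framework each candidate scenario is evaluated by a full AGNPS run through the GIS linkage rather than a fixed efficiency.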

  14. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Houborg, R.; Izaurralde, R. C.

    2014-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes because croplands constitute a large land area and have a large impact on global carbon cycle. A Satellite based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved crop specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop specific leaf area index (LAI) using downscaled MODIS reflectance data, and subsequently LAI estimates are integrated into the Environmental Policy Integrated Model (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). Firstly, we evaluate the performance of this modeling framework over three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount site, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for year 2007 and 2008 over the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Secondly, we apply the SDCC framework for estimating regional NPP and NEP for corn, soybean and sorghum crops in Nebraska during year 2007 and 2008. The methods and results will be presented.
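    At the bookkeeping level, the flux chain ends with net ecosystem productivity as NPP net of heterotrophic respiration; a toy sketch (values below are invented, in arbitrary units; in the paper daily NPP comes from EPIC driven by REGFLEC LAI):

```python
# Hypothetical daily values for three days
npp = [2.1, 2.4, 2.0]   # net primary productivity
rh  = [0.9, 1.0, 0.8]   # heterotrophic (soil) respiration

# NEP = NPP - Rh, the quantity compared against eddy covariance towers
nep = [p - r for p, r in zip(npp, rh)]
```

Positive NEP marks days the cropland acts as a net carbon sink, which is what the flux-tower comparison at Bondville, Fermi, and Rosemount evaluates.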

  15. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Izaurralde, R. C.; Sahajpal, R.; Houborg, R.; Milla, Z.

    2013-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes, because croplands constitute a large land area and have a large impact on the global carbon cycle. A Satellite-based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved, crop-specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop-specific leaf area index (LAI) from downscaled MODIS reflectance data; the LAI estimates are subsequently integrated into the Environmental Policy Integrated Climate (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). First, we evaluate the performance of this modeling framework at three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for 2007 and 2008 at the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Second, we apply the SDCC framework to estimate regional NPP and NEP for corn, soybean, and sorghum crops in Nebraska during 2007 and 2008. The methods and results will be presented.

  16. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    SciTech Connect

    Trebotich, D

    2006-06-24

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  17. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    SciTech Connect

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorist's actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  18. Modeling framework to link climate, hydrology and flood hazards: An application to Sacramento, California

    NASA Astrophysics Data System (ADS)

    Kim, B.; David, C. H.; Druffel-Rodriguez, R.; Sanders, B. F.; Famiglietti, J. S.

    2013-12-01

    The City of Sacramento and the broader delta region may be the most flood vulnerable urbanized area in the United States. Management of flood risk here and elsewhere requires an understanding of flooding hazards, which is in turn linked to California hydrology, climate, development and flood control infrastructure. A modeling framework is presented here to make predictions of flooding hazards (e.g., depth and velocity) at the household scale (personalized flood risk information), and to study how these predictions could change under different climate change, land-use change, and infrastructure adaptation scenarios. The framework couples a statewide hydrologic model (RAPID) that predicts runoff and streamflow to a city-scale hydrodynamic model (BreZo) capable of predicting levee-breach flows and overland flows into urbanized lowlands. Application of the framework to the Sacramento area is presented here, with a focus on data needs, computational demands, results and hazard communication strategies, for selected flooding scenarios.

  19. A new framework for modeling decisions about changing information: The Piecewise Linear Ballistic Accumulator model.

    PubMed

    Holmes, William R; Trueblood, Jennifer S; Heathcote, Andrew

    2016-03-01

    In the real world, decision making processes must be able to integrate non-stationary information that changes systematically while the decision is in progress. Although theories of decision making have traditionally been applied to paradigms with stationary information, non-stationary stimuli are now of increasing theoretical interest. We use a random-dot motion paradigm along with cognitive modeling to investigate how the decision process is updated when a stimulus changes. Participants viewed a cloud of moving dots, where the motion switched directions midway through some trials, and were asked to determine the direction of motion. Behavioral results revealed a strong delay effect: after presentation of the initial motion direction there is a substantial time delay before the changed motion information is integrated into the decision process. To further investigate the underlying changes in the decision process, we developed a Piecewise Linear Ballistic Accumulator model (PLBA). The PLBA is efficient to simulate, enabling it to be fit to participant choice and response-time distribution data in a hierarchical modeling framework using a non-parametric approximate Bayesian algorithm. Consistent with behavioral results, PLBA fits confirmed the presence of a long delay between presentation and integration of new stimulus information, but did not support increased response caution in reaction to the change. We also found the decision process was not veridical, as symmetric stimulus change had an asymmetric effect on the rate of evidence accumulation. Thus, the perceptual decision process was slow to react to, and underestimated, new contrary motion information. PMID:26760448
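A minimal numerical sketch of the piecewise idea described above (not the authors' fitted model; the rates, bound, and start-point range below are invented for illustration): each accumulator rises linearly at a pre-change rate until the stimulus switch time plus the integration delay, then at a post-change rate, and the first accumulator to reach the bound determines the choice.

```python
import random

def plba_trial(v_pre, v_post, t_change, b=1.0, a=0.5, rng=random):
    """One toy PLBA trial: two linear (ballistic) accumulators race to bound b.

    v_pre/v_post are pairs of drift rates before and after the effective
    change point t_change (stimulus switch time plus integration delay).
    Start points are drawn uniformly from [0, a] as trial-to-trial noise.
    All parameter values are illustrative; rates are assumed positive so
    every accumulator eventually reaches the bound."""
    finish = []
    for i in (0, 1):
        k = rng.uniform(0, a)                 # random start point
        if k + v_pre[i] * t_change >= b:      # bound hit before the kink
            finish.append((b - k) / v_pre[i])
        else:                                 # continue at the new rate
            left = b - k - v_pre[i] * t_change
            finish.append(t_change + left / v_post[i])
    choice = 0 if finish[0] <= finish[1] else 1
    return choice, min(finish)
```

Setting a=0 removes start-point noise and makes a single trial deterministic, which is convenient for checking the piecewise timing logic.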

  20. A Fuzzy Logic Framework for Integrating Multiple Learned Models

    SciTech Connect

    Bobi Kai Den Hartog

    1999-03-01

    The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (SI), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. SI produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. AP and AQ use automatically learned empirical biases for each of the Methods in each scenario. Through fuzzy logic, AP and AQ combine the scenario weights, the learned biases, and the Methods' results to determine results for a sample.
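The combination step described above can be sketched as a generic belief-weighted, bias-corrected average (the Combiner's actual rule matching and bias learning are not reproduced; all names below are hypothetical):

```python
def fuzzy_combine(scenario_weights, method_results, biases):
    """Belief-weighted, bias-corrected aggregation of Methods' results.

    scenario_weights: {scenario: fuzzy belief weight}
    method_results:   {method: numeric result for the sample}
    biases:           {(scenario, method): multiplicative empirical bias}

    A generic fuzzy-aggregation sketch of the idea in the abstract, not
    the Combiner's real rule base."""
    num = den = 0.0
    for sc, w in scenario_weights.items():
        for m, x in method_results.items():
            num += w * biases[(sc, m)] * x
            den += w
    return num / den
```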

  1. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on time step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change

  2. A Physics-Based Modeling Framework for Prognostic Studies

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics on the other hand is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, and interpretation of environmental, operational, and performance related parameters to indicate systems health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric-hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop electrochemistry-based models of Li-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable

  3. A framework for evaluating model error using asymptotic convergence in the Eady model

    NASA Astrophysics Data System (ADS)

    Visram, Abeed; Cotter, Colin; Cullen, Mike

    2013-04-01

    Operational weather forecasting requires the accurate simulation of atmospheric motions on scales ranging from the synoptic down to tens of kilometers. Weather fronts, ubiquitous in mid-latitude weather systems, are generated through baroclinic instability on the large scale but are characteristically "sharp" features in which temperature and winds can vary rapidly on the short scale. The Eady model of baroclinic instability, Eady (1949), captures the important aspects of the frontogenesis process in an idealised system. Discontinuous solutions arise in finite time from an initially smooth, large scale flow. Long term solutions have been shown using the semigeostrophic equations and a fully Lagrangian model, Cullen (2007), which exhibit multiple lifecycles after the initial frontogenesis. Previous Eulerian solutions have relied on the addition of explicit viscosity to continue past the point at which the front collapses down to the scale of the grid spacing, e.g. Snyder et al. (1993), Nakamura (1994), but the artificial diffusion renders the subsequent lifecycles much less pronounced. We present a framework for evaluating model error in terms of asymptotic convergence using the Eady model; by rescaling in one spatial dimension we are able to approach solutions of a balanced model, given by the semigeostrophic equations, using the non-hydrostatic, incompressible Euler-Boussinesq Eady equations. Using this approach we are able to validate the numerical implementation and assess the long term performance in terms of solution lifecycles. We present results using a finite difference method with semi-implicit time-stepping and semi-Lagrangian transport, and show that without any explicit viscosity we are able to proceed past the point of frontal collapse and recover the theoretical convergence rate. We propose that artificial diffusion of potential vorticity after collapse is detrimental to the long term evolution of the solution.

  4. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    NASA Astrophysics Data System (ADS)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities in collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  5. The Melanoma MAICare Framework: A Microsimulation Model for the Assessment of Individualized Cancer Care

    PubMed Central

    van der Meijde, Elisabeth; van den Eertwegh, Alfons J. M.; Linn, Sabine C.; Meijer, Gerrit A.; Fijneman, Remond J. A.; Coupé, Veerle M. H.

    2016-01-01

    Recently, new but expensive treatments have become available for metastatic melanoma. These improve survival, but in view of the limited funds available, cost-effectiveness needs to be evaluated. Most cancer cost-effectiveness models are based on observed clinical events such as recurrence-free and overall survival. Times at which events are recorded depend not only on the effectiveness of treatment but also on the timing of examinations and the types of tests performed. Our objective was to construct a microsimulation model framework that describes the melanoma disease process using a description of underlying tumor growth as well as its interaction with diagnostics, treatments, and surveillance. The framework should allow for exploration of the impact of simultaneously altering curative treatment approaches in different phases of the disease as well as altering diagnostics. The developed framework consists of two components, namely, the disease model and the clinical management module. The disease model consists of a tumor level, describing growth and metastasis of the tumor, and a patient level, describing clinically observed states, such as recurrence and death. The clinical management module consists of the care patients receive. This module interacts with the disease process, influencing the rate of transition between tumor growth states at the tumor level and the rate of detecting a recurrence at the patient level. We describe the framework as well as the required input and the model output. Furthermore, we illustrate model calibration using registry data and data from the literature. PMID:27346945

  6. Modeling overland flow-driven erosion across a watershed DEM using the Landlab modeling framework.

    NASA Astrophysics Data System (ADS)

    Adams, J. M.; Gasparini, N. M.; Tucker, G. E.; Hobley, D. E. J.; Hutton, E. W. H.; Nudurupati, S. S.; Istanbulluoglu, E.

    2015-12-01

    Many traditional landscape evolution models assume steady-state hydrology when computing discharge, and generally route flow in a single direction, along the path of steepest descent. Previous work has demonstrated that, for larger watersheds or short-duration storms, hydrologic steady-state may not be achieved. In semiarid regions, often dominated by convective summertime storms, landscapes are likely heavily influenced by these short-duration but high-intensity periods of rainfall. To capture these geomorphically significant bursts of rain, a new overland flow method has been implemented in the Landlab modeling framework. This overland flow method routes a hydrograph across a landscape, and allows flow to travel in multiple directions out of a given grid node. This study compares traditional steady-state flow routing and incision methods to the new, hydrograph-driven overland flow and erosion model in Landlab. We propose that for short-duration, high-intensity precipitation events, steady-state, single-direction flow routing models will significantly overestimate discharge and erosion when compared with non-steady, multiple flow direction model solutions. To test this hypothesis, discharge and erosion are modeled using both steady-state and hydrograph methods. A stochastic storm generator is used to generate short-duration, high-intensity precipitation intervals, which drive modeled discharge and erosion across a watershed imported from a digital elevation model, highlighting Landlab's robust raster-gridding library and watershed modeling capabilities. For each storm event in this analysis, peak discharge at the outlet, incision rate at the outlet, as well as total discharge and erosion depth are compared between methods. Additionally, these results are organized by storm duration and intensity to understand how erosion rates scale with precipitation between both flow routing methods. Results show that in many cases traditional steady-state methods overestimate
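The steady-state-versus-hydrograph contrast drawn above can be illustrated with a toy linear-reservoir routing (this is not Landlab's overland-flow component; the rainfall rate, basin area, and response time below are invented): for a storm much shorter than the basin response time, the routed peak discharge stays far below the steady-state value.

```python
import math

def steady_state_q(rain_rate, area):
    """Equilibrium discharge if all rainfall exits instantly (m^3/s)."""
    return rain_rate * area

def linear_reservoir_peak(rain_rate, area, k, duration, dt=1.0):
    """Peak outflow of a linear reservoir dS/dt = i*A - S/k (explicit
    Euler, starting empty), run through the recession limb.  For storms
    much shorter than the response time k, the peak stays well below the
    steady-state discharge.  A toy routing scheme for illustration only."""
    storage, peak, t = 0.0, 0.0, 0.0
    while t < duration + 5.0 * k:
        inflow = rain_rate * area if t < duration else 0.0
        storage += (inflow - storage / k) * dt
        peak = max(peak, storage / k)
        t += dt
    return peak
```

For a 10-minute storm on a basin with a 1-hour response time, the routed peak is roughly 15% of the steady-state discharge, matching the analytic result i*A*(1 - exp(-duration/k)).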

  7. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation, in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation, together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
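The chance-constraint idea can be sketched as follows: for each candidate resource level, a terminating simulation is replicated many times, and the level is accepted only if the estimated success probability meets the constraint. The toy simulator and all parameter values below are hypothetical, not the paper's launch-vehicle model.

```python
import random

def estimated_success_rate(simulate, level, n_reps, rng):
    """Monte Carlo estimate of P(success | resource level)."""
    return sum(simulate(level, rng) for _ in range(n_reps)) / n_reps

def min_feasible_level(simulate, levels, p_target, n_reps=2000, seed=1):
    """Smallest resource level whose estimated success probability meets
    the chance constraint P(success) >= p_target.

    `simulate(level, rng)` is a user-supplied terminating simulation
    returning 1 (objective met) or 0.  An illustrative sketch of the
    chance-constraint formulation, not the paper's framework."""
    rng = random.Random(seed)
    for level in sorted(levels):
        if estimated_success_rate(simulate, level, n_reps, rng) >= p_target:
            return level
    return None   # no candidate level satisfies the constraint

def toy_simulation(level, rng):
    """Hypothetical system whose success probability is level / 10."""
    return 1 if rng.random() < level / 10.0 else 0
```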

  8. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  9. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  10. A framework to assess the realism of model structures using hydrological signatures

    NASA Astrophysics Data System (ADS)

    Euser, T.; Winsemius, H. C.; Hrachowitz, M.; Fenicia, F.; Uhlenbrook, S.; Savenije, H. H. G.

    2012-11-01

    The use of flexible hydrological model structures for hypothesis testing requires an objective and diagnostic method to identify whether a rainfall-runoff model structure is suitable for a certain catchment. To determine if a model structure is realistic, i.e. if it captures the relevant runoff processes, both performance and consistency are important. Performance describes the ability of a model structure to mimic a specific part of the hydrological behaviour in a specific catchment. This can be assessed based on evaluation criteria, such as the goodness of fit of specific hydrological signatures obtained from hydrological data. Consistency describes the ability of a model structure to adequately reproduce several hydrological signatures simultaneously, while using the same set of parameter values. In this paper we describe and demonstrate a new evaluation Framework for Assessing the Realism of Model structures (FARM). The evaluation framework tests for both performance and consistency using a principal component analysis on a range of evaluation criteria, all emphasizing different hydrological behaviour. The utility of this evaluation framework is demonstrated in a case study of two small headwater catchments (Maimai, New Zealand and Wollefsbach, Luxembourg). Eight different hydrological signatures and eleven model structures have been used for this study. The results suggest that some model structures may reveal the same degree of performance for selected evaluation criteria, while showing differences in consistency. The results also show that some model structures have a higher performance and consistency than others. The principal component analysis in combination with several hydrological signatures is shown to be useful to visualize the performance and consistency of a model structure for the study catchments. With this framework, performance and consistency can be tested to identify which model structures suit a catchment better than others.
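To make the two notions concrete, here is a deliberately simplified numerical proxy (mean fit for performance; one minus the spread across signatures for consistency) rather than the paper's principal component analysis; the scores below are invented:

```python
from statistics import mean, pstdev

def performance_and_consistency(scores):
    """scores: {model_structure: [goodness-of-fit per signature, 0..1]}.

    Performance is summarized as the mean fit across hydrological
    signatures, and consistency as one minus the spread across those
    signatures.  A simplified proxy for FARM's PCA-based evaluation,
    shown only to illustrate the two concepts."""
    return {name: {"performance": mean(s),
                   "consistency": 1.0 - pstdev(s)}
            for name, s in scores.items()}
```

A structure that fits all signatures moderately well can thus outrank one that fits a few signatures very well but others poorly, which is the distinction the framework draws.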

  11. Building a Framework Earthquake Cycle Deformational Model for Subduction Megathrust Zones: Integrating Observations with Numerical Models

    NASA Astrophysics Data System (ADS)

    Furlong, Kevin P.; Govers, Rob; Herman, Matthew

    2016-04-01

    last for decades after a major event (e.g. Alaska 1964). We have integrated the observed patterns of upper-plate displacements (and deformation) with models of subduction zone evolution that allow us to incorporate both the transient behavior associated with post-earthquake viscous re-equilibration and the underlying long term, relatively constant elastic strain accumulation. Modeling the earthquake cycle through the use of a visco-elastic numerical model over numerous earthquake cycles, we have developed a framework model for the megathrust cycle that is constrained by observations made at a variety of plate boundary zones at different stages in their earthquake cycle (see paper by Govers et al., this meeting). Our results indicate that the observed patterns of co-, post-, and inter-seismic deformation are largely controlled by interplay between elastic and viscous processes. Observed displacements represent the competition between steady elastic-strain accumulation driven by plate boundary coupling, and post-earthquake viscous behavior in response to the coseismic loading of the system by the rapid elastic rebound. The application of this framework model to observations from subduction zone observatories points up the dangers of simply extrapolating current deformation observations to the overall strain accumulation state of the subduction zone, and allows us to develop improved assessments of the slip deficit accumulating within the seismogenic zone, and the near-future earthquake potential of different segments of the subduction plate boundary.

  12. Predicting the resilience and recovery of aquatic systems: A framework for model evolution within environmental observatories

    NASA Astrophysics Data System (ADS)

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z.; Read, Jordan S.; Ibelings, Bas W.; Valesini, Fiona J.; Brookes, Justin D.

    2015-09-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management; however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts, aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework is comprised of a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  13. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help parameter setting: Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular the framework provides a novel, flexible, implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098
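As a toy stand-in for the simplest of the eight model kinds, a first-order Markov chain can be trained with smoothing and used to score sequences as follows (generic code for illustration, not the ToPS API or its configuration language):

```python
import math
from collections import defaultdict

def train_markov(seqs, alphabet, pseudo=1.0):
    """Estimate first-order Markov transition probabilities from training
    sequences, with add-`pseudo` smoothing so unseen transitions keep
    nonzero probability."""
    counts = defaultdict(float)
    for s in seqs:
        for a, b in zip(s, s[1:]):
            counts[(a, b)] += 1.0
    probs = {}
    for a in alphabet:
        total = sum(counts[(a, b)] for b in alphabet) + pseudo * len(alphabet)
        probs[a] = {b: (counts[(a, b)] + pseudo) / total for b in alphabet}
    return probs

def log_likelihood(seq, probs):
    """Log-probability of the transitions in seq (initial symbol ignored),
    the quantity compared across models in a Bayesian classifier."""
    return sum(math.log(probs[a][b]) for a, b in zip(seq, seq[1:]))
```

Two such models, trained for example on CpG-island and background sequences, could then be compared by their log-likelihoods to classify a new sequence.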

  14. Quantification of a Framework to Assess the Realism of Model structures (FARM)

    NASA Astrophysics Data System (ADS)

    Euser, Tanja; Hrachowitz, Markus; Winsemius, Hessel; Savenije, Huub

    2013-04-01

    The use of flexible hydrological model structures for hypothesis testing requires an objective and diagnostic method to identify whether a rainfall-runoff model structure is suitable for a certain catchment. To determine whether a model structure is realistic, i.e. whether it captures the relevant runoff processes, both performance and consistency are important. Performance describes the ability of a model structure to mimic a specific part of the hydrological behaviour in a specific catchment. Consistency describes the ability of a model structure to adequately reproduce several hydrological signatures simultaneously. The FARM framework can be used to evaluate this performance and consistency, using different hydrological signatures. Results from FARM presented previously, based on Principal Component Analysis with two principal components, are only qualitative and cover a limited number of hydrological signatures. The research questions of this study are therefore (1) how can the results from FARM be quantified to provide a more objective framework? and (2) how does the use of different hydrological signatures influence the usefulness of FARM? A case study is performed in the Ourthe catchment, a tributary of the Meuse. Different quantification options are compared, such as determining consistency based on Principal Component Analysis with more than two dimensions, and summing the loadings of the different principal components weighted by their explained variance. The effect of adding different signatures to the framework is also tested. These adaptations of the FARM framework can help to make it more objective and therefore more useful.
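One quantification option mentioned above, summing principal-component loadings weighted by the variance each component explains, can be sketched as follows. This is an illustrative reading of that idea, not code from the FARM study: the signature scores are made up, and the PCA is done directly via SVD.

```python
import numpy as np

# Hypothetical matrix: rows = candidate model structures, columns =
# hydrological signatures (e.g. flow-duration slope, baseflow index,
# peak timing, recession rate). Entries are performance scores.
scores = np.array([
    [0.82, 0.75, 0.60, 0.71],
    [0.64, 0.80, 0.77, 0.69],
    [0.55, 0.50, 0.48, 0.52],
])

# PCA via SVD of the column-centred matrix.
X = scores - scores.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component
loadings = U * s                  # each structure's score on each PC

# Consistency measure: sum |loadings| over ALL components (not just the
# first two), each weighted by the variance it explains.
consistency = np.abs(loadings) @ explained
print(consistency)
```

Using all components weighted by explained variance avoids the arbitrary cut at two principal components while still emphasizing the dominant modes of variation.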

  15. Alternative Model-Based and Design-Based Frameworks for Inference from Samples to Populations: From Polarization to Integration

    ERIC Educational Resources Information Center

    Sterba, Sonya K.

    2009-01-01

    A model-based framework, due originally to R. A. Fisher, and a design-based framework, due originally to J. Neyman, offer alternative mechanisms for inference from samples to populations. We show how these frameworks can utilize different types of samples (nonrandom or random vs. only random) and allow different kinds of inference (descriptive vs.…

  16. A structured continuum modelling framework for martensitic transformation and reorientation in shape memory materials.

    PubMed

    Bernardini, Davide; Pence, Thomas J

    2016-04-28

    Models for shape memory material behaviour can be posed in the framework of a structured continuum theory. We study such a framework in which a scalar phase fraction field and a tensor field of martensite reorientation describe the material microstructure, in the context of finite strains. Gradients of the microstructural descriptors naturally enter the formulation and offer the possibility to describe and resolve phase transformation localizations. The constitutive theory is thoroughly described by a single free energy function in conjunction with a path-dependent dissipation function. Balance laws in the form of differential equations are obtained and contain both bulk and surface terms, the latter in terms of microstresses. A natural constraint on the tensor field for martensite reorientation gives rise to reactive fields in these balance laws. Conditions ensuring objectivity as well as the relation of this framework to that provided by currently used models for shape memory alloy behaviour are discussed. PMID:27002064

  17. Framework for non-coherent interface models at finite displacement jumps and finite strains

    NASA Astrophysics Data System (ADS)

    Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn

    2016-05-01

    This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, anisotropic effects are excluded. A novel, extended constitutive framework which is consistent with the above-mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration, or equivalently, the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.

  18. A new multiscale routing framework and its evaluation for land surface modeling applications

    NASA Astrophysics Data System (ADS)

    Wen, Zhiqun; Liang, Xu; Yang, Shengtian

    2012-08-01

    A new multiscale routing framework is developed and coupled with the Hydrologically based Three-layer Variable Infiltration Capacity (VIC-3L) land surface model (LSM). This new routing framework is designed to reduce the impact of different spatial and temporal scales on the routing results. It has been applied to three different river basins at six different spatial resolutions and two different temporal resolutions. The results have also been compared to a D8-based routing scheme, whose flow network is generated with the widely used eight-direction (D8) method, to evaluate the new framework's capability of reducing the impacts of spatial and temporal resolutions on the routing results. Results from the new routing framework are significantly less affected by the spatial resolution than those from the D8-based routing scheme. Compared at the basins' outlets with the instantaneous unit hydrograph (IUH) method, which has, in principle, the least spatial-resolution impact on the routing results, the new routing framework provides similar results. However, the new routing framework has an advantage over the IUH method: it provides routing information at interior locations of a basin and along the river channels, which the IUH method cannot. The new routing framework also reduces the impacts of different temporal resolutions on the routing results. The problem of spiky hydrographs produced by a typical routing method as a result of different temporal resolutions can be significantly reduced.
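For reference, the D8 method that underlies the baseline routing scheme assigns each grid cell's flow to the steepest-descent neighbour among its eight adjacent cells. A minimal sketch, with an illustrative 3x3 DEM (values invented, not from the paper):

```python
import numpy as np

def d8_direction(dem, row, col):
    """Return (drow, dcol) of the steepest downslope neighbour among the
    8 adjacent cells, or None if the cell is a pit (no lower neighbour).
    Diagonal neighbours use a distance of sqrt(2)."""
    best, best_slope = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]:
                dist = (2 ** 0.5) if dr and dc else 1.0
                slope = (dem[row, col] - dem[r, c]) / dist
                if slope > best_slope:
                    best, best_slope = (dr, dc), slope
    return best

dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 4.0],
                [7.0, 5.0, 2.0]])
print(d8_direction(dem, 1, 1))  # centre cell drains toward the low corner
```

Because D8 forces all flow from a cell into a single neighbour, the resulting network (and hence the routed hydrograph) changes with grid resolution, which is exactly the sensitivity the new framework aims to reduce.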

  19. Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning

    NASA Technical Reports Server (NTRS)

    Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael

    2011-01-01

    Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training, and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.

  20. Predicting the resilience and recovery of aquatic systems: a framework for model evolution within environmental observatories

    USGS Publications Warehouse

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C; Coletti, Janaine Z; Read, Jordan S.; Ibelings, Bas W; Valensini, Fiona J; Brookes, Justin D

    2015-01-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management; however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts, aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework comprises a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  1. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework
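The control functions (initialize, advance, finalize) and description functions (query variables, grid, time step) described above can be sketched as a toy interface. This sketch is loosely modelled on the CSDMS Basic Model Interface; the class name, the standard variable names, and the linear-reservoir equation are illustrative assumptions, not part of any framework's actual API:

```python
class RainfallRunoffModel:
    """A toy process model exposing a standardized, self-describing interface."""

    # --- control functions: make the model fully caller-driven ---
    def initialize(self, config=None):
        self.time, self.dt = 0.0, 1.0   # hours (illustrative units)
        self.storage = 0.0              # mm of stored water

    def update(self, precip_mm=0.0):
        # Toy linear reservoir: inflow minus a fixed-fraction outflow.
        self.storage += precip_mm - 0.1 * self.storage
        self.time += self.dt

    def finalize(self):
        pass  # release resources, close files, etc.

    # --- description functions: make the model self-describing ---
    def get_input_var_names(self):
        return ("atmosphere_water__precipitation_depth",)

    def get_output_var_names(self):
        return ("soil_water__depth",)

    def get_time_step(self):
        return self.dt

# A framework can now query and drive the model without reading its source:
m = RainfallRunoffModel()
m.initialize()
m.update(precip_mm=5.0)
print(m.get_output_var_names(), m.storage)
```

With every process model wrapped this way, a coupling framework can match one model's output variable names against another's input names and mediate units and grids automatically, which is precisely the plug-and-play behaviour the abstract describes.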

  2. Coupled model of INM-IO global ocean model, CICE sea ice model and SCM OIAS framework

    NASA Astrophysics Data System (ADS)

    Bayburin, Ruslan; Rashit, Ibrayev; Konstantin, Ushakov; Vladimir, Kalmykov; Gleb, Dyakonov

    2015-04-01

    The status of a coupled Arctic ocean-sea ice model is presented. The model consists of the high-resolution INM IO global ocean component, the Los Alamos National Laboratory CICE sea ice model, and the SCM OIAS framework for ocean-ice-atmosphere-land coupled modeling on massively parallel architectures. The model is currently under development at the Institute of Numerical Mathematics (INM), the Hydrometeorological Center (HMC), and the P.P. Shirshov Institute of Oceanology (IO). It is aimed at modeling the intra-annual variability of hydrodynamics in the Arctic. The computational characteristics of the world ocean-sea ice coupled model governed by SCM OIAS are presented. The model is parallelized using MPI technologies and can currently make efficient use of up to 5000 cores. Details of the programming implementation, the computational configuration, and the parametrization of physical phenomena are analyzed in terms of the intercoupling complex. Results of a five-year computational experiment on the evolution of sea ice, snow, and ocean state in the Arctic region, on a tripole grid with a horizontal resolution of 3-5 kilometers and driven by an atmospheric forcing field from a repeating "normal" annual course taken from the CORE1 experiment database, are presented and analyzed in terms of the state of vorticity and the expansion of warm Atlantic water.

  3. A common framework for the development and analysis of process-based hydrological models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Kavetski, Dmitri; Fenicia, Fabrizio; Gupta, Hoshin

    2013-04-01

    provide a common framework for model development and analysis. We recognize that the majority of process-based hydrological models use the same set of physics - most models use Darcy's Law to represent the flow of water through the soil matrix and Fourier's Law for thermodynamics. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including different methods to represent spatial variability and different parameterizations of surface fluxes and shallow groundwater. Our analysis isolates individual modeling decisions and uses orthogonal diagnostic signatures to evaluate model behavior. Application of this framework in research basins demonstrates that the combination of (1) flexibility in the numerical model and (2) comprehensive scrutiny of orthogonal signatures provides a powerful approach to identify the suitability of different modeling options and different model parameter values. We contend that this common framework has general utility, and its widespread application in both research basins and at larger spatial scales will help accelerate the development of process-based hydrologic models.

  4. Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models

    ERIC Educational Resources Information Center

    Dawson, Phillip

    2014-01-01

    More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…

  5. A Supervisory Issue When Utilizing the ASCA National Model Framework in School Counseling

    ERIC Educational Resources Information Center

    Bryant-Young, Necole; Bell, Catherine A.; Davis, Kalena M.

    2014-01-01

    The authors discuss a supervisory issue, in that the ASCA National Model: A Framework for School Counseling Programs does not emphasize ongoing supervision in which the ethical expectations of supervisors and supervisees in a school setting are clearly defined. Subsequently, the authors highlight supervisor expectations stated within the ASCA National…

  6. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  7. Argumentation, Dialogue Theory, and Probability Modeling: Alternative Frameworks for Argumentation Research in Education

    ERIC Educational Resources Information Center

    Nussbaum, E. Michael

    2011-01-01

    Toulmin's model of argumentation, developed in 1958, has guided much argumentation research in education. However, argumentation theory in philosophy and cognitive science has advanced considerably since 1958. There are currently several alternative frameworks of argumentation that can be useful for both research and practice in education. These…

  8. A Model Driven Framework to Address Challenges in a Mobile Learning Environment

    ERIC Educational Resources Information Center

    Khaddage, Ferial; Christensen, Rhonda; Lai, Wing; Knezek, Gerald; Norris, Cathie; Soloway, Elliot

    2015-01-01

    In this paper a review of the pedagogical, technological, policy and research challenges and concepts underlying mobile learning is presented, followed by a brief description of categories of implementations. A model Mobile learning framework and dynamic criteria for mobile learning implementations are proposed, along with a case study of one site…

  9. A Quality Framework for Continuous Improvement of e-Learning: The e-Learning Maturity Model

    ERIC Educational Resources Information Center

    Marshall, Stephen

    2010-01-01

    The E-Learning Maturity Model (eMM) is a quality improvement framework designed to help institutional leaders assess their institution's e-learning maturity. This paper reviews the eMM, drawing on examples of assessments conducted in New Zealand, Australia, the UK and the USA to show how it helps institutional leaders assess and compare their…

  10. The 3C3R Model: A Conceptual Framework for Designing Problems in PBL

    ERIC Educational Resources Information Center

    Hung, Woei

    2006-01-01

    Well-designed problems are crucial for the success of problem-based learning (PBL). Previous discussions about designing problems for PBL have been rather general and inadequate in guiding educators and practitioners to design effective PBL problems. This paper introduces the 3C3R PBL problem design model as a conceptual framework for…

  11. Map Resource Packet: Course Models for the History-Social Science Framework, Grade Seven.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    This packet of maps is an auxiliary resource to the "World History and Geography: Medieval and Early Modern Times. Course Models for the History-Social Science Framework, Grade Seven." The set includes: outline, precipitation, and elevation maps; maps for locating key places; landform maps; and historical maps. The list of maps are grouped under…

  12. Description of light charged particle multiplicities in the framework of dinuclear system model

    NASA Astrophysics Data System (ADS)

    Kalandarov, Sh. A.; Adamian, G. G.; Antonenko, N. V.

    2012-12-01

    In the framework of the dinuclear system (DNS) model we calculate the multiplicities of light charged particles (LCPs) produced in fusion and quasifission reactions and their kinetic energy spectra. Calculations indicate that with increasing bombarding energy the ratio of the LCP multiplicity from fragments, M_FF, to the corresponding LCP multiplicity from the compound nucleus (CN), M_CN, strongly increases.

  13. Instructional Dissent in the College Classroom: Using the Instructional Beliefs Model as a Framework

    ERIC Educational Resources Information Center

    LaBelle, Sara; Martin, Matthew M.; Weber, Keith

    2013-01-01

    We examined the impact of instructor characteristics and student beliefs on students' decisions to enact instructional dissent using the Instructional Beliefs Model (IBM) as a framework. Students (N = 244) completed survey questionnaires assessing their perceptions of instructors' clarity, nonverbal immediacy, and affirming style, as well as their…

  14. PIRPOSAL Model of Integrative STEM Education: Conceptual and Pedagogical Framework for Classroom Implementation

    ERIC Educational Resources Information Center

    Wells, John G.

    2016-01-01

    The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…

  15. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  16. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    EPA Science Inventory

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  17. A Study of the Model of Mastery as a Theoretical Framework for Coaching Teachers Writing Workshop

    ERIC Educational Resources Information Center

    Kimbrell, Jennifer L.

    2010-01-01

    The study investigated a coach's use of a theoretical framework called the Model of Mastery to assist three teachers in becoming self-regulated in the teaching of writing workshop by moving them through three settings: acquisition, consolidation, and consultation. The goal of the coach was to assist teachers in developing expertise in procedural,…

  18. Adapting to Students' Social and Health Needs: Suggested Framework for Building Inclusive Models of Practice

    ERIC Educational Resources Information Center

    Schwitzer, Alan M.

    2009-01-01

    Objective: This article builds on earlier discussions about college health research. The author suggests a 5-step framework that research practitioners can use to build models of practice that accurately address the needs of diverse campus populations. Methods: The author provides 3 illustrations, drawn from published research examining college…

  19. MODELING FRAMEWORK FOR EVALUATING SEDIMENTATION IN STREAM NETWORKS: FOR USE IN SEDIMENT TMDL ANALYSIS

    EPA Science Inventory

    A modeling framework that can be used to evaluate sedimentation in stream networks is described. This methodology can be used to determine sediment Total Maximum Daily Loads (TMDLs) in sediment impaired waters, and provide the necessary hydrodynamic and sediment-related data t...

  20. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    SciTech Connect

    Ackerman, Thomas P.

    2015-03-01

    The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a two-moment microphysics scheme rather than the single-moment scheme used in all the MMF runs to date. The technical report and associated documents describe the results of testing the cloud-resolving model with fixed boundary conditions and the evaluation of model results against data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  1. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth curve models when change…

  2. A New Object-Oriented MODFLOW Framework for Coupling Multiple Hydrologic Models

    NASA Astrophysics Data System (ADS)

    Langevin, C.; Hughes, J. D.; Panday, S. M.; Banta, E. R.; Niswonger, R. G.

    2014-12-01

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. For 30 years, the MODFLOW program has been widely used by academic researchers, private consultants, and government scientists to accurately, reliably, and efficiently simulate groundwater flow. With time, growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion, has led to the development of numerous MODFLOW versions. Although these MODFLOW versions are often based on the core version (presently MODFLOW-2005), there are often incompatibilities that restrict their use with one another. In many cases, development of these alternative versions has been challenging due to the underlying MODFLOW structure, which was designed for simulation with a single groundwater flow model using a rectilinear grid. A new object-oriented framework is being developed for MODFLOW to provide a platform for supporting multiple models and multiple types of models within the same simulation. In the new design, any number of numerical models can be tightly coupled at the matrix level by adding them to the same numerical solution, or they can be iteratively coupled until there is convergence between them. Transfer of information between models is isolated to exchange objects, which allow models to be developed and used independently. For existing MODFLOW users, this means that the program can function in the same way it always has for a single groundwater flow model. Within this new framework, a regional-scale groundwater model may be coupled with multiple local-scale groundwater models. Or, a surface water flow model can be coupled to multiple groundwater flow models. The framework naturally allows for the simulation of solute transport. Presently, unstructured control-volume finite-difference models have been implemented in the framework for three-dimensional groundwater
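The iterative coupling through exchange objects described above can be sketched abstractly. The two "models" below are hypothetical scalar relations (a regional head responding to a boundary flux, and a local flux depending on that head), not MODFLOW code; the point is only the pattern of isolating transferred information in an exchange object and iterating until the models agree:

```python
class Exchange:
    """Carries a single shared quantity (e.g. a boundary flux) between
    two independently developed models."""
    def __init__(self):
        self.value = 0.0

def regional_model(exchange):
    """Toy regional model: head falls as the exchanged flux grows."""
    return 10.0 - 0.5 * exchange.value

def local_model(head):
    """Toy local model: flux proportional to the regional head."""
    return 0.2 * head

ex = Exchange()
head = regional_model(ex)
for iteration in range(50):
    head = regional_model(ex)
    new_flux = local_model(head)
    if abs(new_flux - ex.value) < 1e-10:  # models agree: converged
        break
    ex.value = new_flux
print(round(head, 6), round(ex.value, 6))
```

Because each model only ever sees the exchange object, either one can be replaced or refined independently, which is the code-isolation benefit the new MODFLOW framework is after; tight matrix-level coupling would instead assemble both models into one numerical solution.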

  3. TLS and photogrammetry for the modeling of a historic wooden framework

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, M.

    2012-04-01

    The building that is the object of this study is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the XVIth century until the French Revolution. Its architecture represents the Renaissance style of the XVIth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005, the municipality of Andlau became the owner of this Seigneury, which is intended to house the future Heritage Interpretation Center (HIC); a museum is also going to be created there. Three attic levels of this building are going to be restored and insulated; the historic timber framework will thereby be masked, and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework in order to allow diagnoses to be made, and to build up and consolidate knowledge of this type of historic framework. Finally, beyond a virtual visualization, we provided other applications, in particular an accurate 3D model of the framework for animations, as a foundation for a historical information system, and for supplying the future museum and HIC with digital data. The project contains different phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration into a historic information system. All levels of the attic were acquired: a Trimble GX 3D scanner and, in part, a Trimble CX scanner were used, in particular for the acquisition of data in the highest part of the framework. The various scans were georeferenced directly in the field using control points, then merged into a single point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury. The purpose of the project was to supply a 3D model allowing the creation of scenographies and interactive

  4. A multi-scale modeling framework for instabilities of film/substrate systems

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Potier-Ferry, Michel

    2016-01-01

    Spatial pattern formation in stiff thin films on soft substrates is investigated from a multi-scale point of view based on a technique of slowly varying Fourier coefficients. A general macroscopic modeling framework is developed and then a simplified macroscopic model is derived. The model incorporates the Asymptotic Numerical Method (ANM) as a robust path-following technique to trace the post-buckling evolution path and to predict secondary bifurcations. The proposed multi-scale finite element framework allows sinusoidal and square checkerboard patterns as well as their bifurcation portraits to be described from a quantitative standpoint. Moreover, it provides an efficient way to compute large-scale instability problems with a significant reduction of computational cost compared to full models.

  5. An approximate framework for quantum transport calculation with model order reduction

    SciTech Connect

    Chen, Quan; Li, Jun; Yam, Chiyung; Zhang, Yu; Wong, Ngai; Chen, Guanhua

    2015-04-01

    A new approximate computational framework is proposed for computing the non-equilibrium charge density in the context of the non-equilibrium Green's function (NEGF) method for quantum mechanical transport problems. The framework consists of a new formulation, called the X-formulation, for single-energy density calculation based on the solution of sparse linear systems, and a projection-based nonlinear model order reduction (MOR) approach to address the large number of energy points required for large applied biases. The advantages of the new methods are confirmed by numerical experiments.
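Projection-based model order reduction of the kind mentioned above can be sketched for a generic shifted linear system. This is an illustrative Galerkin projection, not the paper's X-formulation: the matrix, the sample "energies", and the query point are all made up, and snapshots at a few energies stand in for the many energy points a NEGF bias sweep would require.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Full-order system: A(e) x = b with A(e) = A + e*I (illustrative).
A = np.diag(np.linspace(1.0, 5.0, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

# Full-order "snapshot" solutions at a few sample energies, then an
# orthonormal projection basis V via QR.
snapshots = np.column_stack([np.linalg.solve(A + e * np.eye(n), b)
                             for e in (0.0, 0.5, 1.0)])
V, _ = np.linalg.qr(snapshots)

# Reduced (Galerkin) system at a new energy point: a tiny 3x3 solve
# instead of a 200x200 one.
e_new = 0.25
Ar = V.T @ (A + e_new * np.eye(n)) @ V
xr = V @ np.linalg.solve(Ar, V.T @ b)

# Compare against the full-order solution at the same energy.
x_full = np.linalg.solve(A + e_new * np.eye(n), b)
print(np.linalg.norm(xr - x_full) / np.linalg.norm(x_full))
```

The payoff is the same as in the abstract: a handful of expensive full-order solves builds a subspace in which the remaining energy points are cheap, low-dimensional solves.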

  6. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot a first investigation of how changing framework variables alters perceptions of the integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  7. Representing natural and manmade drainage systems in an earth system modeling framework

    SciTech Connect

    Li, Hongyi; Wu, Huan; Huang, Maoyi; Leung, Lai-Yung R.

    2012-08-27

    Drainage systems can be categorized into natural or geomorphological drainage systems, agricultural drainage systems and urban drainage systems. They interact closely among themselves and with climate and human society, particularly under extreme climate and hydrological events such as floods. This editorial articulates the need to holistically understand and model drainage systems in the context of climate change and human influence, and discusses the requirements and examples of feasible approaches to representing natural and manmade drainage systems in an earth system modeling framework.

  8. On Developing a Conceptual Modeling Framework for Nitrate Transport in the Subsurface

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Mohanty, B. P.

    2012-12-01

    Nitrate is the most ubiquitous contaminant in groundwater. Once nitrate enters the subsurface environment, it is subjected to a variety of coupled hydrological, geochemical, and biological processes. There is significant uncertainty associated with geochemical and microbiological processes due to a lack of easily available data and the reactive heterogeneity of subsurface systems. Since most hydrologic analyses focus exclusively on the optimization of model parameters and ignore inadequate model structure (structural uncertainty), we present a conceptual framework that incorporates different model structures for complex biogeochemical processes. We simulate nitrate transport using a conceptual modeling framework in which physical processes (e.g., advection, dispersion) are modeled as deterministic partial differential equations, while biochemical processes (e.g., nitrification) are modeled as stochastic differential equations. We focus here on capturing the influence of biochemical processes, under deterministic hydrological feedbacks, on nitrate transport in a 1-D soil column. We also provide an understanding of nitrate dynamics under perturbed conditions of soil temperature and pH. Results demonstrate that the predictions of ammonium, nitrite, and nitrate by the conceptual modeling framework are in agreement with the analytical solution. Moreover, the conceptual model provides a broader view of the integrated system behavior, as it simulates biochemical processes in a stochastic framework. Uncertainty analysis shows that there is higher uncertainty in predicting ammonium concentrations in the soil column than nitrate and nitrite concentrations. Soil temperature variations cause nitrification rates to vary along the soil profile; consequently, nitrate concentrations arrive earlier at greater depths, and ammonium concentrations are smaller along the soil profile. In addition, soil pH variations cause ammonium concentrations to reach deeper into the column.
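
    The deterministic-transport-plus-stochastic-reaction split described above can be sketched in a few lines (an illustrative toy, not the authors' model; all parameter values are hypothetical):

```python
import numpy as np

# Illustrative sketch only (not the paper's model): deterministic upwind
# advection of ammonium down a 1-D soil column, coupled to a first-order
# nitrification rate treated as a stochastic (Euler-Maruyama) process.
# All parameter values are hypothetical.

rng = np.random.default_rng(1)
nz, dz = 50, 0.02                 # cells, cell size (m)
dt, nt = 10.0, 500                # time step (s), number of steps
v = 1e-5                          # pore-water velocity (m/s)
k_mean, sigma, theta = 2e-5, 5e-6, 1e-3   # rate mean (1/s), noise, reversion

nh4 = np.zeros(nz); nh4[0] = 1.0  # unit ammonium pulse at the surface
no3 = np.zeros(nz)
k = np.full(nz, k_mean)

for _ in range(nt):
    # deterministic part: explicit upwind advection
    flux = v * nh4 / dz
    nh4[1:] += dt * (flux[:-1] - flux[1:])
    nh4[0] -= dt * flux[0]
    # stochastic part: mean-reverting rate, Euler-Maruyama step, kept non-negative
    k += dt * theta * (k_mean - k) + sigma * np.sqrt(dt * theta) * rng.standard_normal(nz)
    k = np.clip(k, 0.0, None)
    # first-order transformation NH4 -> NO3, capped so mass stays positive
    react = np.minimum(k * nh4 * dt, nh4)
    nh4 -= react
    no3 += react
```

    Running such a column many times with independent noise realizations would give the kind of predictive uncertainty bands the abstract discusses.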

  9. A Framework for Modeling Workflow Execution by an Interdisciplinary Healthcare Team.

    PubMed

    Kezadri-Hamiaz, Mounira; Rosu, Daniela; Wilk, Szymon; Kuziemsky, Craig; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The use of business workflow models in healthcare is limited because of insufficient capture of complexities associated with behavior of interdisciplinary healthcare teams that execute healthcare workflows. In this paper we present a novel framework that builds on the well-founded business workflow model formalism and related infrastructures and introduces a formal semantic layer that describes selected aspects of team dynamics and supports their real-time operationalization. PMID:26262399

  10. Development of an integrated modelling framework: comparing client-server and demand-driven control flow for model execution

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Karssenberg, Derek; de Jong, Kor; de Kok, Jean-Luc; de Jong, Steven M.

    2014-05-01

    The construction of hydrological models at the catchment or global scale depends on the integration of component models representing various environmental processes, often operating at different spatial and temporal discretisations. A flexible construction of spatio-temporal model components, a means to specify aggregation or disaggregation to bridge discretisation discrepancies, ease of coupling these into complex integrated models, and support for stochastic modelling and the assessment of model outputs are the desired functionalities for the development of integrated models. These functionalities are preferably combined into one modelling framework such that domain specialists can perform exploratory model development without the need to change their working environment. We implemented an integrated modelling framework in the Python programming language, providing support for 1) model construction and 2) model execution. The framework enables modellers to represent spatio-temporal processes or to specify spatio-temporal (dis)aggregation with map algebra operations provided by the PCRaster library. Model algebra operations can be used by the modeller to specify the exchange of data and therefore the coupling of components. The framework determines the control flow for the ordered execution based on the time steps and couplings of the model components given by the modeller. We implemented two different control flow mechanisms. First, a client-server approach is used with a central entity controlling the execution of the component models and steering the data exchange. Second, a demand-driven approach is used that triggers the execution of a component model when data is requested by a coupled component model. We show that both control flow mechanisms allow for the execution of stochastic, multi-scale integrated models. 
We examine the implications of each control flow mechanism on the terminology used by the modeller to specify integrated models, and illustrate the
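
    The two control-flow mechanisms can be contrasted in a minimal sketch (component names and process equations are invented; this is not PCRaster framework code):

```python
# Sketch contrasting the two control-flow mechanisms (component names and
# process equations are invented; this is not PCRaster framework code).

class Snow:
    def run(self, t):
        return {"melt": 1.0 + 0.1 * t}          # stand-in snowmelt process

class Runoff:
    def run(self, t, melt):
        return {"q": 0.8 * melt}                # stand-in runoff process

# 1) Client-server: a central driver orders execution and moves the data.
def run_client_server(steps):
    snow, runoff = Snow(), Runoff()
    out = []
    for t in range(steps):
        data = snow.run(t)                      # driver triggers the supplier
        out.append(runoff.run(t, data["melt"])["q"])
    return out

# 2) Demand-driven: a component pulls its inputs, which triggers suppliers.
class DemandRunoff:
    def __init__(self, snow):
        self.snow = snow
    def run(self, t):
        melt = self.snow.run(t)["melt"]         # request triggers execution
        return {"q": 0.8 * melt}

def run_demand_driven(steps):
    comp = DemandRunoff(Snow())
    return [comp.run(t)["q"] for t in range(steps)]
```

    Both styles produce identical results for this simple coupling; they differ in where the execution order is decided: in a central driver, or implicitly through data requests.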

  11. A Modeling Framework to Quantify Dilution Enhancement in Spatially Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe; Fiori, Aldo; Boso, Francesca; Bellin, Alberto

    2016-04-01

    Solute dilution rates are strongly affected by spatial fluctuations of the permeability. Current challenges consist of establishing a quantitative link between the statistical properties of the heterogeneous porous medium and the concentration field. Proper quantification of solute dilution is crucial for the success of a remediation campaign and for risk assessment. In this work, we provide a modeling framework to quantify the dilution of a non-reactive solute. More precisely, we model the heterogeneity-induced dilution enhancement within a steady-state flow. Adopting the Lagrangian framework, we obtain semi-analytical solutions for the dilution index as a function of the structural parameters characterizing the permeability field. The solutions provided are valid for uniform-in-the-mean steady flow fields, a small injection source, and weak-to-mild heterogeneity in the log-permeability. Results show how the dilution enhancement of the solute plume depends on the statistical anisotropy ratio and the heterogeneity level of the porous medium. The modeling framework also captures the temporal evolution of the dilution rate at distinct time regimes, thus recovering previous results from the literature. Finally, the performance of the framework is verified with high-resolution numerical results and successfully tested against the Cape Cod field data.
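
    The dilution index itself (in the spirit of the Kitanidis entropy-based definition) is straightforward to compute from a concentration field; the sketch below uses an invented 1-D Gaussian plume, where a wider, more dilute plume gives a larger index:

```python
import numpy as np

# Sketch of a dilution-index computation (in the spirit of the Kitanidis
# dilution index) on an invented 1-D Gaussian plume: a more spread-out,
# more dilute plume yields a larger index.

def dilution_index(c, dx):
    """E = exp(-sum p ln p dx) with p = c / integral(c); units of length in 1-D."""
    p = c / (np.sum(c) * dx)
    mask = p > 0
    return np.exp(-np.sum(p[mask] * np.log(p[mask])) * dx)

x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def gaussian_plume(sigma):
    return np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

early = dilution_index(gaussian_plume(0.5), dx)   # compact early-time plume
late = dilution_index(gaussian_plume(2.0), dx)    # spread late-time plume
```

    For a Gaussian profile the index evaluates analytically to sigma·sqrt(2πe), so it grows linearly with the plume spread, which is the behavior the semi-analytical solutions in the abstract quantify for heterogeneous media.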

  12. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    SciTech Connect

    Hadjidoukas, P.E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  13. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  14. Discrete Element Framework for Modelling Extracellular Matrix, Deformable Cells and Subcellular Components

    PubMed Central

    Gardiner, Bruce S.; Wong, Kelvin K. L.; Joldes, Grand R.; Rich, Addison J.; Tan, Chin Wee; Burgess, Antony W.; Smith, David W.

    2015-01-01

    This paper presents a framework for modelling biological tissues based on discrete particles. Cell components (e.g. cell membranes, cell cytoskeleton, cell nucleus) and extracellular matrix (e.g. collagen) are represented using collections of particles. Simple particle to particle interaction laws are used to simulate and control complex physical interaction types (e.g. cell-cell adhesion via cadherins, integrin basement membrane attachment, cytoskeletal mechanical properties). Particles may be given the capacity to change their properties and behaviours in response to changes in the cellular microenvironment (e.g., in response to cell-cell signalling or mechanical loadings). Each particle is in effect an ‘agent’, meaning that the agent can sense local environmental information and respond according to pre-determined or stochastic events. The behaviour of the proposed framework is exemplified through several biological problems of ongoing interest. These examples illustrate how the modelling framework allows enormous flexibility for representing the mechanical behaviour of different tissues, and we argue this is a more intuitive approach than perhaps offered by traditional continuum methods. Because of this flexibility, we believe the discrete modelling framework provides an avenue for biologists and bioengineers to explore the behaviour of tissue systems in a computational laboratory. PMID:26452000
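
    The core mechanism, simple pairwise interaction laws between particles, can be sketched in miniature (parameters invented, not from the paper): two particles joined by a linear spring-like law relax to their rest separation under damped integration.

```python
import numpy as np

# Toy sketch of the particle-interaction idea (parameters invented): two
# particles joined by a linear spring-like pair law relax to the rest
# separation under damped symplectic-Euler integration.

def pair_force(r, rest=1.0, k=10.0):
    """Signed force magnitude along the pair axis (negative = attractive)."""
    return -k * (r - rest)

pos = np.array([[0.0, 0.0], [1.5, 0.0]])   # start stretched past rest length
vel = np.zeros_like(pos)
dt, damping = 0.01, 0.5

for _ in range(2000):
    d = pos[1] - pos[0]
    r = np.linalg.norm(d)
    f = pair_force(r) * d / r              # force on particle 1 (opposite on 0)
    vel[0] -= dt * f
    vel[1] += dt * f
    vel *= 1.0 - damping * dt              # simple velocity damping
    pos += dt * vel                        # symplectic Euler position update

separation = np.linalg.norm(pos[1] - pos[0])
```

    A full framework of this kind would swap in different pair laws (adhesion, bending, attachment) per particle type and let particles change their laws in response to local signals, exactly the "agent" behavior described above.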

  15. An Automated Application Framework to Model Disordered Materials Based on a High Throughput First Principles Approach

    NASA Astrophysics Data System (ADS)

    Oses, Corey; Yang, Kesong; Curtarolo, Stefano; Duke Univ Collaboration; UC San Diego Collaboration

    Predicting material properties of disordered systems remains a long-standing and formidable challenge in rational materials design. To address this issue, we introduce an automated software framework capable of modeling partial occupation within disordered materials using a high-throughput (HT) first principles approach. At the heart of the approach is the construction of supercells containing a virtually equivalent stoichiometry to the disordered material. All unique supercell permutations are enumerated and material properties of each are determined via HT electronic structure calculations. In accordance with a canonical ensemble of supercell states, the framework evaluates ensemble average properties of the system as a function of temperature. As proof of concept, we examine the framework's final calculated properties of a zinc chalcogenide (ZnS1-xSex), a wide-gap oxide semiconductor (MgxZn1-xO), and an iron alloy (Fe1-xCux) at various stoichiometries.
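
    A toy version of the enumeration-and-ensemble-average step might look like the following (a 4-site periodic ring with translational symmetry standing in for the real space group, and an invented pair energy; "E" stands in for Se):

```python
import itertools, math, collections

# Toy sketch of the enumeration idea (not the actual HT framework): occupy
# a 4-site ring with S/Se ("S"/"E") to mimic ZnS0.5Se0.5, fold together
# symmetry-equivalent occupations (translations stand in for the space
# group), and Boltzmann-average an invented pair energy over the unique
# configurations.

sites = 4
configs = [c for c in itertools.product("SE", repeat=sites) if c.count("E") == 2]

def canonical(c):
    """Lexicographically smallest translation of a periodic occupation."""
    return min(tuple(c[(i + s) % sites] for i in range(sites)) for s in range(sites))

unique = collections.Counter(canonical(c) for c in configs)   # rep -> multiplicity

def energy(c):
    """Hypothetical energy: each adjacent Se-Se pair costs 0.1 eV."""
    return 0.1 * sum(c[i] == c[(i + 1) % sites] == "E" for i in range(sites))

kT = 0.025  # eV, roughly room temperature
weights = {c: m * math.exp(-energy(c) / kT) for c, m in unique.items()}
Z = sum(weights.values())
E_avg = sum(weights[c] * energy(c) for c in weights) / Z
```

    In the real framework each unique supercell gets a full electronic-structure calculation rather than a toy energy, but the canonical-ensemble weighting of unique configurations works the same way.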

  16. A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad D.; Robbins, Jeanne C.

    1999-01-01

    As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina

  17. A probabilistic model framework for evaluating year-to-year variation in crop productivity

    NASA Astrophysics Data System (ADS)

    Yokozawa, M.; Iizumi, T.; Tao, F.

    2008-12-01

    Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. To keep food supply stable under abnormal weather as well as climate change, evaluating the year-to-year variations in crop productivity, rather than the mean changes, is more essential. Here we propose a new probabilistic modeling framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three submodels describing phenological development, biomass accumulation and maturing of the rice crop. These processes are formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the paddy-plot scale. We applied it to evaluate large-scale rice production while keeping the same model structure. Instead, we treated the parameters as stochastic variables. To make the model reproduce actual yields at the larger scale, model parameters were determined from agricultural statistical data for each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the parameters included in the model were obtained using Bayesian inference. An MCMC (Markov chain Monte Carlo) algorithm was used to numerically evaluate Bayes' theorem. To evaluate the year-to-year changes in rice growth and yield under this framework, we first run an ensemble of simulations with parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted with the posterior PDFs. We will also present another example for maize productivity in China. The
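
    The infer-then-ensemble workflow described above can be sketched on a toy yield model (the one-parameter model, data, and all numbers below are invented for illustration and are not PRYSBI):

```python
import numpy as np

# Toy illustration of the inference-then-ensemble workflow (an invented
# one-parameter yield model, not PRYSBI): infer a temperature-sensitivity
# parameter by Metropolis MCMC, then average the simulator over the
# posterior samples to form an ensemble prediction.

rng = np.random.default_rng(2)

def simulate_yield(beta, temp):
    return 6.0 - beta * (temp - 22.0) ** 2      # toy yield response (t/ha)

temps = rng.uniform(18.0, 26.0, size=30)        # seasonal mean temperatures
obs = simulate_yield(0.05, temps) + 0.2 * rng.standard_normal(30)

def log_post(beta):                             # flat prior on beta > 0
    if beta <= 0:
        return -np.inf
    resid = obs - simulate_yield(beta, temps)
    return -0.5 * np.sum(resid ** 2) / 0.2 ** 2

beta, lp = 0.2, log_post(0.2)
samples = []
for _ in range(5000):                           # random-walk Metropolis
    prop = beta + 0.01 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    samples.append(beta)
post = np.array(samples[1000:])                 # discard burn-in

ensemble_yield = np.mean([simulate_yield(b, 24.0) for b in post])
```

    Averaging the simulator over posterior draws, rather than plugging in a single best-fit parameter, is what propagates parameter uncertainty into the year-to-year yield prediction.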

  18. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework

    PubMed Central

    Fletcher, Rachel E.; Wells, Stephen A.; Leung, Ka Ming; Edwards, Peter P.; Sartbaeva, Asel

    2015-01-01

    Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves – as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si—O—Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the ‘rigid unit mode’ (RUM) model, and in real-space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the ‘flexibility window’ phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework. PMID:26634720

  19. Making the distinction between water scarcity and drought using an observation-modeling framework

    NASA Astrophysics Data System (ADS)

    Loon, A. F.; Lanen, H. A. J.

    2013-03-01

    Drought and water scarcity are keywords for river basin management in water-stressed regions. "Drought" is a natural hazard, caused by large-scale climatic variability, and cannot be prevented by local water management. "Water scarcity" refers to the long-term unsustainable use of water resources, which water managers can influence. Making the distinction between drought and water scarcity is not trivial, because they often occur simultaneously. In this paper, we propose an observation-modeling framework to separate natural (drought) and human (water scarcity) effects on the hydrological system. The basis of the framework is simulation of the situation that would have occurred without human influence, the "naturalized" situation, using a hydrological model. The resulting time series of naturalized state variables and fluxes are then compared to observed time series. As a second, more important and novel step, anomalies (i.e., deviations from a threshold) are determined from both time series and compared. We demonstrate the use of the proposed observation-modeling framework in the Upper-Guadiana catchment in Spain. Application of the framework to the period 1980-2000 shows that the impact of groundwater abstraction on the hydrological system was, on average, four times as high as the impact of drought. Water scarcity resulted in disappearance of the winter high-flow period, even in relatively wet years, and a nonlinear response of groundwater. The proposed observation-modeling framework helps water managers in water-stressed regions to quantify the relative impact of drought and water scarcity on a transient basis and, consequently, to make decisions regarding adaptation to drought and combating water scarcity.
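
    The anomaly-separation step can be sketched with invented series (the discharge data, abstraction rate, and threshold below are all hypothetical, chosen only to illustrate the attribution logic):

```python
import numpy as np

# Sketch of the anomaly-separation idea with invented data: compare an
# observed discharge series with a modelled "naturalized" series; deficits
# of the naturalized series below a threshold are attributed to drought,
# and the additional observed deficit to water scarcity (human influence).

months = np.arange(120)
natural = 10 + 4 * np.sin(2 * np.pi * months / 12)   # naturalized flow
natural[60:72] -= 3.0                                # one climatically dry year
observed = natural - 2.5                             # constant abstraction
threshold = 7.0                                      # e.g. a low-flow percentile

drought_deficit = np.clip(threshold - natural, 0, None).sum()   # natural anomaly
total_deficit = np.clip(threshold - observed, 0, None).sum()    # observed anomaly
human_deficit = total_deficit - drought_deficit                 # water scarcity
```

    With these (invented) numbers the human-induced deficit dominates the natural one, the same qualitative pattern the abstract reports for the Upper-Guadiana.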

  20. A general science-based framework for dynamical spatio-temporal models

    USGS Publications Warehouse

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

    Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic
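
    The dynamical backbone the abstract describes can be made concrete with a sketch (notation assumed here for illustration, not copied from the paper): a linear first-order Markovian form, and a quadratic extension of the kind toward which the text is building.

```latex
% linear, first-order Markovian dynamics with additive error:
u_t(s_i) = \sum_{j} a_{ij}\, u_{t-1}(s_j) + \eta_t(s_i)

% a "general quadratic" extension adds pairwise interactions of the
% lagged state, transformed by some function g(\cdot;\theta):
u_t(s_i) = \sum_{j} a_{ij}\, u_{t-1}(s_j)
         + \sum_{j}\sum_{k} b_{i,jk}\, u_{t-1}(s_j)\, g\!\big(u_{t-1}(s_k);\theta\big)
         + \eta_t(s_i)
```

    The quadratic interaction coefficients are exactly where the curse of dimensionality bites, and where the science-based (e.g., PDE-motivated) parameterizations discussed above constrain the model.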

  1. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  2. More performance results and implementation of an object oriented track reconstruction model in different OO frameworks

    NASA Astrophysics Data System (ADS)

    Gaines, Irwin; Qian, Sijin

    2001-08-01

    This is an update of the report on an Object Oriented (OO) track reconstruction model presented at the previous AIHENP'99 in Crete, Greece. The OO model for the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. It has been coded in the C++ programming language and successfully implemented into a few different OO computing environments of the CMS and ATLAS experiments at the future Large Hadron Collider at CERN. We shall report (1) more performance results, and (2) the implementation of the OO model into the new software OO framework "Athena" of the ATLAS experiment, together with some upgrades of the OO model itself.
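
    The Kalman-filter idea at the heart of such track fits can be shown in its simplest one-dimensional form (a generic textbook sketch, not the experiments' C++ code; all numbers are illustrative):

```python
import numpy as np

# Minimal 1-D Kalman filter sketch of the idea behind a Kalman track fit:
# a state estimate is updated hit by hit as the trajectory crosses
# detector layers. Here the state is a single static coordinate and all
# numbers are illustrative, not experiment values.

rng = np.random.default_rng(3)
true_pos, meas_sigma = 5.0, 0.5
hits = true_pos + meas_sigma * rng.standard_normal(20)   # noisy layer hits

x, P = 0.0, 100.0            # initial estimate and (deliberately vague) variance
R = meas_sigma ** 2          # measurement noise variance
for z in hits:
    # prediction step is trivial here (static state, no process noise)
    K = P / (P + R)          # Kalman gain
    x = x + K * (z - x)      # measurement update of the state
    P = (1 - K) * P          # updated estimate variance

err_filtered = abs(x - true_pos)
```

    A real track fit carries a multi-component state (position, direction, curvature) with matrix-valued gains and a nontrivial prediction step through the magnetic field and material, but the gain-weighted update per detector layer is the same.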

  3. Combining the Strengths of Physically Based Models with Statistical Modelling Tools Using a Hierarchical Mixture of Experts Framework

    NASA Astrophysics Data System (ADS)

    Marshall, L. A.; Sharma, A.; Nott, D.

    2005-12-01

    Rigidity in a modelling framework has been known to result in considerable bias in cases where the system behaviour is closely linked to the catchment antecedent conditions. An alternative to accommodate such variations in the system makeup is to enable the model to be flexible enough to evolve as antecedent conditions change. We present a framework that incorporates such flexibility by expressing the model through the combination of a number of different model structures. Each structure is adopted at a given time with a probability that depends on the current hydrologic state of the catchment. This framework is known as a Hierarchical Mixture of Experts (HME). When applied in a hydrological context, the HME approach has two major functions. It can act as a powerful predictive tool where simulation is extended beyond the calibration period. It also offers a basis for model development and building based on interpretation of the final model architecture in calibration. The probabilistic nature of HME means that it is ideally specified using Bayesian inference. The Bayesian approach also formalises the incorporation of uncertainty in the model specification. The interpretability of the overall HME framework is largely influenced by the individual model structures. One model which can be applied in the HME context is the popular Topmodel. Topmodel is a modelling tool that allows the simulation of distributed catchment response to rainfall. Many different versions of the basic model structure exist as the underlying concepts are challenged by different catchment studies. One modification often made is to the description of the baseflow recession. This study will investigate the predictive capability of Topmodel when the model is specified using both a Bayesian and HME approach. The specification of the distribution of model errors is investigated by definition of several different probability distributions. 
The HME approach is applied in a framework that compares two
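
    The state-dependent mixing at the core of HME can be sketched with two invented "experts" and a logistic gate (all model forms and weights below are hypothetical, not Topmodel):

```python
import numpy as np

# Toy sketch of the HME idea (all model forms and weights invented): two
# simple "expert" runoff models are mixed with a gate whose weight depends
# on an antecedent-wetness state of the catchment.

def expert_dry(rain):
    return 0.1 * rain                 # low runoff coefficient expert

def expert_wet(rain):
    return 0.6 * rain                 # near-saturated catchment expert

def gate(wetness, w0=-5.0, w1=10.0):
    """Logistic gate: weight of the 'wet' expert given the state."""
    return 1.0 / (1.0 + np.exp(-(w0 + w1 * wetness)))

def hme_predict(rain, wetness):
    g = gate(wetness)
    return g * expert_wet(rain) + (1.0 - g) * expert_dry(rain)

q_dry = hme_predict(10.0, 0.1)        # dry antecedent conditions
q_wet = hme_predict(10.0, 0.9)        # wet antecedent conditions
```

    In the Bayesian setting described above, the gate parameters and each expert's parameters would all receive priors and be calibrated jointly, and the fitted gate itself becomes interpretable: it shows which model structure the catchment "selects" in each hydrologic state.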

  4. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  5. Solving system integration and interoperability problems using a model reference systems engineering framework

    NASA Astrophysics Data System (ADS)

    Makhlouf, Mahmoud A.

    2001-09-01

This paper presents a model-reference systems engineering framework, which has been applied to a number of ESC projects. This framework provides an architecture-driven systems engineering process supported by a tool kit. This kit is built incrementally from an integrated set of commercial and government-developed tools, including project management, systems engineering, military worth-analysis and enterprise collaboration tools. Products developed using these tools enable the specification and visualization of an executable model of the integrated system architecture as it evolves from a low-fidelity concept into a high-fidelity system model. This enables end users of system products, system designers, and decision-makers to perform what-if analyses on system design alternatives before making costly final system acquisition decisions.

  6. A general ecophysiological framework for modelling the impact of pests and pathogens on forest ecosystems.

    PubMed Central

    Dietze, Michael C; Matthes, Jaclyn Hatala

    2014-01-01

Forest insects and pathogens (FIPs) have enormous impacts on community dynamics, carbon storage and ecosystem services; however, ecosystem modelling of FIPs is limited due to their variability in severity and extent. We present a general framework for modelling FIP disturbances through their impacts on tree ecophysiology. Five pathways are identified as the basis for functional groupings: increases in leaf, stem and root turnover, and reductions in phloem and xylem transport. A simple ecophysiological model was used to explore the sensitivity of forest growth, mortality and ecosystem fluxes to varying outbreak severity. Across all pathways, low infection was associated with growth reduction but limited mortality. Moderate infection led to individual tree mortality, whereas high levels led to stand-level die-offs delayed over multiple years. Delayed mortality is consistent with observations and critical for capturing biophysical, biogeochemical and successional responses. This framework enables novel predictions under present and future global change scenarios. PMID:25168168

  7. A general ecophysiological framework for modelling the impact of pests and pathogens on forest ecosystems.

    PubMed

    Dietze, Michael C; Matthes, Jaclyn Hatala

    2014-11-01

Forest insects and pathogens (FIPs) have enormous impacts on community dynamics, carbon storage and ecosystem services; however, ecosystem modelling of FIPs is limited due to their variability in severity and extent. We present a general framework for modelling FIP disturbances through their impacts on tree ecophysiology. Five pathways are identified as the basis for functional groupings: increases in leaf, stem and root turnover, and reductions in phloem and xylem transport. A simple ecophysiological model was used to explore the sensitivity of forest growth, mortality and ecosystem fluxes to varying outbreak severity. Across all pathways, low infection was associated with growth reduction but limited mortality. Moderate infection led to individual tree mortality, whereas high levels led to stand-level die-offs delayed over multiple years. Delayed mortality is consistent with observations and critical for capturing biophysical, biogeochemical and successional responses. This framework enables novel predictions under present and future global change scenarios. PMID:25168168

  8. An advanced object-based software framework for complex ecosystem modeling and simulation

    SciTech Connect

    Sydelko, P. J.; Dolph, J. E.; Majerus, K. A.; Taxon, T. N.

    2000-06-29

Military land managers and decision makers face an ever-increasing challenge to balance maximum flexibility for the mission with a diverse set of multiple land use, social, political, and economic goals. In addition, these goals encompass environmental requirements for maintaining ecosystem health and sustainability over the long term. Spatiotemporal modeling and simulation in support of adaptive ecosystem management can be best accomplished through a dynamic, integrated, and flexible approach that incorporates scientific and technological components into a comprehensive ecosystem modeling framework. The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) integrates ecological models and decision support techniques through a geographic information system (GIS)-based backbone. Recently, an object-oriented (OO) architectural framework was developed for IDLAMS (OO-IDLAMS). This OO-IDLAMS prototype builds on and leverages the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS is an object-based architectural framework that affords a more integrated, dynamic, and flexible approach to comprehensive ecosystem modeling than was possible with the GIS-based integration approach of the original IDLAMS. The flexibility, dynamics, and interoperability demonstrated through this case study of an object-oriented approach have the potential to provide key technology solutions for many of the military's multiple-use goals and needs for integrated natural resource planning and ecosystem management.

  9. The Importance of Communicating Uncertainty to the 3D Geological Framework Model of Alberta

    NASA Astrophysics Data System (ADS)

    MacCormack, Kelsey

    2015-04-01

    The Alberta Geological Survey (AGS) has been tasked with developing a 3-dimensional (3D) geological framework for Alberta (660,000 km2). Our goal is to develop 'The Framework' as a sophisticated platform, capable of integrating a variety of data types from multiple sources enabling the development of multi-scale, interdisciplinary models with built-in feedback mechanisms, allowing the individual components of the model to adapt and evolve over time as our knowledge and understanding of the subsurface increases. The geoscience information within these models is often taken at face value and assumed that the attribute accuracy is equivalent to the digital accuracy recorded by the computer, which can lead to overconfidence in the model results. We need to make sure that decision makers understand that models are simply versions of reality and all contain a certain amount of error and uncertainty. More importantly, it is necessary to convey that error and uncertainty are not bad, and should be quantified and understood rather than ignored. This presentation will focus on how the AGS is quantifying and communicating uncertainty within the Geologic Framework to decision makers and the general public, as well as utilizing uncertainty results to strategically prioritize future work.

  10. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

of plant transpiration by root-zone-produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root zone simulation models that do not resolve small-scale processes.

  11. A Framework for Dealing With Uncertainty due to Model Structure Error

    NASA Astrophysics Data System (ADS)

    van der Keur, P.; Refsgaard, J.; van der Sluijs, J.; Brown, J.

    2004-12-01

Although uncertainty about the structures of environmental models (conceptual uncertainty) has often been recognised to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, fails to adequately sample the relevant space of plausible models. As such, it is prone to modelling bias and underestimation of model uncertainty. In this paper we review a range of strategies for assessing structural uncertainties. The existing strategies fall into two categories depending on whether field data are available for the variable of interest. Most research attention has until now been devoted to situations where model structure uncertainties can be assessed directly on the basis of field data. This corresponds to a situation of 'interpolation'. However, in many cases environmental models are used for 'extrapolation' beyond the situations and the field data available for calibration. A framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. The key elements are the use of alternative conceptual models, assessment of their pedigree, and expert elicitation of the adequacy of the sampled conceptual models to represent the space of plausible models. Keywords: model error, model structure, conceptual uncertainty, scenario analysis, pedigree

  12. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    PubMed Central

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  13. A Biophysical Modeling Framework for Assessing the Environmental Impact of Biofuel Production

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Izaurradle, C.; Manowitz, D.; West, T. O.; Post, W. M.; Thomson, A. M.; Nichols, J.; Bandaru, V.; Williams, J. R.

    2009-12-01

Long-term sustainability of a biofuel economy necessitates environmentally friendly biofuel production systems. We describe a biophysical modeling framework developed to understand and quantify the environmental value and impact (e.g., water balance, nutrient balance, carbon balance, and soil quality) of different biomass cropping systems. This modeling framework consists of three major components: 1) a Geographic Information System (GIS)-based data processing system, 2) a spatially explicit biophysical modeling approach, and 3) a user-friendly information distribution system. First, we developed a GIS to manage the large amount of geospatial data (e.g., climate, land use, soil, and hydrography) and extract input information for the biophysical model. Second, the Environmental Policy Integrated Climate (EPIC) biophysical model is used to predict the impact of various cropping systems and management intensities on productivity, water balance, and biogeochemical variables. Finally, a geo-database is developed to distribute the results of ecosystem service variables (e.g., net primary productivity, soil carbon balance, soil erosion, nitrogen and phosphorus losses, and N2O fluxes) simulated by EPIC for each spatial modeling unit online using PostgreSQL. We applied this framework in a Regional Intensive Management Area (RIMA) of 9 counties in Michigan. A total of 4,833 spatial units with relatively homogeneous biophysical properties were derived using SSURGO, the Crop Data Layer, county boundaries, and 10-digit watershed boundaries. For each unit, EPIC was executed from 1980 to 2003 under 54 cropping scenarios (e.g., corn, switchgrass, and hybrid poplar). The simulation results were compared with historical crop yields from USDA NASS. Spatial mapping of the results shows high variability among different cropping scenarios in terms of the simulated ecosystem service variables. Overall, the framework developed in this study enables the incorporation of environmental factors into economic and

  14. Modelling framework for dynamic interaction between multiple pedestrians and vertical vibrations of footbridges

    NASA Astrophysics Data System (ADS)

    Venuti, Fiammetta; Racic, Vitomir; Corbetta, Alessandro

    2016-09-01

After 15 years of active research on the interaction between moving people and civil engineering structures, there is still a lack of reliable models and adequate design guidelines pertinent to vibration serviceability of footbridges under multiple pedestrians. There are three key issues that a new generation of models should urgently address: pedestrian "intelligent" interaction with the surrounding people and environment, the effect of human bodies on the dynamic properties of the unoccupied structure, and inter-subject and intra-subject variability of pedestrian walking loads. This paper presents a modelling framework for human-structure interaction in the vertical direction that addresses all three issues. The framework comprises two main models: (1) a microscopic model of multiple pedestrian traffic that simulates the time-varying position and velocity of each individual pedestrian on the footbridge deck, and (2) a coupled dynamic model of a footbridge and multiple walking pedestrians. The footbridge is modelled as an SDOF system having the dynamic properties of the unoccupied structure. Each walking pedestrian in a group or crowd is modelled as an SDOF system with an adjacent stochastic vertical force that moves along the footbridge following the trajectory and the gait pattern simulated by the microscopic model of pedestrian traffic. Performance of the suggested modelling framework is illustrated by a series of simulated vibration responses of a virtual footbridge due to light, medium and dense pedestrian traffic. Moreover, the Weibull distribution is shown to fit the probability density function of the local peaks in the acceleration response well. Considering the inherent randomness of the crowd, this makes it possible to determine the probability of exceeding any given acceleration value of the occupied bridge.
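The simplest building block of such a framework, a single pedestrian crossing an SDOF footbridge mode as a moving harmonic load, can be sketched as below. All parameter values are assumptions for illustration; the paper's pedestrian SDOF bodies, stochastic loads and microscopic traffic model are not reproduced:

```python
import numpy as np

# Illustrative SDOF footbridge excited by one pedestrian walking at constant
# speed; parameters (modal mass, damping, pacing rate, etc.) are assumed.
M, f_n, zeta = 40e3, 2.0, 0.005             # modal mass [kg], frequency [Hz], damping ratio
k = M * (2 * np.pi * f_n) ** 2              # modal stiffness [N/m]
c = 2 * zeta * np.sqrt(k * M)               # modal damping [N s/m]
L, v = 50.0, 1.4                            # span [m], walking speed [m/s]
G, alpha, f_p = 700.0, 0.4, 2.0             # weight [N], dynamic load factor, pacing rate [Hz]

dt = 0.001
t = np.arange(0.0, L / v, dt)
phi = np.sin(np.pi * v * t / L)             # half-sine mode shape at pedestrian position
F = G * alpha * np.sin(2 * np.pi * f_p * t) * phi   # modal force [N]

u = np.zeros_like(t)                        # modal displacement [m]
w = np.zeros_like(t)                        # modal velocity [m/s]
for i in range(len(t) - 1):
    a = (F[i] - c * w[i] - k * u[i]) / M    # modal acceleration
    w[i + 1] = w[i] + a * dt                # semi-implicit Euler step
    u[i + 1] = u[i] + w[i + 1] * dt

peak_accel = np.max(np.abs(np.gradient(w, dt)))   # peak vertical acceleration [m/s^2]
```

With the pacing rate matching the natural frequency, the response grows toward resonance as the pedestrian approaches midspan, which is the serviceability case such frameworks target.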

  15. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.

    2011-01-01

The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada. © 2010.

  16. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy.

    PubMed

    Ramos-Méndez, J; Perl, J; Schümann, J; Shin, J; Paganetti, H; Faddegon, B

    2015-07-01

The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose response models were implemented: the Lyman-Kutcher-Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson models for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV) and 52.1% (in Triangle) were found for the critical element, critical volume and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and the distal region. The DVHs, DVH point spacing, and results of the organ effect models are provided.

  17. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy

    NASA Astrophysics Data System (ADS)

    Ramos-Méndez, J.; Perl, J.; Schümann, J.; Shin, J.; Paganetti, H.; Faddegon, B.

    2015-07-01

The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose response models were implemented: the Lyman-Kutcher-Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson models for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV) and 52.1% (in Triangle) were found for the critical element, critical volume and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and the distal region. The DVHs, DVH point spacing, and results of the organ effect models are provided.
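Of the dose-response models listed, the Lyman-Kutcher-Burman NTCP model has a compact closed form: a probit of the generalized equivalent uniform dose (gEUD). The sketch below uses a hypothetical differential DVH and illustrative parameters; it is not the TOPAS implementation or the benchmark data:

```python
import math

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose from a differential DVH
    (dose bin centers in Gy, relative volumes); n is the volume parameter."""
    vtot = sum(volumes)
    return sum(v / vtot * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, n, m, td50):
    """Lyman-Kutcher-Burman NTCP: standard normal CDF of the normalized gEUD."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical differential DVH and illustrative (n, m, TD50) parameters
doses = [10.0, 20.0, 30.0, 40.0]       # Gy
volumes = [0.4, 0.3, 0.2, 0.1]         # relative volume per bin
p = lkb_ntcp(doses, volumes, n=0.5, m=0.2, td50=30.0)
```

Note that with n = 1 the gEUD reduces to the mean dose, which is a quick sanity check on any implementation, and a reminder of why DVH point spacing matters: the gEUD sum is taken directly over the DVH bins.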

  18. Tailored motivational message generation: A model and practical framework for real-time physical activity coaching.

    PubMed

    Op den Akker, Harm; Cabrita, Miriam; Op den Akker, Rieks; Jones, Valerie M; Hermens, Hermie J

    2015-06-01

This paper presents a comprehensive and practical framework for automatic generation of real-time tailored messages in behavior change applications. Basic aspects of motivational messages are time, intention, content and presentation. Tailoring of messages to the individual user may involve all aspects of communication. A linear modular system is presented for generating such messages. It is explained how properties of the user and context are taken into account in each of the modules of the system and how they affect the linguistic presentation of the generated messages. The model of motivational messages presented is based on an analysis of existing literature as well as the analysis of a corpus of motivational messages used in previous studies. The model extends existing 'ontology-based' approaches to message generation for real-time coaching systems found in the literature. Practical examples are given of how simple tailoring rules can be implemented throughout the various stages of the framework. Such examples can guide further research by clarifying what it means to use, e.g., user targeting to tailor a message. As a primary example we look at the issue of promoting daily physical activity. Future work lies in applying the present model and framework, defining efficient ways of evaluating individual tailoring components, and improving effectiveness through the creation of accurate and complete user and context models. PMID:25843359
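The linear modular idea (time, intention, content, presentation as successive stages) can be sketched as a pipeline. All module logic, rules and field names below are invented for illustration; they are not the authors' system:

```python
# Hypothetical linear pipeline: timing -> intention -> content -> presentation.
def timing(user, context):
    """Decide whether to send now: only if the user is behind on their goal."""
    return context["steps_last_hour"] < user["hourly_goal"]

def intention(user, context):
    """Pick a communicative intention from the context."""
    return "encourage" if context["trend"] == "up" else "activate"

def content(intent):
    """Select a message body for the chosen intention (toy message bank)."""
    bank = {"encourage": "Nice progress -- keep it up!",
            "activate": "A short walk now would help you reach today's goal."}
    return bank[intent]

def presentation(text, user):
    """Tailor the linguistic surface form, e.g. informal addressing by name."""
    if user["informal"]:
        return f"{user['name']}, {text[0].lower() + text[1:]}"
    return text

def generate(user, context):
    if not timing(user, context):
        return None                 # the timing module can suppress a message
    return presentation(content(intention(user, context)), user)

user = {"name": "Sam", "hourly_goal": 400, "informal": True}
msg = generate(user, {"steps_last_hour": 120, "trend": "down"})
```

The point of the linear structure is that each tailoring rule (user targeting, intention choice, surface realization) lives in exactly one module, so components can be evaluated independently, which matches the future work the abstract describes.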

  19. Modelling tissue electrophysiology with multiple cell types: applications of the extended bidomain framework.

    PubMed

    Corrias, Alberto; Pathmanathan, Pras; Gavaghan, David J; Buist, Martin L

    2012-02-01

    The bidomain framework has been extensively used to model tissue electrophysiology in a variety of applications. One limitation of the bidomain model is that it describes the activity of only one cell type interacting with the extracellular space. If more than one cell type contributes to the tissue electrophysiology, then the bidomain model is not sufficient. Recently, evidence has suggested that this is the case for at least two important applications: cardiac and gastrointestinal tissue electrophysiology. In the heart, fibroblasts ubiquitously interact with myocytes and are believed to play an important role in the organ electrophysiology. Along the GI tract, interstitial cells of Cajal (ICC) generate electrical waves that are passed on to surrounding smooth muscle cells (SMC), which are interconnected with the ICC and with each other. Because of the contribution of more than one cell type to the overall organ electrophysiology, investigators in different fields have independently proposed similar extensions of the bidomain model to incorporate multiple cell types and tested it on simplified geometries. In this paper, we provide a general derivation of such an extended bidomain framework applicable to any tissue and provide a generic and efficient implementation applicable to any geometry. Proof-of-concept results of tissue electrophysiology on realistic 3D organ geometries using the extended bidomain framework are presented for the heart and the stomach. PMID:22222297

  20. A modern solver framework to manage solution algorithms in the Community Earth System Model

    SciTech Connect

    Evans, Katherine J; Worley, Patrick H; Nichols, Dr Jeff A; WhiteIII, James B; Salinger, Andy; Price, Stephen; Lemieux, Jean-Francois; Lipscomb, William; Perego, Mauro; Vertenstein, Mariana; Edwards, Jim

    2012-01-01

Global Earth-system models (ESMs) can now produce simulations that resolve ~50 km features and include finer-scale, interacting physical processes. To achieve solutions at these scales, ESMs require smaller time steps, which limits parallel performance. Solution methods that overcome these bottlenecks can be quite intricate, and there is no single set of algorithms that performs well across the range of problems of interest. This creates significant implementation challenges, which are further compounded by the complexity of ESMs. Therefore, prototyping and evaluating new algorithms in these models requires a software framework that is flexible, extensible, and easily introduced into the existing software. We describe our efforts to create a parallel solver framework that links the Trilinos library of solvers to Glimmer-CISM, a continental ice sheet model used in the Community Earth System Model (CESM). We demonstrate this framework within both current and developmental versions of Glimmer-CISM and provide strategies for its integration into the rest of the CESM.

  1. Short-term Forecasting Ground Magnetic Perturbations with the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, Daniel; Toth, Gabor; Gombosi, Tamas; Singer, Howard; Millward, George

    2016-04-01

Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be presented to illustrate predictive capabilities. Early data products, such as the regional K-index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
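The forecast quantity, the horizontal ground magnetic perturbation rate |dB/dt|, can be illustrated on a synthetic magnetometer series. The waveforms and the event threshold below are assumptions for the sketch, not SWMF inputs or outputs:

```python
import numpy as np

# Synthetic one-hour ground magnetometer record at 1-minute cadence
t = np.arange(0.0, 3600.0, 60.0)                 # time [s]
bx = 100.0 * np.sin(2 * np.pi * t / 1800.0)      # toy northward perturbation [nT]
by = 40.0 * np.cos(2 * np.pi * t / 1200.0)       # toy eastward perturbation [nT]

dbx = np.gradient(bx, t)                          # finite-difference dBx/dt [nT/s]
dby = np.gradient(by, t)                          # finite-difference dBy/dt [nT/s]
dbdt_h = np.hypot(dbx, dby)                       # horizontal |dB/dt| magnitude

# Event/no-event flags of the kind used in dB/dt skill metrics
# (the 0.3 nT/s threshold is an assumption for illustration)
events = dbdt_h > 0.3
```

In data-model comparisons of this kind, predicted and observed event flags per time window feed a contingency table from which skill scores are computed.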

  2. A framework for dealing with uncertainty due to model structure error

    NASA Astrophysics Data System (ADS)

    Refsgaard, Jens Christian; van der Sluijs, Jeroen P.; Brown, James; van der Keur, Peter

    2006-11-01

    Although uncertainty about structures of environmental models (conceptual uncertainty) is often acknowledged to be the main source of uncertainty in model predictions, it is rarely considered in environmental modelling. Rather, formal uncertainty analyses have traditionally focused on model parameters and input data as the principal source of uncertainty in model predictions. The traditional approach to model uncertainty analysis, which considers only a single conceptual model, may fail to adequately sample the relevant space of plausible conceptual models. As such, it is prone to modelling bias and underestimation of predictive uncertainty. In this paper we review a range of strategies for assessing structural uncertainties in models. The existing strategies fall into two categories depending on whether field data are available for the predicted variable of interest. To date, most research has focussed on situations where inferences on the accuracy of a model structure can be made directly on the basis of field data. This corresponds to a situation of 'interpolation'. However, in many cases environmental models are used for 'extrapolation'; that is, beyond the situation and the field data available for calibration. In the present paper, a framework is presented for assessing the predictive uncertainties of environmental models used for extrapolation. It involves the use of multiple conceptual models, assessment of their pedigree and reflection on the extent to which the sampled models adequately represent the space of plausible models.

  3. Modeling somite scaling in small embryos in the framework of Turing patterns.

    PubMed

    Signon, Laurence; Nowakowski, Bogdan; Lemarchand, Annie

    2016-04-01

    The adaptation of prevertebra size to embryo size is investigated in the framework of a reaction-diffusion model involving a Turing pattern. The reaction scheme and Fick's first law of diffusion are modified in order to take into account the departure from dilute conditions induced by confinement in smaller embryos. In agreement with the experimental observations of scaling in somitogenesis, our model predicts the formation of smaller prevertebrae or somites in smaller embryos. These results suggest that models based on Turing patterns cannot be automatically disregarded by invoking the question of maintaining proportions in embryonic development. Our approach highlights the nontrivial role that the solvent can play in biology. PMID:27176324

  4. Modeling somite scaling in small embryos in the framework of Turing patterns

    NASA Astrophysics Data System (ADS)

    Signon, Laurence; Nowakowski, Bogdan; Lemarchand, Annie

    2016-04-01

    The adaptation of prevertebra size to embryo size is investigated in the framework of a reaction-diffusion model involving a Turing pattern. The reaction scheme and Fick's first law of diffusion are modified in order to take into account the departure from dilute conditions induced by confinement in smaller embryos. In agreement with the experimental observations of scaling in somitogenesis, our model predicts the formation of smaller prevertebrae or somites in smaller embryos. These results suggest that models based on Turing patterns cannot be automatically disregarded by invoking the question of maintaining proportions in embryonic development. Our approach highlights the nontrivial role that the solvent can play in biology.
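A generic Turing-type reaction-diffusion system illustrates the kind of pattern formation these two entries build on. The sketch below uses the Gray-Scott model with common textbook parameters; the authors' specific reaction scheme and their modified, non-dilute form of Fick's law are not reproduced here:

```python
import numpy as np

# 1-D Gray-Scott reaction-diffusion sketch (standard tutorial parameters,
# not the paper's somitogenesis model)
N, steps, dt = 200, 8000, 1.0
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060   # diffusivities, feed and kill rates

u = np.ones(N)
v = np.zeros(N)
mid = slice(N // 2 - 5, N // 2 + 5)
u[mid], v[mid] = 0.5, 0.25                # local perturbation to seed patterning

def lap(a):
    """Periodic 1-D Laplacian with unit grid spacing."""
    return np.roll(a, 1) + np.roll(a, -1) - 2 * a

for _ in range(steps):
    uvv = u * v * v                       # autocatalytic reaction term
    u += dt * (Du * lap(u) - uvv + F * (1 - u))
    v += dt * (Dv * lap(v) + uvv - (F + k) * v)
```

The explicit scheme is stable here because D·dt/dx² ≤ 0.5 for both species; scaling questions like the one in the abstract amount to asking how the emerging wavelength changes when the domain size or the effective diffusion law changes.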

  5. Framework for analyzing ecological trait-based models in multidimensional niche spaces

    NASA Astrophysics Data System (ADS)

    Biancalani, Tommaso; DeVille, Lee; Goldenfeld, Nigel

    2015-05-01

    We develop a theoretical framework for analyzing ecological models with a multidimensional niche space. Our approach relies on the fact that ecological niches are described by sequences of symbols, which allows us to include multiple phenotypic traits. Ecological drivers, such as competitive exclusion, are modeled by introducing the Hamming distance between two sequences. We show that a suitable transform diagonalizes the community interaction matrix of these models, making it possible to predict the conditions for niche differentiation and, close to the instability onset, the asymptotically long time population distributions of niches. We exemplify our method using the Lotka-Volterra equations with an exponential competition kernel.

  6. Framework for analyzing ecological trait-based models in multidimensional niche spaces.

    PubMed

    Biancalani, Tommaso; DeVille, Lee; Goldenfeld, Nigel

    2015-05-01

    We develop a theoretical framework for analyzing ecological models with a multidimensional niche space. Our approach relies on the fact that ecological niches are described by sequences of symbols, which allows us to include multiple phenotypic traits. Ecological drivers, such as competitive exclusion, are modeled by introducing the Hamming distance between two sequences. We show that a suitable transform diagonalizes the community interaction matrix of these models, making it possible to predict the conditions for niche differentiation and, close to the instability onset, the asymptotically long time population distributions of niches. We exemplify our method using the Lotka-Volterra equations with an exponential competition kernel. PMID:26066119

  7. A Catchment-Based Land Surface Model for GCMs and the Framework for its Evaluation

    NASA Technical Reports Server (NTRS)

    Ducharne, A.; Koster, R. D.; Suarez, M. J.; Kumar, P.

    1998-01-01

    A new GCM-scale land surface modeling strategy that explicitly accounts for subgrid soil moisture variability and its effects on evaporation and runoff is now being explored. In a break from traditional modeling strategies, the continental surface is disaggregated into a mosaic of hydrological catchments, with boundaries that are not dictated by a regular grid but by topography. Within each catchment, the variability of soil moisture is deduced from TOPMODEL equations with a special treatment of the unsaturated zone. This paper gives an overview of this new approach and presents the general framework for its off-line evaluation over North America.

  8. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for numerically exact computation of steady state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation. PMID:26390498
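To give a concrete sense of "numerically exact computation of steady state probability distribution, without requiring simulation", here is a toy sketch that treats a small open-boundary TASEP as a continuous-time Markov chain over all occupancy states and solves for its stationary distribution directly. The rates and system size are illustrative, and this is the plain TASEP comparison model, not the authors' Probabilistic Boolean Network formalism.

```python
import numpy as np
from itertools import product

def tasep_stationary(n, alpha, beta):
    """Exact stationary distribution of an open-boundary TASEP on n sites,
    treated as a continuous-time Markov chain over all 2**n states."""
    states = list(product((0, 1), repeat=n))
    index = {s: k for k, s in enumerate(states)}
    Q = np.zeros((2 ** n, 2 ** n))
    for s in states:
        i = index[s]
        if s[0] == 0:                       # ribosome initiation, rate alpha
            Q[i, index[(1,) + s[1:]]] += alpha
        if s[-1] == 1:                      # termination, rate beta
            Q[i, index[s[:-1] + (0,)]] += beta
        for k in range(n - 1):              # bulk hops, rate 1
            if s[k] == 1 and s[k + 1] == 0:
                t = s[:k] + (0, 1) + s[k + 2:]
                Q[i, index[t]] += 1.0
    np.fill_diagonal(Q, -Q.sum(axis=1))
    # stationary distribution = left null vector of the rate matrix
    w, v = np.linalg.eig(Q.T)
    pi = np.real(v[:, np.argmin(np.abs(w))])
    return states, pi / pi.sum()
```

For one site the exact occupation density is alpha/(alpha + beta), which makes a convenient sanity check; the exponential state space (2^n) is exactly the cost that structured formalisms such as the probabilistic Boolean framework aim to tame.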

  9. A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.

    2004-12-01

    The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of Semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions.
A CPM identifies, conceptually, all the physical processes and engineering and socio

  10. A New Open Data Open Modeling Framework for the Geosciences Community (Invited)

    NASA Astrophysics Data System (ADS)

    Liang, X.; Salas, D.; Navarro, M.; Liang, Y.; Teng, W. L.; Hooper, R. P.; Restrepo, P. J.; Bales, J. D.

    2013-12-01

    A prototype Open Hydrospheric Modeling Framework (OHMF), also called Open Data Open Modeling framework, has been developed to address two key modeling challenges faced by the broad research community: (1) accessing external data from diverse sources and (2) execution, coupling, and evaluation/intercomparison of various and complex models. The former is achieved via the Open Data architecture, while the latter is achieved via the Open Modeling architecture. The Open Data architecture adopts a common internal data model and representation, to facilitate the integration of various external data sources into OHMF, using Data Agents that handle remote data access protocols (e.g., OPeNDAP, Web services), metadata standards, and source-specific implementations. These Data Agents hide the heterogeneity of the external data sources and provide a common interface to the OHMF system core. The Open Modeling architecture allows different models or modules to be easily integrated into OHMF. The OHMF architectural design offers a general many-to-many connectivity between individual models and external data sources, instead of one-to-one connectivity from data access to model simulation results. OHMF adopts a graphical scientific workflow, offers tools to re-scale in space and time, and provides multi-scale data fusion and assimilation functionality. Notably, the OHMF system employs a strategy that does not require re-compiling or adding interface codes for a user's model to be integrated. Thus, a corresponding model agent can be easily developed by a user. Once an agent is available for a model, it can be shared and used by others. An example will be presented to illustrate the prototype OHMF system and the automatic flow from accessing data to model simulation results in a user-friendly workflow-controlled environment.
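The Open Data idea of source-specific Data Agents hiding heterogeneous protocols behind one common interface might be sketched as below. All class and method names here are hypothetical illustrations, not the actual OHMF API.

```python
from abc import ABC, abstractmethod

class DataAgent(ABC):
    """Common interface the system core sees; each agent hides one
    remote access protocol (OPeNDAP, Web services, ...)."""
    @abstractmethod
    def fetch(self, variable, start, end):
        """Return data in a shared internal representation (here, a dict)."""

class OpendapAgent(DataAgent):
    def fetch(self, variable, start, end):
        # real code would issue an OPeNDAP request here
        return {"variable": variable, "times": [start, end], "values": [1.0, 2.0]}

class WebServiceAgent(DataAgent):
    def fetch(self, variable, start, end):
        # real code would call a web service and parse its response
        return {"variable": variable, "times": [start, end], "values": [3.0, 4.0]}

def run_model(agent: DataAgent):
    # a model sees only the common interface, never the external source
    record = agent.fetch("streamflow", "2013-01-01", "2013-01-02")
    return sum(record["values"])
```

Because every agent returns the same internal representation, any model can be wired to any source, which is the many-to-many connectivity the abstract describes.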

  11. A framework for integrated, multi-scale model construction and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Kok, Jean-Luc; de Jong, Kor; Karssenberg, Derek

    2015-04-01

    The component-based software development practice promotes the construction of self-contained modules with defined input and output interfaces. In environmental modelling, we can adopt this development practice to construct more generic, reusable component models. Here, modellers need to implement a state transition function to describe a specific environmental process, and to specify the required external inputs and parameters to simulate the change of real-world processes over time. Depending on the usage of a component model, such as standalone execution or as part of an integrated model, the source of the external input needs to be specified. The required external inputs can thereby be obtained from disk by a file operation in case of a standalone execution; or inputs can be obtained from other component models, when the component model is used in an integrated model. Using different notations to specify input requirements, however, requires a modification of the state transition function per application case of a component model and therefore would reduce its generic nature. We propose the function object notation as a means to specify input sources of a component model and as a uniform syntax to express input requirements. At component initialisation, the function objects can be parametrised with different external sources. In addition to a uniform syntax, the function object notation allows modellers to specify a request-reply execution flow of the coupled models. We extended the request-reply execution approach to allow for Monte Carlo simulations, and implemented a software framework prototype in Python using the PCRaster module (http://www.pcraster.eu) for field-based modelling. 
We demonstrate the usage of the framework by building an exemplary integrated model by coupling components simulating land use change, hydrology and eucalyptus tree growth at different temporal discretisations to obtain the probability for bioenergy plantations in a hypothetical
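The function-object notation described above might be sketched as follows: the component's state transition function pulls each external input through a callable, so the same component runs standalone (input read from disk) or coupled (request-reply to another component) without changing its code. Names are hypothetical, not the PCRaster framework API.

```python
class Precipitation:
    """Stand-in for another component model in a coupled run."""
    def get(self, t):
        return 2.0  # reply to a request at time step t

def from_file(values):
    # standalone execution: input previously loaded from disk
    return lambda t: values[t]

def from_component(component):
    # integrated execution: request-reply to another component
    return lambda t: component.get(t)

class SoilMoisture:
    def __init__(self, precip):
        self.precip = precip  # function object; source left unspecified
        self.storage = 0.0

    def step(self, t):
        # the state transition uses the callable, never the source directly
        self.storage += self.precip(t)
        return self.storage
```

At initialisation the component is parametrised with either source, so the state transition function itself stays generic, which is the reusability argument made in the abstract.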

  12. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota

    PubMed Central

    Kail, Jochem; Guse, Björn; Radinger, Johannes; Schröder, Maria; Kiesel, Jens; Kleinhans, Maarten; Schuurman, Filip; Fohrer, Nicola; Hering, Daniel; Wolter, Christian

    2015-01-01

    River biota are affected by global reach-scale pressures, but most approaches for predicting biota of rivers focus on river reach or segment scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and to finally compare habitat suitability and the availability / ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates. 
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact research as well

  13. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    PubMed Central

    Wu, Yirong; Liu, Jie; del Rio, Alejandro Munoz; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2016-01-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data comes from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms-SNPs), and mammographic features in Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for three models. After we assigned utility values for each category of findings (true negative, false positive, false negative and true positive), we pursued optimal operating points on ROC curves to achieve maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar’s test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under ROC curve (AUC) and MEU. SNPs improved sensitivity of the Gail model (0.276 vs. 0.147) and reduced specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar’s test provides a novel framework to evaluate prediction models in the realm of radiogenomics. PMID:27095854
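The maximum-expected-utility step can be sketched as follows: given a model's ROC curve and utility values for the four outcome categories, expected utility is evaluated at each operating point and the maximum is selected. The utilities and prevalence in the test below are illustrative assumptions, not the study's calibrated values.

```python
import numpy as np

def max_expected_utility(fpr, tpr, prevalence, u_tp, u_fp, u_fn, u_tn):
    """Pick the ROC operating point that maximises expected utility.
    fpr/tpr are the ROC coordinates of the candidate operating points."""
    fpr, tpr = np.asarray(fpr), np.asarray(tpr)
    # expected utility = P(disease) * (utility of TP/FN mix)
    #                  + P(no disease) * (utility of FP/TN mix)
    eu = (prevalence * (tpr * u_tp + (1 - tpr) * u_fn)
          + (1 - prevalence) * (fpr * u_fp + (1 - fpr) * u_tn))
    best = int(np.argmax(eu))
    return best, float(eu[best])
```

Comparing the MEU attained by each model (Gail, Gail+SNP, Gail+SNP+BI-RADS) at its own optimal operating point is then a utility-based alternative to comparing AUCs.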

  14. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    NASA Astrophysics Data System (ADS)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data comes from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms-SNPs), and mammographic features in Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for three models. After we assigned utility values for each category of findings (true negative, false positive, false negative and true positive), we pursued optimal operating points on ROC curves to achieve maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under ROC curve (AUC) and MEU. SNPs improved sensitivity of the Gail model (0.276 vs. 0.147) and reduced specificity (0.855 vs. 0.912). When additional mammographic features were added, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel framework to evaluate prediction models in the realm of radiogenomics.

  15. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    SciTech Connect

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.
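The strict mass conservation highlighted above follows from the flux form of the volume-fraction update: whatever leaves one cell enters its neighbour. A minimal 1-D donor-cell sketch (a deliberately simplified stand-in for the geometric reconstruction methods reviewed here):

```python
import numpy as np

n, u, dt, dx = 100, 1.0, 0.5, 1.0   # CFL number = u*dt/dx = 0.5
f = np.zeros(n)                     # material volume fraction on a fixed mesh
f[10:30] = 1.0                      # initial material slab
mass0 = f.sum() * dx

for _ in range(60):
    # donor-cell (upwind) flux leaving each cell toward the right
    flux = u * dt / dx * f
    # flux form: subtract what leaves, add what arrives (periodic domain)
    f = f - flux + np.roll(flux, 1)
```

Mass is conserved to round-off by construction, while the interface smears because nothing here reconstructs its position; that reconstruction (e.g., via power diagrams) is exactly the hard part the review addresses.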

  16. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition- whether functional or physical or discipline-based-that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  17. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGESBeta

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  18. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    PubMed

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions. PMID:26188990
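The two-objective trade-off described above (minimise TN load, minimise BMP cost) can be sketched with a brute-force Pareto filter over candidate BMP portfolios. All site names, costs, and removal rates below are invented for illustration; the actual study uses GIS-based siting and a multi-objective optimization algorithm rather than enumeration.

```python
from itertools import combinations

# hypothetical candidate sites: (name, cost in $k, TN removal in kg/yr)
sites = [("wetland_A", 120, 800), ("wetland_B", 60, 350),
         ("buffer_C", 20, 300), ("buffer_D", 25, 280)]
baseline_load = 2000.0  # kg TN/yr, illustrative

def evaluate(subset):
    cost = sum(c for _, c, _ in subset)
    load = baseline_load - sum(r for _, _, r in subset)
    return cost, max(load, 0.0)

def pareto_front(points):
    # keep solutions not dominated in (cost, load), both minimised
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                       for q in points)]

portfolios = [evaluate(combo)
              for k in range(len(sites) + 1)
              for combo in combinations(sites, k)]
front = pareto_front(portfolios)
```

Each point on `front` is one cost-effective BMP placement; a 20% load-reduction target then amounts to picking the cheapest front point with load at or below 1600 kg/yr.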

  19. A Subbasin-based framework to represent land surface processes in an Earth System Model

    SciTech Connect

    Tesfa, Teklu K.; Li, Hongyi; Leung, Lai-Yung R.; Huang, Maoyi; Ke, Yinghai; Sun, Yu; Liu, Ying

    2014-05-20

    Realistically representing spatial heterogeneity and lateral land surface processes within and between modeling units in earth system models is important because of their implications for surface energy and water exchange. The traditional approach of using regular grids as computational units in land surface models and earth system models may lead to inadequate representation of lateral movements of water, energy and carbon fluxes, especially as the grid resolution increases. Here a new subbasin-based framework is introduced in the Community Land Model (CLM), which is the land component of the Community Earth System Model (CESM). Local processes are represented by treating each subbasin as a grid cell on a pseudo grid matrix, with no significant modifications to the existing CLM modeling structure. Lateral routing of water within and between subbasins is simulated with the subbasin version of a recently developed, physically based routing model, the Model for Scale Adaptive River Routing (MOSART). As an illustration, this new framework is implemented in the topographically diverse region of the U.S. Pacific Northwest. The modeling units (subbasins) are delineated from a high-resolution Digital Elevation Model (DEM), while atmospheric forcing and surface parameters are remapped from the corresponding high-resolution datasets. The impacts of this representation on simulating hydrologic processes are explored by comparing it with the default (grid-based) CLM representation. In addition, the effects of DEM resolution on parameterizing topography and the subsequent effects on runoff processes are investigated. Limited model evaluation and comparison showed that small differences between the averaged forcings can lead to significant differences in the simulated runoff and streamflow because of nonlinear horizontal processes.
Topographic indices derived from high resolution DEM may not improve the overall water balance, but affect the partitioning between surface and subsurface runoff

  20. Computational framework to model and design surgical meshes for hernia repair.

    PubMed

    Hernández-Gascón, B; Espés, N; Peña, E; Pascual, G; Bellón, J M; Calvo, B

    2014-08-01

    Surgical procedures for hernia repair are usually performed using prosthetic meshes. In spite of all the improvements in these biomaterials, a perfect match between the prosthesis and the implant site has not been achieved. Thus, new designs of surgical meshes are still being developed. Prior to implantation in humans, the validity of the meshes has to be addressed, and to date experimental studies have been the gold standard for testing and validating new implants. Nevertheless, these procedures involve long periods of time and are expensive. Thus, a computational framework for the simulation of prostheses and surgical procedures may overcome some disadvantages of the experimental methods. The computational framework includes two computational models for designing and validating the behaviour of new meshes, respectively. Firstly, the beam model, which reproduces the exact geometry of the mesh, is used to design the weave and determine the stiffness of the surgical prosthesis. However, this implies a high computational cost, whereas the membrane model, defined within the framework of large deformation hyperelasticity, is a relatively inexpensive computational tool which also enables a prosthesis to be included in more complex geometries such as human or animal bodies. PMID:23167618

  1. A hybrid model-classifier framework for managing prediction uncertainty in expensive optimisation problems

    NASA Astrophysics Data System (ADS)

    Tenne, Yoel; Izui, Kazuhiro; Nishiwaki, Shinji

    2012-07-01

    Many real-world optimisation problems rely on computationally expensive simulations to evaluate candidate solutions. Often, such problems will contain candidate solutions for which the simulation fails, for example, due to limitations of the simulation. Such candidate solutions can hinder the effectiveness of the optimisation since they may consume a large portion of the optimisation budget without providing new information to the optimiser, leading to search stagnation and a poor final result. Existing approaches to handle such designs either discard them altogether, or assign them a penalised fitness. However, this results in loss of beneficial information, or in a model with a severely deformed landscape. To address these issues, this study proposes a hybrid classifier-model framework. The role of the classifier is to predict which candidate solutions are likely to crash the simulation, and this prediction is then used to bias the search towards valid solutions. Furthermore, the proposed framework employs a trust-region approach, and several other procedures, to manage the model and classifier, and to ensure the progress of the optimisation. Performance analysis using an engineering application of airfoil shape optimisation shows the efficacy of the proposed framework, and the possibility to use the knowledge accumulated in the classifier to gain new insights into the problem being solved.
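The core loop can be sketched as below: a classifier trained on past evaluations predicts whether a candidate will crash the simulator, and predicted crashes are skipped rather than penalised. The toy simulator, 1-nearest-neighbour classifier, and random search stand in for the expensive simulation, the paper's classifier, and its trust-region model management.

```python
import random

random.seed(0)

def simulator(x):
    # toy "expensive" simulation: crashes (returns None) for x < 0.2
    if x < 0.2:
        return None
    return (x - 0.7) ** 2  # objective to minimise

def predict_valid(x, history):
    # 1-nearest-neighbour classifier over past evaluations
    if not history:
        return True
    nearest = min(history, key=lambda h: abs(h[0] - x))
    return nearest[1]  # valid iff the nearest past point was valid

history = []                 # records of (x, valid, objective)
best = (None, float("inf"))
for _ in range(200):
    x = random.random()
    if not predict_valid(x, history):
        continue             # bias the search away from predicted crashes
    y = simulator(x)
    valid = y is not None
    history.append((x, valid, y))
    if valid and y < best[1]:
        best = (x, y)
```

Skipped candidates cost nothing from the evaluation budget, which is the central saving the framework aims for; the price is occasionally skipping a valid candidate near the crash boundary.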

  2. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    NASA Astrophysics Data System (ADS)

    Han, Xue; Sandels, Claes; Zhu, Kun; Nordström, Lars

    2013-08-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent power system operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DER deployment and distribution grid operation. Furthermore, to validate the framework, the authors describe reference models for different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, a quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations were performed in two typical distribution network models for four seasons. The simulation results show that, in general, DER deployment creates opportunities to reduce power losses and voltage drops by compensating with power from local generation and optimizing local load profiles.

  3. A Bioinformatics Reference Model: Towards a Framework for Developing and Organising Bioinformatic Resources

    NASA Astrophysics Data System (ADS)

    Hiew, Hong Liang; Bellgard, Matthew

    2007-11-01

    Life Science research faces the constant challenge of how to effectively handle an ever-growing body of bioinformatics software and online resources. The users and developers of bioinformatics resources have a diverse set of competing demands on how these resources need to be developed and organised. Unfortunately, there does not exist an adequate community-wide framework to integrate such competing demands. The problems that arise from this include unstructured standards development, the emergence of tools that do not meet specific needs of researchers, and often times a communications gap between those who use the tools and those who supply them. This paper presents an overview of the different functions and needs of bioinformatics stakeholders to determine what may be required in a community-wide framework. A Bioinformatics Reference Model is proposed as a basis for such a framework. The reference model outlines the functional relationship between research usage and technical aspects of bioinformatics resources. It separates important functions into multiple structured layers, clarifies how they relate to each other, and highlights the gaps that need to be addressed for progress towards a diverse, manageable, and sustainable body of resources. The relevance of this reference model to the bioscience research community, and its implications in progress for organising our bioinformatics resources, are discussed.

  4. Pursuing realistic hydrologic model under SUPERFLEX framework in a semi-humid catchment in China

    NASA Astrophysics Data System (ADS)

    Wei, Lingna; Savenije, Hubert H. G.; Gao, Hongkai; Chen, Xi

    2016-04-01

    Model realism is perpetually pursued by hydrologists for flood and drought prediction, integrated water resources management, and decision support for water security. "Physics-based" distributed hydrologic models are being developed rapidly, but they also face considerable challenges, for instance, long computation times with low efficiency and parameter uncertainty. This study tested, step-wise, four conceptual hydrologic models under the SUPERFLEX framework in a small semi-humid catchment in the southern Huai River basin of China. The original lumped FLEXL hypothesizes a model structure of four reservoirs representing canopy interception, the unsaturated zone, subsurface flow with fast and slow components, and base-flow storage. To account for spatially uneven rainfall, the second model (FLEXD) applies the same parameter set to separate units controlled by different rain gauges. To reveal the effect of topography, the terrain descriptor height above the nearest drainage (HAND), combined with slope, is applied to classify the experimental catchment into two landscapes. The third model (FLEXTOPO) then builds a different model block for each landscape, reflecting its dominant hydrologic process. The fourth, FLEXTOPOD, integrates the parallel framework of FLEXTOPO across the four gauge-controlled units to capture the spatial variability of rainfall patterns and topographic features. Through pairwise comparison, our results suggest that: (1) the semi-distributed models (FLEXD and FLEXTOPOD), which take the spatial heterogeneity of precipitation into account, improved model performance with a parsimonious parameter set; and (2) a flexible model architecture reflecting the perceived dominant hydrologic processes can incorporate the local terrain circumstances of each landscape. Hence, the modeling choices coincide with the catchment behaviour and come close to "reality". The presented methodology regards the hydrologic model as a tool to test our
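A much-simplified lumped bucket sketch in the spirit of the FLEXL structure described above: interception, an unsaturated-zone store that partitions rainfall, and parallel fast/slow linear reservoirs. All equations and parameter values are illustrative stand-ins, not the authors' calibrated SUPERFLEX configuration.

```python
def run_flex_like(precip, imax=2.0, smax=100.0, beta=2.0,
                  split=0.7, kf=0.3, ks=0.02):
    """Daily rainfall series in -> streamflow series out (same units)."""
    s_u, s_f, s_s = 0.0, 0.0, 0.0   # unsaturated, fast, slow stores
    q_out = []
    for p in precip:
        p_eff = max(p - imax, 0.0)              # canopy interception
        cr = (s_u / smax) ** beta               # runoff coefficient
        recharge = cr * p_eff
        s_u = min(s_u + p_eff - recharge, smax) # unsaturated-zone store
        s_f += split * recharge                 # fast subsurface component
        s_s += (1 - split) * recharge           # slow / base-flow component
        q_f, q_s = kf * s_f, ks * s_s           # linear-reservoir outflows
        s_f -= q_f
        s_s -= q_s
        q_out.append(q_f + q_s)
    return q_out
```

The semi-distributed variants (FLEXD, FLEXTOPOD) would run one such block per rain-gauge unit or landscape class and sum the outflows, which is how spatial rainfall and topographic heterogeneity enter without abandoning the parsimonious structure.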

  5. Integrating Sediment Connectivity into Water Resources Management Through a Graph Theoretic, Stochastic Modeling Framework.

    NASA Astrophysics Data System (ADS)

    Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.

    2014-12-01

    Understanding sediment transport processes at the river basin scale, their temporal spectra, and their spatial patterns is key to identifying and minimizing the morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises four steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte Carlo approach, applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach; channel vulnerability indicators quantify the imbalance between up- and downstream connectivity for each travel time domain, representing the process-dependent latency of morphologic response. Last, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty, in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations, and integration into a decision-analytic framework are demonstrated for a major part of the Red River basin in Northern Vietnam (179,000 km2). Here, a plethora of anthropic alterations, ranging from large reservoir construction to land-use changes, results in major downstream deterioration and calls for concerted sediment management strategies to mitigate current and limit future morphologic alterations.
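The second and third steps (Monte Carlo residence times propagated over the river graph) can be sketched as follows; the toy network, the lognormal residence-time distribution, and all names are assumptions, not the authors' implementation:

```python
import random

# Toy river network: each reach points to its downstream reach (None = outlet).
downstream = {"A": "C", "B": "C", "C": "D", "D": None}

def sample_residence(reach):
    # Hypothetical stochastic residence time (years) for one reach; the
    # paper derives these from local hydraulics and grain size instead.
    return random.lognormvariate(0.0, 0.5)

def downstream_travel_time(reach, n=1000):
    """Monte Carlo mean travel time from a reach to the basin outlet."""
    totals = []
    for _ in range(n):
        t, r = 0.0, reach
        while r is not None:          # walk the graph to the outlet
            t += sample_residence(r)
            r = downstream[r]
        totals.append(t)
    return sum(totals) / n
```

Comparing such travel-time statistics upstream and downstream of a reach is what yields the connectivity-imbalance indicators described in the abstract.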

  6. A hydro-economic modelling framework for optimal management of groundwater nitrate pollution from agriculture

    NASA Astrophysics Data System (ADS)

    Peña-Haro, Salvador; Pulido-Velazquez, Manuel; Sahuquillo, Andrés

    2009-06-01

    A hydro-economic modelling framework is developed for determining optimal management of groundwater nitrate pollution from agriculture. A holistic optimization model determines the spatial and temporal fertilizer application rate that maximizes the net benefits in agriculture, constrained by the quality requirements in groundwater at various control sites. Since emissions (nitrogen loading rates) are what can be controlled while concentrations are the policy targets, the two must be related. Agronomic simulations are used to obtain the nitrate leached, while numerical groundwater flow and solute transport simulation models were used to develop unit source solutions that were assembled into a pollutant concentration response matrix. The integration of the response matrix in the constraints of the management model allows simulating, by superposition, the evolution of groundwater nitrate concentration over time at different points of interest throughout the aquifer resulting from multiple pollutant sources distributed over time and space. In this way, the modelling framework relates the fertilizer loads to the nitrate concentration at the control sites. The benefits in agriculture were determined through crop prices and crop production functions. This research aims to contribute to the ongoing policy process in the European Union (the Water Framework Directive), providing a tool for analyzing the opportunity cost of measures for reducing nitrogen loadings and assessing their effectiveness for maintaining groundwater nitrate concentration within the target levels. The management model was applied to a hypothetical groundwater system. Optimal solutions of fertilizer use to problems with different initial conditions, planning horizons, and recovery times were determined.
The illustrative example shows the importance of the location of the pollution sources in relation to the control sites, and how both the selected planning horizon and the target recovery time can
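The response-matrix superposition at the heart of this framework is linear convolution of loading histories with unit-source responses. A minimal sketch (array names and shapes are assumptions for illustration, not the authors' code):

```python
import numpy as np

# Superposition via a pollutant-concentration response matrix:
# c[t, s] = sum over sources j and lags tau of R[tau, s, j] * load[t - tau, j]

def concentrations(R, loads):
    """R: (n_lags, n_sites, n_sources) unit responses;
    loads: (n_steps, n_sources) nitrogen loading history.
    Returns (n_steps, n_sites) concentrations at the control sites."""
    n_lags, n_sites, _ = R.shape
    n_steps = loads.shape[0]
    c = np.zeros((n_steps, n_sites))
    for t in range(n_steps):
        for tau in range(min(t + 1, n_lags)):
            c[t] += R[tau] @ loads[t - tau]   # superpose lagged unit responses
    return c
```

Because `concentrations` is linear in `loads`, it can be embedded directly as constraints in a benefit-maximizing optimization over fertilizer rates, as the abstract describes.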

  7. Multi-object segmentation framework using deformable models for medical imaging analysis.

    PubMed

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for the extraction of anatomical or functional structures in medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected for different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  8. An Improved Multi-Scale Modeling Framework for WRF over Complex Terrain

    NASA Astrophysics Data System (ADS)

    Wiersema, D. J.; Lundquist, K. A.; Chow, F. K.

    2014-12-01

    Atmospheric modelers continue to push towards higher resolution simulations of the planetary boundary layer. As resolution is refined, the resolved terrain slopes increase. Atmospheric models using terrain-following coordinates, such as the Weather Research and Forecasting (WRF) model, suffer from numerical errors since steep terrain slopes lead to grid skewness, resulting in model failure. One solution to this problem is the use of an immersed boundary method, which uses a non-conforming grid, for simulations over complex terrain. Our implementation of an immersed boundary method in WRF, known as WRF-IBM, was developed for use at the micro-scale and has been shown to accurately simulate flow around complex topography, such as urban environments or mountainous terrain. The research presented here describes our newly developed framework to enable concurrently run multi-scale simulations using the WRF model at the meso-scale and the WRF-IBM model at the micro-scale. WRF and WRF-IBM use different vertical coordinates; therefore, it is not possible to use the existing nesting framework to pass lateral boundary conditions from a WRF parent domain to a WRF-IBM nested domain. Nesting between WRF and WRF-IBM requires "vertical grid nesting", meaning the ability to pass information between domains with different vertical levels. Our newly implemented method for vertical grid nesting, available in the public release of WRFv3.6.1, allows nested domains to utilize different vertical levels. Using our vertical grid nesting code, we are currently developing the ability to nest a domain using IBM within a domain using terrain-following coordinates. Here we present results from idealized cases displaying the functionality of the multi-scale nesting framework and the advancement towards multi-scale meteorological simulations over complex terrain.

  9. A Linear Mixed Model Spline Framework for Analysing Time Course ‘Omics’ Data

    PubMed Central

    Straube, Jasmin; Gorse, Alain-Dominique

    2015-01-01

    Time course ‘omics’ experiments are becoming increasingly important to study system-wide dynamic regulation. Despite their high information content, analysis remains challenging. ‘Omics’ technologies capture quantitative measurements on tens of thousands of molecules. Therefore, in a time course ‘omics’ experiment molecules are measured for multiple subjects over multiple time points. This results in a large, high-dimensional dataset, which requires computationally efficient approaches for statistical analysis. Moreover, methods need to be able to handle missing values and various levels of noise. We present a novel, robust and powerful framework to analyze time course ‘omics’ data that consists of three stages: quality assessment and filtering, profile modelling, and analysis. The first step consists of removing molecules for which expression or abundance is highly variable over time. The second step models each molecular expression profile in a linear mixed model framework which takes into account subject-specific variability. The best model is selected through a serial model selection approach and results in dimension reduction of the time course data. The final step includes two types of analysis of the modelled trajectories, namely, clustering analysis to identify groups of correlated profiles over time, and differential expression analysis to identify profiles which differ over time and/or between treatment groups. Through simulation studies we demonstrate the high sensitivity and specificity of our approach for differential expression analysis. We then illustrate how our framework can bring novel insights on two time course ‘omics’ studies in breast cancer and kidney rejection. The methods are publicly available, implemented in the R CRAN package lmms. PMID:26313144

  10. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  11. A Python Plug-in Based Computational Framework for Spatially Distributed Environmental and Earth Sciences Modelling

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.

    2009-12-01

    One of the pioneering landform evolution models, SIBERIA, though developed in the 1980s, is still widely used in the science community and is a key component of engineering software used to assess the long-term stability of man-made landforms such as rehabilitated mine sites and nuclear waste repositories. While SIBERIA is very reliable, computationally fast, and well tested (both its underlying science and its computer code), the range of emerging applications has challenged the ability of the author to maintain and extend the underlying computer code. Moreover, the architecture of the SIBERIA code is not well suited to collaborative extension of its capabilities without often triggering forking of the code base. This paper describes a new modelling framework, called TelluSim, designed to supersede SIBERIA (as well as other earth sciences codes by the author). TelluSim is potentially more than simply a new landform evolution model: it is a more general dynamical-system modelling framework that uses time-evolving GIS data as its spatial discretisation. TelluSim is designed as an open, modular framework facilitating open-sourcing of the code, while addressing compromises made in the original design of SIBERIA in the 1980s. An important aspect of the design of TelluSim was to minimise the overhead of interfacing modules with TelluSim, and to minimise any requirement for recoding of existing software, thus eliminating a major disadvantage of more complex frameworks. The presentation will discuss in more detail the reasoning behind the design of TelluSim, and experiences of the advantages and disadvantages of using Python relative to other approaches (e.g. Matlab, R). The paper will discuss examples of how TelluSim has facilitated the incorporation and testing of new algorithms and environmental processes, and its support for novel science and data testing methodologies. It will also discuss plans to link TelluSim with other open source

  12. Short term global health experiences and local partnership models: a framework.

    PubMed

    Loh, Lawrence C; Cherniak, William; Dreifuss, Bradley A; Dacso, Matthew M; Lin, Henry C; Evert, Jessica

    2015-01-01

    Contemporary interest in short-term experiences in global health (STEGH) has led to important questions of ethics, responsibility, and potential harms to receiving communities. In addressing these issues, the role of local engagement through partnerships between external STEGH facilitating organization(s) and internal community organization(s) has been identified as crucial to mitigating potential pitfalls. This perspective piece offers a framework to categorize different models of local engagement in STEGH based on professional experiences and a review of the existing literature. This framework will encourage STEGH stakeholders to consider partnership models in the development and evaluation of new or existing programs. The proposed framework examines the community context in which STEGH may occur and considers three broad categories: the number of visiting external groups conducting STEGH (single/multiple), the number of host entities that interact with the STEGH (none/single/multiple), and the frequency of STEGH (continuous/intermittent). These factors culminate in a specific model, with a description of the opportunities and challenges each model presents. Among the models considered, single visiting partners working without a local partner on an intermittent (or even one-time) basis provide the greatest flexibility to the STEGH participants, but represent the least local integration and consequently the greatest potential harm for the receiving community. Other models, such as multiple visiting teams continuously working with a single local partner, provide an opportunity for centralization of efforts and local input, but require investment in consensus-building and streamlining of processes across different groups. We conclude that involving host partners in the design, implementation, and evaluation of STEGH requires more effort on the part of visiting STEGH groups and facilitators, but has the greatest potential benefit for meaningful, locally

  13. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first order sub-basins and is therefore suited to simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin representing hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of part or all of a first-order sub-basin, and has the advantage of reducing computational time and effort while maintaining reasonable accuracy in the simulated hydrologic states and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Processing Toolbox is used to run cross section simulations simultaneously, further reducing computational time. The SMART workflow tasks are: 1) delineation of first order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first order sub-basin, and 5) deriving vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on Richards' equation. However, any hydrologic model can be

  14. Information Model Driven Semantic Framework Architecture and Design for Distributed Data Repositories

    NASA Astrophysics Data System (ADS)

    Fox, P. A.; Semantic eScience Framework Team

    2011-12-01

    In Earth and space science, the steady evolution away from isolated and single purpose data 'systems' toward systems of systems, data ecosystems, or data frameworks that provide access to highly heterogeneous data repositories is picking up in pace. As a result, common informatics approaches are being sought for how newer architectures are developed and/or implemented. In particular, a clear need has emerged for a repeatable method for modeling, implementing and evolving information architectures, one that goes beyond traditional software design. This presentation outlines new component design approaches based on sets of information models and semantic encodings for mediation.

  15. Super-Resolution Using Hidden Markov Model and Bayesian Detection Estimation Framework

    NASA Astrophysics Data System (ADS)

    Humblot, Fabrice; Mohammad-Djafari, Ali

    2006-12-01

    This paper presents a new method for super-resolution (SR) reconstruction of a high-resolution (HR) image from several low-resolution (LR) images. The HR image is assumed to be composed of homogeneous regions. Thus, the a priori distribution of the pixels is modeled by a finite mixture model (FMM) and a Potts Markov model (PMM) for the labels. The whole a priori model is then a hierarchical Markov model. The LR images are assumed to be obtained from the HR image by lowpass filtering, arbitrary translation, decimation, and finally corruption by random noise. The problem is then put in a Bayesian detection and estimation framework, and appropriate algorithms are developed based on Markov chain Monte Carlo (MCMC) Gibbs sampling. At the end, we have not only an estimate of the HR image but also an estimate of the classification labels, which leads to a segmentation result.
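The forward observation model assumed above (lowpass filter, translation, decimation, noise) can be sketched directly; the box-blur kernel, decimation factor, and all names below are illustrative stand-ins, not the paper's actual point-spread function or code:

```python
import numpy as np

# Sketch of the LR observation model: each LR frame is
# blur -> translate -> decimate -> additive noise applied to the HR image.

def lowpass(x):
    """3x3 box blur with edge padding (stand-in for the true PSF)."""
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + x.shape[0], j:j + x.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def observe(hr, shift=(0, 0), factor=2, noise_std=0.0, rng=None):
    x = np.roll(lowpass(hr), shift, axis=(0, 1))   # blur, then translate
    lr = x[::factor, ::factor]                     # decimate
    if noise_std > 0:
        rng = rng or np.random.default_rng()
        lr = lr + rng.normal(0.0, noise_std, lr.shape)
    return lr
```

The Bayesian SR problem is then to invert this mapping for several LR frames at once, with the FMM/Potts prior regularizing the HR estimate.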

  16. A Framework for Incorporating Dyads in Models of HIV-Prevention

    PubMed Central

    Hops, Hyman; Redding, Colleen A.; Reis, Harry T.; Rothman, Alexander J.; Simpson, Jeffry A.

    2014-01-01

    Although HIV is contracted by individuals, it is typically transmitted in dyads. Most efforts to promote safer sex practices, however, focus exclusively on individuals. The goal of this paper is to provide a theoretical framework that specifies how models of dyadic processes and relationships can inform models of HIV-prevention. At the center of the framework is the proposition that safer sex between two people requires a dyadic capacity for successful coordination. According to this framework, relational, individual, and structural variables that affect the enactment of safer sex do so through their direct and indirect effects on that dyadic capacity. This dyadic perspective does not require an ongoing relationship between two individuals; rather, it offers a way of distinguishing between dyads along a continuum from anonymous strangers (with minimal coordination of behavior) to long-term partners (with much greater coordination). Acknowledging the dyadic context of HIV-prevention offers new targets for interventions and suggests new approaches to tailoring interventions to specific populations. PMID:20838872

  17. Comparison of mathematical frameworks for modeling erythropoiesis in the context of malaria infection.

    PubMed

    Fonseca, Luis L; Voit, Eberhard O

    2015-12-01

    Malaria is an infectious disease present all around the globe and responsible for half a million deaths per year. A within-host model of this infection requires a framework capable of properly approximating not only the blood stage of the infection but also the erythropoietic process that is in charge of overcoming the malaria induced anemia. Within this context, we compare ordinary differential equations (ODEs) with and without age classes, delayed differential equations (DDEs), and discrete recursive equations (DREs) with age classes. Results show that ODEs without age classes are fair approximations that do not provide a crisp temporal representation of the processes involved, and inclusion of age classes only mitigates the problem to some degree. DDEs perform well with respect to generating the essentially fixed delay between cell production and cell removal due to age, but the inclusion of any other processes, such as sudden blood loss, becomes cumbersome. The framework that was found to perform best in representing the dynamics of red blood cells during malaria infection is a DRE with age classes. In this model structure, the amount of time a cell remains alive is easily controlled, and the addition of age dependent or independent processes is straightforward. All events that populations of cells face during their lifespan, like growth or adaptation in differentiation or maturation rate, are properly represented in this framework. PMID:26362230
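The DRE-with-age-classes structure that the authors favor can be sketched in a few lines; the lifespan, production rate, and loss parameter below are illustrative placeholders, not fitted malaria values:

```python
# Discrete recursive equation (DRE) with age classes: each day every red
# blood cell advances one age class, and the oldest class is removed.

def step(ages, production, infection_loss=0.0):
    """ages[i] = cells of age i days; returns the next day's age vector.
    infection_loss is an optional age-independent removal fraction."""
    survived = [a * (1.0 - infection_loss) for a in ages]
    return [production] + survived[:-1]   # new cells enter; oldest class exits

ages = [100.0] * 120                      # ~120-day RBC lifespan, uniform start
ages = step(ages, production=100.0)
```

As the abstract notes, this structure makes the cell lifespan explicit and lets age-dependent processes (e.g. parasite-induced removal) be added per class without the bookkeeping burden that delays impose on DDEs.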

  18. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables, e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
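The equal- versus metric-weighted averaging being compared can be sketched as follows; the function and the notion of a per-model "skill" score are illustrative assumptions (in the study, scores would come from a process-based metric such as the simulated OLR-surface-temperature relationship):

```python
import numpy as np

# Equal-weighted vs. skill-weighted multi-model ensemble mean (sketch).

def ensemble_mean(projections, skill=None):
    """projections: (n_models, ...) array of model fields;
    skill: optional per-model scores (higher = better)."""
    p = np.asarray(projections, dtype=float)
    if skill is None:
        return p.mean(axis=0)              # equal weighting (IPCC-style)
    w = np.asarray(skill, dtype=float)
    w = w / w.sum()                        # normalize scores into weights
    return np.tensordot(w, p, axes=1)      # weighted mean over the model axis
```

Comparing `ensemble_mean(p)` with `ensemble_mean(p, skill=...)` region by region is the kind of weighted/unweighted contrast the abstract reports for tropical precipitation.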

  19. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  20. A process-based rejectionist framework for evaluating catchment runoff model structure

    NASA Astrophysics Data System (ADS)

    Vaché, Kellie B.; McDonnell, Jeffrey J.

    2006-02-01

    Complex hydrological descriptions at the hillslope scale have been difficult to incorporate within a catchment modeling framework because of the disparity between the scale of measurements and the scale of model subunits. As a result, parameters represented in many conceptual models are often not related to physical properties and therefore cannot be established prior to a model calibration. While tolerable for predictions involving water quantity, water quality simulations require additional attention to transport processes, flow path sources, and water age. This paper examines how isotopic estimates of residence time may be used to subsume flow path process complexity and to provide a simple, scalable evaluative data source for water quantity- and quality-based conceptual models. We test a set of simple distributed hydrologic models (from simple to more complex) against measured discharge and residence time and employ a simple Monte Carlo framework to evaluate the identifiability of parameters and how the inclusion of residence time contributes to the evaluative process. Results indicate that of the models evaluated, only the most complex, including an explicit unsaturated zone volume and an effective porosity, successfully reproduced both discharge dynamics and residence time. In addition, the inclusion of residence time in the evaluation of the accepted models results in a reduction of the a posteriori parameter uncertainty. Results from this study support the conclusion that the incorporation of soft data, in this case, isotopically estimated residence times, in model evaluation is a useful mechanism to bring experimental evidence into the process of model evaluation and selection, thereby providing one mechanism to further reconcile hillslope-scale complexity with catchment-scale simplicity.
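The rejectionist Monte Carlo evaluation described above, accepting only parameter sets whose simulations reproduce both discharge and residence time, can be sketched as follows. The stand-in model, parameter bounds, and acceptance limits are all hypothetical; the study used distributed catchment models and isotopically estimated residence times:

```python
import random

# Rejectionist (limits-of-acceptability) sampling sketch: keep only the
# parameter sets whose simulated discharge efficiency AND residence time
# both fall within tolerance of the observations.

def simulate(params):
    """Stand-in for a conceptual catchment model; returns a
    (discharge efficiency, mean residence time in days) pair."""
    k, porosity = params
    return 1.0 - abs(k - 0.3), 100.0 * porosity / max(k, 1e-6)

def behavioural_sets(n=10000, eff_min=0.9, rt_obs=150.0, rt_tol=30.0):
    accepted = []
    for _ in range(n):
        params = (random.uniform(0.05, 0.6),   # hypothetical recession constant
                  random.uniform(0.2, 0.6))    # hypothetical effective porosity
        eff, rt = simulate(params)
        if eff >= eff_min and abs(rt - rt_obs) <= rt_tol:
            accepted.append(params)
    return accepted
```

The spread of the accepted sets relative to the sampled bounds is what quantifies the reduction in a posteriori parameter uncertainty that the residence-time constraint provides.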

  1. Groundwater modelling as a tool for the European Water Framework Directive (WFD) application: The Llobregat case

    NASA Astrophysics Data System (ADS)

    Vázquez-Suñé, E.; Abarca, E.; Carrera, J.; Capino, B.; Gámez, D.; Pool, M.; Simó, T.; Batlle, F.; Niñerola, J. M.; Ibáñez, X.

    The European Water Framework Directive establishes the basis for Community action in the field of water policy. Water authorities in Catalonia, together with users, are designing a management program to improve groundwater status and to assess the impact of infrastructures and city-planning activities on the aquifers and their associated natural systems. The objective is to describe the role of groundwater modelling in addressing the issues raised by the Water Framework Directive, and its application to the Llobregat Delta, Barcelona, Spain. In this case, modelling was used to address the Water Framework Directive in the following ways: (1) Characterisation of aquifers and the status of groundwater by integration of existing knowledge and new hydrogeological information; inverse modelling allowed us to reach an accurate description of the paths and mechanisms for the evolution of seawater intrusion. (2) Quantification of the groundwater budget (mass balance); this is especially relevant for those terms that are difficult to assess, such as recharge from river infiltration during floods, which we have found to be very important. (3) Evaluation of groundwater-related environmental needs in aquatic ecosystems; the model allows quantification of groundwater input under natural conditions, which can be used as a reference level for stressed conditions. (4) Evaluation of possible impacts of territorial planning (Llobregat river course modification, new railway tunnels, airport and docks enlargement, etc.). (5) Definition of management areas. (6) Assessment of possible future scenarios, combined with optimization processes, to quantify sustainable pumping rates and design measures to control seawater intrusion. The resulting model has been coupled to a user-friendly interface to allow water managers to design and address corrective measures in an agile and effective way.

  2. Construction of 3-D geologic framework and textural models for Cuyama Valley groundwater basin, California

    USGS Publications Warehouse

    Sweetkind, Donald S.; Faunt, Claudia C.; Hanson, Randall T.

    2013-01-01

    Groundwater is the sole source of water supply in Cuyama Valley, a rural agricultural area in Santa Barbara County, California, in the southeasternmost part of the Coast Ranges of California. Continued groundwater withdrawals and associated water-resource management concerns have prompted an evaluation of the hydrogeology and water availability for the Cuyama Valley groundwater basin by the U.S. Geological Survey, in cooperation with the Water Agency Division of the Santa Barbara County Department of Public Works. As a part of the overall groundwater evaluation, this report documents the construction of a digital three-dimensional geologic framework model of the groundwater basin suitable for use within a numerical hydrologic-flow model. The report also includes an analysis of the spatial variability of lithology and grain size, which forms the geologic basis for estimating aquifer hydraulic properties. The geologic framework was constructed as a digital representation of the interpreted geometry and thickness of the principal stratigraphic units within the Cuyama Valley groundwater basin, which include younger alluvium, older alluvium, and the Morales Formation, and underlying consolidated bedrock. The framework model was constructed by creating gridded surfaces representing the altitude of the top of each stratigraphic unit from various input data, including lithologic and electric logs from oil and gas wells and water wells, cross sections, and geologic maps. Sediment grain-size data were analyzed in both two and three dimensions to help define textural variations in the Cuyama Valley groundwater basin and identify areas with similar geologic materials that potentially have fairly uniform hydraulic properties. Sediment grain size was used to construct three-dimensional textural models that employed simple interpolation between drill holes and two-dimensional textural models for each stratigraphic unit that incorporated spatial structure of the textural data.

  3. Implementations of a Flexible Framework for Managing Geologic Sequestration Modeling Projects

    SciTech Connect

    White, Signe K.; Gosink, Luke J.; Sivaramakrishnan, Chandrika; Black, Gary D.; Purohit, Sumit; Bacon, Diana H.; Hou, Zhangshuan; Lin, Guang; Gorton, Ian; Bonneville, Alain

    2013-08-06

    Numerical simulation is a standard practice used to support designing, operating, and monitoring CO2 injection projects. Although a variety of computational tools have been developed that support the numerical simulation process, many are single-purpose or platform-specific and have a prescribed workflow that may or may not be suitable for a particular project. We are developing an open-source, flexible framework named Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for various types of projects in a number of scientific domains. The Geologic Sequestration Software Suite (GS3) is a version of this framework with features and tools specifically tailored for geologic sequestration studies. Because of its general nature, GS3 is being employed in a variety of ways on projects with differing goals. GS3 is being used to support the Sim-SEQ international model comparison study, by providing a collaborative framework for the modeling teams and providing tools for model comparison. Another customized deployment of GS3 has been made to support the permit application process. In this case, GS3 is being used to manage data in support of conceptual model development and provide documentation and provenance for numerical simulations. An additional customized deployment of GS3 is being created for use by the United States Environmental Protection Agency (US-EPA) to aid in the CO2 injection permit application review process in one of its regions. These use cases demonstrate GS3’s flexibility, utility, and broad applicability.

  4. An Information-Theoretic Framework for Improving Imperfect Dynamical Predictions Via Multi-Model Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Branicki, Michal; Majda, Andrew J.

    2015-06-01

    This work focuses on elucidating issues related to an increasingly common technique of multi-model ensemble (MME) forecasting. The MME approach is aimed at improving the statistical accuracy of imperfect time-dependent predictions by combining information from a collection of reduced-order dynamical models. Despite some operational evidence in support of the MME strategy for mitigating the prediction error, the mathematical framework justifying this approach has been lacking. Here, this problem is considered within a probabilistic/stochastic framework which exploits tools from information theory to derive a set of criteria for improving probabilistic MME predictions relative to single-model predictions. The emphasis is on a systematic understanding of the benefits and limitations associated with the MME approach, on uncertainty quantification, and on the development of practical design principles for constructing an MME with improved predictive performance. The conditions for prediction improvement via the MME approach stem from the convexity of the relative entropy which is used here as a measure of the lack of information in the imperfect models relative to the resolved characteristics of the truth dynamics. It is also shown how practical guidelines for MME prediction improvement can be implemented in the context of forced response predictions from equilibrium with the help of the linear response theory utilizing the fluctuation-dissipation formulas at the unperturbed equilibrium. The general theoretical results are illustrated using exactly solvable stochastic non-Gaussian test models.
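    The convexity argument invoked here can be checked directly. Below is a minimal numerical sketch (not from the paper; the distributions and weight are invented) showing that the relative entropy of the truth with respect to a mixture of imperfect model densities never exceeds the weighted average of the single-model relative entropies:

```python
import numpy as np

def kl(p, q):
    """Discrete relative entropy D(p || q) in nats."""
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
# "Truth" and two imperfect model distributions on 10 states.
p = rng.dirichlet(np.ones(10))
q1 = rng.dirichlet(np.ones(10))
q2 = rng.dirichlet(np.ones(10))

w = 0.5  # MME weight (illustrative)
mme = w * q1 + (1 - w) * q2

# Convexity of D(p || .) in its second argument: the multi-model
# mixture is never worse than the average single-model error.
assert kl(p, mme) <= w * kl(p, q1) + (1 - w) * kl(p, q2)
```

The inequality holds for any weights and densities, which is the sense in which MME predictions can be guaranteed not to degrade the averaged information content.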

  5. A Modeling Framework to Incorporate Effects of Infrastructure in Sociohydrological Systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.

    2014-12-01

    In studying coupled natural-human systems, most modeling efforts focus on humans and the natural resources. In reality, however, humans rarely interact with these resources directly; the relationships between humans and resources are mediated by infrastructures. In sociohydrological systems, these include, for example, dams and irrigation canals. These infrastructures have important characteristics such as threshold behavior and a separate entity/organization tasked with maintaining them. These characteristics influence social dynamics within the system, which in turn determine the state of infrastructure and water usage, thereby exerting feedbacks onto the hydrological processes. Infrastructure is thus a necessary ingredient for modeling the co-evolution of humans and water in sociohydrological systems. A conceptual framework to address this gap has been proposed by Anderies, Janssen, and Ostrom (2004). Here we develop a model to operationalize the framework and report some preliminary results. Simple in its setup, the model highlights the structure of the social dilemmas and how it affects the system's sustainability. The model also offers a platform to explore how the system's sustainability may respond to external shocks from globalization and global climate change.

  6. Inferring landscape effects on gene flow: a new model selection framework.

    PubMed

    Shirk, A J; Wallin, D O; Cushman, S A; Rice, C G; Warheit, K I

    2010-09-01

    Populations in fragmented landscapes experience reduced gene flow, lose genetic diversity over time and ultimately face greater extinction risk. Improving connectivity in fragmented landscapes is now a major focus of conservation biology. Designing effective wildlife corridors for this purpose, however, requires an accurate understanding of how landscapes shape gene flow. Most landscape resistance models generated to date are subjectively parameterized based on expert opinion or proxy measures of gene flow. While the relatively few studies that use genetic data are more rigorous, the frameworks they employ frequently yield models only weakly related to the observed patterns of genetic isolation. Here, we describe a new framework that uses expert opinion as a starting point. By systematically varying each model parameter, we sought to either validate the assumptions of expert opinion, or identify a peak of support for a new model more highly related to genetic isolation. This approach also accounts for interactions between variables, allows for nonlinear responses and excludes variables that reduce model performance. We demonstrate its utility on a population of mountain goats inhabiting a fragmented landscape in the Cascade Range, Washington. PMID:20723066

  7. Strategic approaches to drug design. I. An integrated software framework for molecular modelling.

    PubMed

    Vinter, J G; Davis, A; Saunders, M R

    1987-04-01

    An integrated molecular graphics and computational chemistry framework is described which has been designed primarily to handle small molecules of up to 300 atoms. The system provides a means of integrating software from any source into a single framework. It is split into two functional subsystems. The first subsystem, called COSMIC, runs on low-cost, serial-linked colour graphics terminals and allows the user to prepare and examine structural data and to submit them for extensive computational chemistry. Links also allow access to databases, other modelling systems and user-written modules. Much of the output from COSMIC cannot be examined with low level graphics. A second subsystem, called ASTRAL, has been developed for the high-resolution Evans & Sutherland PS300 colour graphics terminal and is designed to manipulate complex display structures. The COSMIC minimisers, geometry investigators, molecular orbital displays, electrostatic isopotential generators and various interfaces and utilities are described. PMID:3505586

  8. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

    USGS Publications Warehouse

    Devineau, Olivier; Kendall, William L.; Doherty, Paul F., Jr.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

    2014-01-01

    Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.
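    The claim that known-fate telemetry models are a special case of the multistate framework can be illustrated with a toy two-state (alive/dead) likelihood; the survival value and encounter history below are invented for illustration, not taken from the case studies:

```python
import numpy as np

# Sketch (not the authors' code): known-fate survival as a multistate
# model with states alive (A) and dead (D) and perfect detection.
S = 0.9  # per-interval survival probability (illustrative)

# State transition matrix: rows = current state, cols = next state.
Psi = np.array([[S, 1 - S],   # alive -> alive / dead
                [0.0, 1.0]])  # dead is absorbing

# Encounter history 'AAAD': alive for two intervals, then found dead.
# Its likelihood is the product of the corresponding transitions.
states = "AAAD"
idx = {"A": 0, "D": 1}
lik = 1.0
for a, b in zip(states, states[1:]):
    lik *= Psi[idx[a], idx[b]]

# Matches the classical known-fate binomial form S^2 * (1 - S).
assert abs(lik - S**2 * (1 - S)) < 1e-12
```

The flexibility the abstract refers to comes from enlarging the state space (e.g., adding location or breeding states) without changing this basic product-of-transitions likelihood.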

  9. A second gradient theoretical framework for hierarchical multiscale modeling of materials

    SciTech Connect

    Luscher, Darby J; Bronkhorst, Curt A; Mc Dowell, David L

    2009-01-01

    A theoretical framework for the hierarchical multiscale modeling of the inelastic response of heterogeneous materials is presented. Within this multiscale framework, the second gradient is used as a nonlocal kinematic link between the response of a material point at the coarse scale and the response of a neighborhood of material points at the fine scale. Kinematic consistency between these scales results in specific requirements for constraints on the fluctuation field. The wryness tensor serves as a second-order measure of strain. The nature of the second-order strain induces antisymmetry in the first-order stress at the coarse scale. The multiscale ISV constitutive theory is couched in the coarse-scale intermediate configuration, from which an important new concept in scale transitions emerges, namely scale invariance of dissipation. Finally, a strategy for developing meaningful kinematic ISVs and the proper free energy functions and evolution kinetics is presented.

  10. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" Between Physical Experiments and Virtual Models in Biology

    NASA Astrophysics Data System (ADS)

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-08-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this study, a group of high school students designed computer models of bacterial growth with reference to a simultaneous physical experiment they were conducting, and were able to validate the correctness of their model against the results of their experiment. Our findings suggest that as the students compared their virtual models with physical experiments, they encountered "discrepant events" that contradicted their existing conceptions and elicited a state of cognitive disequilibrium. This experience of conflict encouraged students to further examine their ideas and to seek more accurate explanations of the observed natural phenomena, improving the design of their computer models.

  12. Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support

    NASA Astrophysics Data System (ADS)

    Djokic, D.; Noman, N.; Kopp, S.

    2015-12-01

    Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open-source Python tools that build on core ArcGIS functionality and use geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to local hydraulic scale, post-process the hydraulic modeling results, and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available, they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end-user products.

  13. A New Framework for Effective and Efficient Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin

    2015-04-01

    Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex, with increasingly more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based on different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem. (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficient we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of the metric 'response surfaces' encountered when performing SA on EES models. Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', including commonly-used SA approaches (e.g., Sobol').
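    As a rough illustration of the variance-based notion of sensitivity that global SA frameworks build on (this is a crude textbook estimator, not the authors' method; the toy model, bin count, and sample size are all assumptions), a first-order index can be estimated by binning each factor and comparing the variance of the conditional means to the total output variance:

```python
import numpy as np

def first_order_indices(f, d, n=20000, n_bins=20, seed=0):
    """Crude first-order sensitivity indices via binned
    conditional-variance estimation (illustrative only)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0, (n, d))
    y = f(X)
    var_y = y.var()
    S = []
    for i in range(d):
        # Bin factor i and compute the variance of conditional means.
        bins = np.digitize(X[:, i], np.linspace(0.0, 1.0, n_bins + 1))
        means = [y[bins == b].mean() for b in np.unique(bins)]
        S.append(float(np.var(means) / var_y))
    return S

# Toy model: x1 dominates the response, x2 is nearly inert.
f = lambda X: X[:, 0] + 0.1 * X[:, 1]
S1, S2 = first_order_indices(f, 2)
assert S1 > S2  # the dominant factor gets the larger index
```

The "computational cost" challenge in the abstract is visible even here: the estimator needs thousands of model runs, which motivates the more efficient index constructions the authors propose.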

  14. Model Simulations of the Diurnal and Seasonal Variations of the Global Electric Circuit Using a Consistent 3D Model Framework

    NASA Astrophysics Data System (ADS)

    Lucas, G.; Bayona, V.; Flyer, N.; Baumgaertner, A. J. G.; Thayer, J. P.

    2014-12-01

    We introduce a new numeric solver for the partial differential equations of the Global Electric Circuit (GEC). The model is applied to derive the ionospheric potential with respect to the Earth, as well as the current distribution and electric fields throughout the atmosphere. We will discuss its advantages to previously published approaches, and introduce the model's application within a larger model framework that consistently describes the thunderstorm/electrified cloud current source distribution and conductivity. The new source and conductivity distributions will be utilized in the new numeric GEC solver to demonstrate the effect that temporal and spatial variability of these inputs have on electric fields and currents throughout the domain.

  15. HyDE Framework for Stochastic and Hybrid Model-Based Diagnosis

    NASA Technical Reports Server (NTRS)

    Narasimhan, Sriram; Brownston, Lee

    2012-01-01

    Hybrid Diagnosis Engine (HyDE) is a general framework for stochastic and hybrid model-based diagnosis that offers flexibility to the diagnosis application designer. The HyDE architecture supports the use of multiple modeling paradigms at the component and system level. Several alternative algorithms are available for the various steps in diagnostic reasoning. This approach is extensible, with support for the addition of new modeling paradigms as well as diagnostic reasoning algorithms for existing or new modeling paradigms. HyDE is a general framework for stochastic hybrid model-based diagnosis of discrete faults; that is, spontaneous changes in operating modes of components. HyDE combines ideas from consistency-based and stochastic approaches to model- based diagnosis using discrete and continuous models to create a flexible and extensible architecture for stochastic and hybrid diagnosis. HyDE supports the use of multiple paradigms and is extensible to support new paradigms. HyDE generates candidate diagnoses and checks them for consistency with the observations. It uses hybrid models built by the users and sensor data from the system to deduce the state of the system over time, including changes in state indicative of faults. At each time step when observations are available, HyDE checks each existing candidate for continued consistency with the new observations. If the candidate is consistent, it continues to remain in the candidate set. If it is not consistent, then the information about the inconsistency is used to generate successor candidates while discarding the candidate that was inconsistent. The models used by HyDE are similar to simulation models. They describe the expected behavior of the system under nominal and fault conditions. The model can be constructed in modular and hierarchical fashion by building component/subsystem models (which may themselves contain component/ subsystem models) and linking them through shared variables/parameters.

  16. Land-Atmosphere Coupling in the Multi-Scale Modelling Framework

    NASA Astrophysics Data System (ADS)

    Kraus, P. M.; Denning, S.

    2015-12-01

    The Multi-Scale Modeling Framework (MMF), in which cloud-resolving models (CRMs) are embedded within general circulation model (GCM) gridcells to serve as the model's cloud parameterization, has offered a number of benefits to GCM simulations. The coupling of these cloud-resolving models directly to land surface model instances, rather than passing averaged atmospheric variables to a single instance of a land surface model, the logical next step in model development, has recently been accomplished. This new configuration offers conspicuous improvements to estimates of precipitation and canopy through-fall, but overall the model exhibits warm surface temperature biases and low productivity. This work presents modifications to a land-surface model that take advantage of the new multi-scale modeling framework and accommodate the change in spatial scale from a typical GCM range of ~200 km to the CRM grid scale of 4 km. A parameterization is introduced to apportion modeled surface radiation into direct-beam and diffuse components. The diffuse component is then distributed among the land-surface model instances within each GCM cell domain. This substantially reduces the number of excessively low light values provided to the land-surface model when cloudy conditions are modeled in the CRM, associated with its 1-D radiation scheme. The small spatial scale of the CRM, ~4 km, as compared with the typical ~200 km GCM scale, provides much more realistic estimates of precipitation intensity; this permits the elimination of a model parameterization of canopy through-fall. However, runoff at such scales can no longer be considered as an immediate flow to the ocean. Allowing sub-surface water flow between land-surface instances within the GCM domain affords better realism and also reduces temperature and productivity biases. The MMF thus affords a number of opportunities to land-surface modelers, providing the advantages of direct simulation at the 4 km scale.

  17. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling

    PubMed Central

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2013-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster. PMID:24163721

  18. Exact matrix treatment of an osmotic ensemble model of adsorption and pressure induced structural transitions in metal organic frameworks.

    PubMed

    Dunne, Lawrence J; Manos, George

    2016-03-14

    Here we present an exactly treated quasi-one dimensional statistical mechanical osmotic ensemble model of pressure and adsorption induced breathing structural transformations of metal-organic frameworks (MOFs). The treatment uses a transfer matrix method. The model successfully reproduces the gas and pressure induced structural changes which are observed experimentally in MOFs. The model treatment presented here is a significant step towards analytical statistical mechanical treatments of flexible metal-organic frameworks. PMID:26514851
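    The transfer matrix technique can be sketched in a few lines for a generic quasi-one-dimensional lattice model. This is a textbook toy, not the paper's MOF osmotic-ensemble model, and all parameter values are invented: each cell is empty (0) or holds one adsorbed molecule (1), with nearest-neighbour interaction energy eps and chemical potential mu.

```python
import numpy as np

beta, eps, mu, N = 1.0, -0.3, 0.2, 50  # illustrative values

# Transfer matrix T[s, t] = exp(-beta * (eps*s*t - mu*(s + t)/2));
# the mu term is split between neighbouring cells.
T = np.array([[np.exp(-beta * (eps * s * t - mu * (s + t) / 2))
               for t in (0, 1)] for s in (0, 1)])

# Grand partition function with periodic boundaries: Xi = Tr(T^N),
# dominated by the largest eigenvalue for large N.
Xi = np.trace(np.linalg.matrix_power(T, N))
lam = np.linalg.eigvalsh(T).max()  # T is symmetric here
assert abs(np.log(Xi) - N * np.log(lam)) / (N * np.log(lam)) < 1e-6
```

Observables such as the mean occupancy follow from derivatives of log Xi with respect to mu; breathing-type transitions appear when the model is extended with host-framework degrees of freedom, as in the paper.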

  19. A statistical framework for the validation of a population exposure model based on personal exposure data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations and infiltration of outdoor air-pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and therefore personal exposure data should become more and more accessible. In view of planned cohort field-campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data that are estimated at the county population scale to the individual scale, which is appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.
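    The core Bayesian update such a framework performs can be sketched in its simplest conjugate form. This is a single-level Gaussian illustration with invented numbers, not the authors' hierarchical model, which assimilates many individuals at once:

```python
# Sketch: county-scale model output gives a population prior on an
# individual's exposure; a personal sensor reading updates it.
mu_pop, sd_pop = 18.0, 6.0   # modelled county-level PM2.5 (ug/m3)
y_obs, sd_obs = 9.0, 2.0     # personal sensor reading and its noise

# Posterior for the individual's true exposure (precision weighting).
prec = 1 / sd_pop**2 + 1 / sd_obs**2
mu_post = (mu_pop / sd_pop**2 + y_obs / sd_obs**2) / prec
sd_post = prec ** -0.5

# The posterior sits between model and measurement, closer to the
# more precise source, and is sharper than either alone.
assert min(y_obs, mu_pop) < mu_post < max(y_obs, mu_pop)
assert sd_post < min(sd_pop, sd_obs)
```

A full hierarchical version would additionally place priors on the population parameters themselves, so that the personal measurements also refine the county-scale statistics.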

  20. Developing a Model of Practice: Designing a Framework for the Professional Development of School Leaders and Managers.

    ERIC Educational Resources Information Center

    Reeves, Jenny; Forde, Christine; Casteel, Viv; Lynas, Richard

    1998-01-01

    Describes the origins and evolution of a framework for leadership and management development in Scottish schools. The design of this competence framework is underpinned by a professional-development model supporting experiential learning and critical reflection. Calls for a synthesis of various approaches to management development based on a…

  1. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  2. Revisions to the PARCC Model Content Frameworks for Mathematics and ELA/Literacy Based on Public Feedback

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    The PARCC Model Content Frameworks for Mathematics and ELA/Literacy have been developed through a state-led process in collaboration with members of the Common Core State Standards (CCSS) writing teams. The frameworks were reviewed by the public between August 3-31, 2011. Nearly 1,000 responses were collected, and respondents included K-12…

  3. The Behavioral Ecological Model as a Framework for School-Based Anti-Bullying Health Promotion Interventions

    ERIC Educational Resources Information Center

    Dresler-Hawke, Emma; Whitehead, Dean

    2009-01-01

    This article presents a conceptual strategy which uses the Behavioral Ecological Model (BEM) as a health promotion framework to guide school-based bullying awareness programs and subsequent anti-bullying strategies for school nursing practice. Anti-bullying frameworks and tools are scarce despite the extent of the problem of bullying. This article…

  4. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    ERIC Educational Resources Information Center

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical…

  5. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR and PARMA), and disaggregation models, which aim to preserve the correlation structure at the periodic level and the aggregated annual level; ii) nonparametric models, which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws (examples are bootstrap/kernel-based methods such as the k-nearest neighbor (k-NN) and matched block bootstrap (MABB) approaches, and the non-parametric disaggregation model); and iii) hybrid models, which blend parametric and non-parametric models advantageously to model streamflows effectively. Despite these developments in the stochastic modeling of streamflows over the last four decades, accurate prediction of storage and critical drought characteristics has remained a persistent challenge. This is partly because stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation), and the efficacy of the models is subsequently validated by the accuracy with which they predict the water-use characteristics, which requires a large number of trial simulations and the inspection of many plots and tables; even then, accurate prediction of storage and critical drought characteristics is not ensured. In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (blend of a simple parametric model, PAR(1) model and matched block
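    As an illustration of the parametric route mentioned above, a periodic autoregressive PAR(1) generator can be sketched in a few lines. The seasonal parameter values below are purely illustrative, not fitted to any streamflow record:

```python
import numpy as np

def generate_par1(means, stds, phis, n_years, seed=0):
    """Generate synthetic multi-season flows from a PAR(1) model.

    means, stds, phis: per-season mean, standard deviation and lag-1
    autoregressive coefficient. Returns an (n_years, n_seasons) array.
    """
    n_seasons = len(means)
    rng = np.random.default_rng(seed)
    flows = np.empty((n_years, n_seasons))
    z_prev = 0.0                      # standardized flow of the previous season
    for y in range(n_years):
        for s in range(n_seasons):
            eps = rng.standard_normal()
            # innovation variance chosen so that Var(z) stays 1
            z = phis[s] * z_prev + np.sqrt(1.0 - phis[s] ** 2) * eps
            flows[y, s] = means[s] + stds[s] * z
            z_prev = z
    return flows

# Illustrative 4-season parameters (not fitted to any record)
flows = generate_par1(means=[100.0, 400.0, 250.0, 80.0],
                      stds=[20.0, 90.0, 60.0, 15.0],
                      phis=[0.4, 0.3, 0.5, 0.6],
                      n_years=1000)
```

    Because each season's standardized flow depends on the previous season's, the generator carries a periodic lag-1 correlation structure while matching the prescribed seasonal means and standard deviations.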

  6. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    PubMed

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. PMID:26018209

  7. How are organisational climate models and patient satisfaction related? A competing value framework approach.

    PubMed

    Ancarani, Alessandro; Di Mauro, Carmela; Giammanco, Maria Daniela

    2009-12-01

    Patient satisfaction has become an important indicator of process quality inside hospitals. Even so, the improvement of patient satisfaction cannot simply follow from the implementation of new incentive schemes and organisational arrangements; it also depends on hospitals' cultures and climates. This paper studies the impact of alternative models of organisational climate in hospital wards on patient satisfaction. Data gathered from seven public hospitals in Italy are used to explore this relationship. The theoretical approach adopted is the Competing Value Framework which classifies organisations according to their inward or outward focus and according to the importance assigned to control vs. flexibility. Results show that both a model stressing openness, change and innovation and a model emphasising cohesion and workers' morale are positively related to patient satisfaction, while a model based on managerial control is negatively associated with patient satisfaction. PMID:19850393

  8. Threat driven modeling framework using petri nets for e-learning system.

    PubMed

    Khamparia, Aditya; Pandey, Babita

    2016-01-01

    Vulnerabilities at various levels are the main cause of security risks in e-learning systems. This paper presents a modified threat-driven modeling framework to identify, after risk assessment, the threats that require mitigation and how to mitigate them. To model these threat mitigations, aspect-oriented stochastic Petri nets are used. The paper includes security metrics based on vulnerabilities present in the e-learning system; the Common Vulnerability Scoring System, designed to provide a normalized method for rating vulnerabilities, is used as the basis for the metric definitions and calculations. A case study is also presented that shows the need for and feasibility of using aspect-oriented stochastic Petri net models for threat modeling, which improves the reliability, consistency and robustness of the e-learning system. PMID:27119050

  9. A deep learning framework for modeling structural features of RNA-binding protein targets

    PubMed Central

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-01-01

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found in https

  10. A deep learning framework for modeling structural features of RNA-binding protein targets.

    PubMed

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-02-29

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating additional RNA tertiary structural features can improve model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, tests on internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found in https

  11. Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.

    2002-05-01

    Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are the development and implementation of an uncertainty estimation methodology for use in future assessments and analyses made with the Hanford site-wide groundwater model. The basic approach of the uncertainty assessment framework consists of: 1) Alternate conceptual model (ACM) identification, to identify and document the major features and assumptions of each conceptual model; the process must also include a periodic review of existing and proposed new conceptual models as data or understanding become available. 2) ACM development, in which each identified conceptual model is developed through inverse modeling with historical site data. 3) ACM evaluation, to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments, carried out only for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) model complexity optimization, to identify the important or relevant parameters for the uncertainty analysis; b) characterization of parameter uncertainty, to develop the pdfs for the important uncertain parameters, including identification of any correlations among parameters; c) propagation of uncertainty, to propagate parameter uncertainties (e.g., by first-order second-moment methods if applicable, or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum, with each component of the inner sum (an individual CCDF

  12. A Multi-model Data Assimilation Framework Via Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Xue, L.; Zhang, D.

    2013-12-01

    The ensemble Kalman filter (EnKF) is a widely used data assimilation method that has the capacity to sequentially update system parameters and states as new observations become available. One notable feature of the EnKF is that it not only provides optimal updates of model parameters and state variables, but also gives the uncertainty associated with them at each assimilation step. The natural system is open and complex, rendering it prone to multiple interpretations and mathematical descriptions. Bayesian model averaging (BMA) is an effective method to account for the uncertainty stemming from the model itself. In this paper, the EnKF is embedded into the BMA framework, where the individual posterior probability distributions of the state vectors after each assimilation step are linearly integrated through posterior model weights. A two-dimensional illustrative example is employed to demonstrate the proposed multi-model data assimilation approach via the EnKF. Results show that statistical bias and uncertainty underestimation can occur when the data assimilation process relies on a single postulated model. The posterior model weights can adjust themselves dynamically in time according to their consistency with observations. The performance of log-conductivity estimation and head prediction is compared between the standard EnKF based on a single postulated model and the proposed multi-model EnKF. Comparisons show that the multi-model EnKF performs better in terms of log score and coverage once sufficient observations have been assimilated.
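    The two ingredients the abstract combines, an EnKF update and likelihood-based BMA weighting, can be sketched minimally as follows. This is a generic textbook-style formulation, not the authors' implementation; the scalar-observation form and the Gaussian likelihood are simplifying assumptions:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H, rng):
    """Stochastic EnKF update of an (n_state, n_ens) ensemble against
    a single scalar observation. H maps the ensemble to predicted obs."""
    n_ens = ensemble.shape[1]
    y = H(ensemble)                                   # (n_ens,) predictions
    x_mean = ensemble.mean(axis=1, keepdims=True)
    Pxy = (ensemble - x_mean) @ (y - y.mean()) / (n_ens - 1)
    Pyy = np.var(y, ddof=1) + obs_std ** 2
    K = Pxy / Pyy                                     # Kalman gain, (n_state,)
    perturbed_obs = obs + obs_std * rng.standard_normal(n_ens)
    return ensemble + K[:, None] * (perturbed_obs - y)[None, :]

def bma_weights(prior_w, y_preds, obs, obs_std):
    """Update model weights by each model's mean Gaussian predictive likelihood."""
    liks = np.array([np.mean(np.exp(-0.5 * ((obs - yp) / obs_std) ** 2))
                     for yp in y_preds])
    w = prior_w * liks
    return w / w.sum()

rng = np.random.default_rng(1)
ens = rng.standard_normal((1, 500))                   # prior: mean ~0, var ~1
post = enkf_update(ens, obs=5.0, obs_std=0.5, H=lambda e: e[0], rng=rng)
w = bma_weights(np.array([0.5, 0.5]),
                [np.full(500, 5.0), np.full(500, -5.0)], obs=5.0, obs_std=0.5)
```

    With prior variance near 1 and observation variance 0.25, the gain is about 0.8, so the posterior ensemble mean moves most of the way toward the observation while its spread shrinks; the weight of the model predicting near the observation approaches one.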

  13. A Two-Part Mixed-Effects Modeling Framework For Analyzing Whole-Brain Network Data

    PubMed Central

    Simpson, Sean L.; Laurienti, Paul J.

    2015-01-01

    Whole-brain network analyses remain the vanguard in neuroimaging research, coming to prominence within the last decade. Network science approaches have facilitated these analyses and allowed examining the brain as an integrated system. However, statistical methods for modeling and comparing groups of networks have lagged behind. Fusing multivariate statistical approaches with network science presents the best path to develop these methods. Toward this end, we propose a two-part mixed-effects modeling framework that allows modeling both the probability of a connection (presence/absence of an edge) and the strength of a connection if it exists. Models within this framework enable quantifying the relationship between an outcome (e.g., disease status) and connectivity patterns in the brain while reducing spurious correlations through inclusion of confounding covariates. They also enable prediction about an outcome based on connectivity structure and vice versa, simulating networks to gain a better understanding of normal ranges of topological variability, and thresholding networks leveraging group information. Thus, they provide a comprehensive approach to studying system level brain properties to further our understanding of normal and abnormal brain function. PMID:25796135
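    The core of a two-part model, a Bernoulli part for edge presence and a Gaussian part for strength given presence, can be written down compactly. The sketch below omits the covariates and random effects that the proposed framework adds:

```python
import numpy as np

def two_part_loglik(presence, strength, p_edge, mu, sigma):
    """Log-likelihood of a simplified two-part model: Bernoulli(p_edge)
    for edge presence, Normal(mu, sigma) for strength where present.
    (The full framework adds covariates and random effects to both parts.)"""
    ll = np.where(presence == 1,
                  np.log(p_edge)
                  - 0.5 * np.log(2.0 * np.pi * sigma ** 2)
                  - 0.5 * ((strength - mu) / sigma) ** 2,
                  np.log(1.0 - p_edge))
    return ll.sum()

# Toy data: two present edges with strengths near 0.5, one absent edge
presence = np.array([1, 0, 1])
strength = np.array([0.6, 0.0, 0.4])
good = two_part_loglik(presence, strength, p_edge=0.5, mu=0.5, sigma=1.0)
bad = two_part_loglik(presence, strength, p_edge=0.5, mu=5.0, sigma=1.0)
```

    Fitting then amounts to maximizing this likelihood over the parameters of both parts, which is what allows presence and strength to be modeled jointly rather than thresholding the network first.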

  14. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
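    A build-up/wash-off store of the kind used to drive fecal indicator bacteria (FIB) loadings can be sketched as below. The parameter values and the linear wash-off response are illustrative assumptions, not the calibrated IHACRES formulation:

```python
def fib_loading(rainfall, accum_rate=1e9, washoff_coef=0.4, dieoff_rate=0.1):
    """Daily fecal indicator bacteria loading from a build-up/wash-off store.

    rainfall: daily effective rainfall (e.g. cm/day).
    accum_rate: dry-weather accumulation on the landscape (cfu/day).
    All parameter values are illustrative, not calibrated.
    Returns the list of daily loads washed off toward the coast (cfu/day).
    """
    store, loads = 0.0, []
    for r in rainfall:
        store += accum_rate                          # accumulation
        store *= (1.0 - dieoff_rate)                 # first-order die-off
        washed = store * min(1.0, washoff_coef * r)  # wash-off scales with rain
        store -= washed
        loads.append(washed)
    return loads

loads = fib_loading([0.0, 0.0, 5.0, 0.0])            # dry, dry, storm, dry
```

    Loads produced this way would then serve as boundary conditions for the hydrodynamic and particle-transport stage of the linked framework.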

  15. Towards Controlling the Glycoform: A Model Framework Linking Extracellular Metabolites to Antibody Glycosylation

    PubMed Central

    Jedrzejewski, Philip M.; del Val, Ioscani Jimenez; Constantinou, Antony; Dell, Anne; Haslam, Stuart M.; Polizzi, Karen M.; Kontoravdi, Cleo

    2014-01-01

    Glycoproteins represent the largest group of the growing number of biologically-derived medicines. The associated glycan structures and their distribution are known to have a large impact on pharmacokinetics. A modelling framework was developed to provide a link from the extracellular environment and its effect on intracellular metabolites to the distribution of glycans on the constant region of an antibody product. The main focus of this work is the mechanistic in silico reconstruction of the nucleotide sugar donor (NSD) metabolic network by means of 34 species mass balances and the saturation kinetics rates of the 60 metabolic reactions involved. NSDs are the co-substrates of the glycosylation process in the Golgi apparatus and their simulated dynamic intracellular concentration profiles were linked to an existing model describing the distribution of N-linked glycan structures of the antibody constant region. The modelling framework also describes the growth dynamics of the cell population by means of modified Monod kinetics. Simulation results match well to experimental data from a murine hybridoma cell line. The result is a modelling platform which is able to describe the product glycoform based on extracellular conditions. It represents a first step towards the in silico prediction of the glycoform of a biotherapeutic and provides a platform for the optimisation of bioprocess conditions with respect to product quality. PMID:24637934

  16. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting results to file. Modularity is achieved through well-defined interfaces between components and the use of a consistent vocabulary (CF conventions) for naming variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) into a particular file format. Instead, "reader modules" can be written to obtain data directly from any original source, including files or web-based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to handle technical tasks such as reading, reprojecting, and colocating input data, or rotating and scaling vectors and model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search-and-rescue objects.
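    The modular "reader" idea can be illustrated with a toy drift step: any object that returns CF-named variables can feed the advection code, regardless of where the data come from. The interface below is a sketch of the design pattern, not OpenDrift's actual API:

```python
import numpy as np

class ConstantReader:
    """Toy 'reader module': returns spatially uniform fields keyed by
    CF standard names. (Sketch of the design pattern, not OpenDrift's API.)"""
    def __init__(self, fields):
        self.fields = fields
    def get_variables(self, names, lon, lat, time):
        return {n: np.full_like(lon, self.fields[n], dtype=float)
                for n in names}

def advect(lon, lat, reader, time, dt):
    """One explicit Euler drift step using CF-named current components."""
    v = reader.get_variables(
        ['x_sea_water_velocity', 'y_sea_water_velocity'], lon, lat, time)
    m_per_deg = 111320.0                  # rough metres per degree latitude
    lat_new = lat + v['y_sea_water_velocity'] * dt / m_per_deg
    lon_new = lon + v['x_sea_water_velocity'] * dt / (
        m_per_deg * np.cos(np.radians(lat)))
    return lon_new, lat_new

reader = ConstantReader({'x_sea_water_velocity': 0.5,   # m/s eastward
                         'y_sea_water_velocity': 0.0})
lon, lat = np.array([4.0]), np.array([60.0])
lon2, lat2 = advect(lon, lat, reader, time=None, dt=3600.0)
```

    Swapping in a reader backed by a file or an OPeNDAP endpoint would leave `advect` untouched, which is the point of the modular design.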

  17. Modeling Framework and Validation of a Smart Grid and Demand Response System for Wind Power Integration

    SciTech Connect

    Broeer, Torsten; Fuller, Jason C.; Tuffner, Francis K.; Chassin, David P.; Djilali, Ned

    2014-01-31

    Electricity generation from wind power and other renewable energy sources is increasing, and their variability introduces new challenges to the power system. The emergence of smart grid technologies in recent years has seen a paradigm shift in redefining the electrical system of the future, in which controlled response of the demand side is used to balance fluctuations and intermittencies from the generation side. This paper presents a modeling framework for an integrated electricity system where loads become an additional resource. The agent-based model represents a smart grid power system integrating generators, transmission, distribution, loads and market. The model incorporates generator and load controllers, allowing suppliers and demanders to bid into a Real-Time Pricing (RTP) electricity market. The modeling framework is applied to represent a physical demonstration project conducted on the Olympic Peninsula, Washington, USA, and validation simulations are performed using actual dynamic data. Wind power is then introduced into the power generation mix illustrating the potential of demand response to mitigate the impact of wind power variability, primarily through thermostatically controlled loads. The results also indicate that effective implementation of Demand Response (DR) to assist integration of variable renewable energy resources requires a diversity of loads to ensure functionality of the overall system.

  18. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

    Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ, and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, that pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculation and other factors. © Soil Science Society of America. All rights reserved.
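    The capillary inverse proportionality invoked by the framework is the standard Young-Laplace relation |ψ| = 2σ cos θ / r. A minimal numeric sketch, assuming water near room temperature and a zero contact angle:

```python
import numpy as np

def matric_pressure(pore_radius_m, surface_tension=0.072, contact_angle=0.0):
    """|psi| = 2*sigma*cos(theta)/r (Pa), the capillary relation;
    sigma = 0.072 N/m for water near 25 C, theta = 0 assumed."""
    return 2.0 * surface_tension * np.cos(contact_angle) / pore_radius_m

radii = np.array([1e-7, 1e-6, 1e-5, 1e-4])   # pore radii in metres
psi = matric_pressure(radii)                  # smaller pores -> larger |psi|
```

    A retention model of the kind described then follows by mapping each particle size class to a pore radius (and hence a ψ) and to an incremental water volume.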

  19. Integrating an open source dynamic river model in hydrology modeling frameworks

    NASA Astrophysics Data System (ADS)

    Liu, Frank; Hodges, Ben

    2014-05-01

    A challenge for hydrology modeling is linking landscape runoff models with river network models. Although some hydrological models directly implement a river routing scheme within their code, such a monolithic approach is too rigid because it does not allow the latest river routing advances to be used. Unlike the 2D interface between atmospheric and landscape models, the interface between landscape runoff models and river network models is more difficult to define. In this PICO presentation, we address problems with model interfaces, which are related to issues such as time and space-scale differences between the models. We also provide an overview of SPRINT, an open source river network model, which has adapted the model interface architecture and numerical methods widely used in semiconductor microchip design. Finally, we propose two model integration mechanisms: the file-based "net-list" and the API (application programming interface) approach.

  20. ON JOINT DETERMINISTIC GRID MODELING AND SUB-GRID VARIABILITY CONCEPTUAL FRAMEWORK FOR MODEL EVALUATION

    EPA Science Inventory

    The general situation (exemplified in urban areas) where a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing grid-based air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...

  1. Atomic charges for modeling metal–organic frameworks: Why and how

    SciTech Connect

    Hamad, Said; Balestra, Salvador R.G.; Bueno-Perez, Rocio; Calero, Sofia; Ruiz-Salvador, A. Rabdel

    2015-03-15

    Atomic partial charges are parameters of key importance in the simulation of Metal–Organic Frameworks (MOFs), since Coulombic interactions decrease with distance more slowly than van der Waals interactions. Despite their relevance, there is no method to unambiguously assign charges to each atom, since atomic charges are not quantum observables. There are several methods that allow the calculation of atomic charges, most of them starting from the electronic wavefunction or the electronic density of the system, as obtained from quantum mechanical calculations. In this work, we describe the most common methods employed to calculate atomic charges in MOFs. In order to show the influence that even small variations of structure have on atomic charges, we present the results that we obtained for DMOF-1. We also discuss the effect that small variations of atomic charges have on the predicted structural properties of IRMOF-1. - Graphical abstract: We review the different methods with which to calculate atomic partial charges that can be used in force field-based calculations. We also present two examples that illustrate the influence of the geometry on the calculated charges and the influence of the charges on structural properties. - Highlights: • The choice of atomic charges is crucial in modeling adsorption and diffusion in MOFs. • Methods for calculating atomic charges in MOFs are reviewed. • We discuss the influence of the framework geometry on the calculated charges. • We discuss the influence of the framework charges on the structural properties.
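    The long-range argument in the abstract, that Coulombic (~1/r) interactions decay far more slowly than dispersion (~1/r^6) interactions, is easy to verify numerically (unit prefactors are used purely for illustration):

```python
import numpy as np

r = np.linspace(2.0, 12.0, 6)     # separations (angstroms, illustrative)
coulomb = 1.0 / r                 # electrostatic term ~ r^-1
dispersion = 1.0 / r ** 6         # attractive Lennard-Jones term ~ r^-6
ratio = coulomb / dispersion      # = r^5: electrostatics dominates at range
```

    The ratio grows as r^5, which is why small errors in the assigned partial charges propagate into large errors in predicted adsorption and diffusion.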

  2. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence for student performance for haptic versus traditional preparation methods was established, thus establishing the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains. PMID:23658401

  3. Catchment travel and residence time distributions: a theoretical framework for solute transport modeling

    NASA Astrophysics Data System (ADS)

    Botter, G.; Bertuzzo, E.; Rinaldo, A.

    2011-12-01

    The probability density functions (pdf's) of travel and residence times are key descriptors of the mechanisms through which catchments retain and release old and event water, transporting solutes to receiving water bodies. In this contribution we derive a general stochastic framework applicable to arbitrary catchment control volumes, where time-variable precipitation, evapotranspiration and discharge are assumed to be the major hydrological drivers for water and solutes. A master equation for the residence time pdf is derived and solved analytically, providing expressions for travel and residence time pdf's as a function of input/output fluxes and of the relevant mixing processes occurring along streamflow production and plant uptake. Our solutions suggest intrinsically time-variant travel and residence time pdf's through a direct dependence on the underlying hydrological forcings and soil-vegetation dynamics. The proposed framework highlights the dependence of water/solute travel times on eco-hydrological processes (especially transpiration and uptake), and integrates age-dating and tracer hydrology techniques by providing a coherent framework for catchment transport models. An application to the release of pesticides from an agricultural watershed is also discussed.
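    A discrete, well-mixed special case of the residence-time bookkeeping described above can be sketched as follows; the perfect-mixing assumption is a simplification of the general framework, which admits arbitrary mixing along streamflow production and uptake:

```python
import numpy as np

def age_distribution(precip, outflow, s0):
    """Evolve the storage age distribution of a well-mixed reservoir.

    precip, outflow: per-step input and output volumes; s0: initial storage.
    Perfect mixing means outflow removes water of every age in proportion
    to its share of storage. Returns ages[k] = volume of age-k water.
    """
    ages = np.array([s0], dtype=float)       # all initial water at age 0
    for p, q in zip(precip, outflow):
        frac = min(q / ages.sum(), 1.0)      # fraction of storage leaving
        ages = ages * (1.0 - frac)           # proportional removal
        ages = np.concatenate(([p], ages))   # new water enters at age 0
    return ages

ages = age_distribution([1.0, 1.0, 1.0], [0.5, 0.5, 0.5], s0=10.0)
```

    Normalizing `ages` by total storage gives the residence time pdf at the final step; because the fluxes vary in time, so does the pdf, mirroring the time-variant behavior the framework predicts.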

  4. Framework for modeling urban restoration resilience time in the aftermath of an extreme event

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor

    2015-01-01

    The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.

  5. Principal interval decomposition framework for POD-based model reduction of convective flows

    NASA Astrophysics Data System (ADS)

    San, Omer; Borggaard, Jeff

    2015-11-01

    A principal interval decomposition (PID) framework is proposed to build more reliable reduced-order models for unsteady flow problems. The PID method optimizes the lengths of the time windows over which proper orthogonal decomposition (POD) is performed and can be highly effective in building reduced-order models for convective problems. The performance of these POD models with and without using the PID approach is investigated by applying these methods to the unsteady lock-exchange flow problem modeled by solving the Boussinesq equations in vorticity-streamfunction formulation. This benchmark problem exhibits a strong shear flow induced by a temperature jump and results in the Kelvin-Helmholtz instability. This is considered a challenging benchmark problem for the development of reduced-order models. The predictive performance of our model is then analyzed over a wide range of computational modeling and physical parameters. It is shown that the PID approach provides a significant improvement in accuracy over the standard Galerkin POD reduced-order model. Our numerical assessment of the PID shows that it may represent a reliable model reduction tool for convection-dominated, unsteady-flow problems.
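    The underlying POD construction can be sketched as an SVD of a snapshot matrix (an assumed generic setup, not the authors' solver); the PID approach would apply this same construction separately on each optimized time window rather than on the full trajectory.

```python
import numpy as np

def pod_basis(snapshots, r):
    """snapshots: (n_dof, n_time) matrix of flow states; r: number of modes."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, svals, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    energy = (svals[:r] ** 2).sum() / (svals ** 2).sum()  # captured "energy"
    return U[:, :r], mean, energy

# Synthetic trajectory dominated by two spatial structures (illustrative only)
x = np.linspace(0, 2 * np.pi, 200)
t = np.linspace(0, 10, 80)
Q = np.outer(np.sin(x), np.cos(t)) + 0.3 * np.outer(np.sin(2 * x), np.sin(3 * t))

basis, mean, energy = pod_basis(Q, r=2)
recon = mean + basis @ (basis.T @ (Q - mean))   # rank-2 Galerkin-style reconstruction
err = np.linalg.norm(Q - recon) / np.linalg.norm(Q)
```

For genuinely convective problems the singular values decay slowly over long windows, which is the motivation for optimizing window lengths in PID.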

  6. Tools and Algorithms to Link Horizontal Hydrologic and Vertical Hydrodynamic Models and Provide a Stochastic Modeling Framework

    NASA Astrophysics Data System (ADS)

    Salah, Ahmad M.; Nelson, E. James; Williams, Gustavious P.

    2010-04-01

    We present algorithms and tools we developed to automatically link an overland flow model to a hydrodynamic water quality model with different spatial and temporal discretizations. These tools run the linked models, which provide a stochastic simulation framework. We also briefly present the tools and algorithms we developed to facilitate and analyze stochastic simulations of the linked models. We demonstrate the algorithms by linking the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model for overland flow with the CE-QUAL-W2 model for water quality and reservoir hydrodynamics. GSSHA uses a two-dimensional horizontal grid while CE-QUAL-W2 uses a two-dimensional vertical grid. We implemented the algorithms and tools in the Watershed Modeling System (WMS) which allows modelers to easily create and use models. The algorithms are general and could be used for other models. Our tools create and analyze stochastic simulations to help understand uncertainty in the model application. While a number of examples of linked models exist, the ability to perform automatic, unassisted linking is a step forward and provides the framework to easily implement stochastic modeling studies.

  7. A Human Sensor Network Framework in Support of Near Real Time Situational Geophysical Modeling

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Price, A.; Smith, J. A.; Halem, M.

    2013-12-01

    The area of Disaster Management is well established among Federal Agencies such as FEMA, EPA, NOAA and NASA. These agencies have well-formulated frameworks for response and mitigation based on near real time satellite and conventional observing networks for assimilation into geophysical models. Forecasts from these models are used to communicate with emergency responders and the general public. More recently, agencies have started using social media to broadcast warnings and alerts to potentially affected communities. In this presentation, we demonstrate the added benefits of mining and assimilating the vast amounts of social media data available from heterogeneous hand-held devices and social networks into established operational geophysical modeling frameworks as they apply to the five cornerstones of disaster management - Prevention, Mitigation, Preparedness, Response and Recovery. Often, in extreme events, social media provide the earliest notification of the event. However, various forms of social media data can also provide useful geolocated and time-stamped in situ observations, complementary to directly sensed conventional observations. We use the concept of a Human Sensor Network, where one views social media users as carrying field-deployed "sensors" whose posts are the remotely sensed "instrument measurements." These measurements can act as "station data" providing the resolution and coverage needed for extreme-event-specific modeling and validation. Here, we explore the use of social media through a Human Sensor Network (HSN) approach as another data input source for assimilation into geophysical models. Employing the HSN paradigm can provide useful feedback in near real time, but presents software challenges for rapid access, quality filtering and transforming massive social media data into formats consistent with the operational models.
As a use case scenario, we demonstrate the value of HSN for disaster management

  8. A New Framework for Systematically Characterizing and Improving Extreme Weather Phenomena in Climate Models

    NASA Astrophysics Data System (ADS)

    O'Brien, T. A.; Kashinath, K.; Collins, W.

    2014-12-01

    Extreme weather phenomena remain a significant challenge for climate models due in part to the relatively small space and time scales at which such events occur. Accordingly, robust simulation of extreme events requires models with high fidelity at these relatively small scales. However, numerous recent studies have shown evidence that current climate models exhibit non-convergent changes in extreme weather statistics as spatial and temporal resolution increase. These studies also provide evidence that such non-convergence originates in the subgrid parameterization suites (e.g., micro/macrophysics and convection). In order to provide a framework for identifying parameterization characteristics that cause non-convergent behavior and for testing parameterization improvements, we have developed a hindcast-based system characterizing the fidelity of extremes as a function of spatial and temporal resolution. The use of hindcasts as a model evaluation tool allows us to identify modes of failure (e.g., false-hits and misses) that systematically vary as a function of resolution. We have implemented this framework for the Community Earth System Model, and we have created a dataset of hindcast ensembles at multiple horizontal resolutions. Preliminary analysis of this multi-resolution set of hindcasts shows that in some regions, (1) the tail of the precipitation probability density function (PDF) grows as resolution increases (in accord with recent studies), and that (2) a large portion of this increase in the PDF tail comes from increases in Type I model errors: simulated extreme events that do not occur in observations. We explore possible causes of this inconsistent model behavior.

  9. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    NASA Technical Reports Server (NTRS)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement one another's strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  10. Integrated Modeling in Earth and Space Sciences: An Information Theoretic Framework

    NASA Astrophysics Data System (ADS)

    Sharma, A. S.; Kalnay, E.

    2011-12-01

    Most natural phenomena exhibit multiscale behavior, which is an underlying reason for the challenges in modeling them. The recognition that the key problems, such as extreme events, natural hazards and climate change, require multi-disciplinary approaches to develop models that integrate many natural and anthropogenic phenomena demands new approaches in the modeling of such systems. Information theory, which emphasizes the inherent features in observational data independent of modeling assumptions, can be used to develop a framework for multi-disciplinary models by integrating the data of the leading processes in multiple systems. An important measure of the inter-relationship among the different phenomena is the lead time among them. The widely used quantities such as the cross-correlation function represent the linear dependence among the variables and are limited in their ability to describe complex driven systems which are essentially nonlinear. The mutual information function, which represents the expectation of the average degree of dependence incorporating all orders of nonlinearity, provides the characteristic times inherent in the data and can be used as the first step toward the development of integrated models. This function is used in two systems with widely separated time scales. The first case is the solar wind - magnetosphere interaction, and the correlated data yield ~ 5 hr as the inherent time scale for the magnetospheric processes. The second case is a study of the inter-relationship between natural and anthropogenic phenomena, for which the mutual information functions were computed from the data of the global gross product, temperature and population. These functions show a time delay of ~15 yrs between the changes in global temperature and population as well as gross product, thus providing a measure of the interdependency among the variables underlying climate change. The results from studies of extreme events and an information theoretic modeling
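    A lagged mutual-information scan of the kind described can be sketched as follows (the histogram estimator, bin count, and synthetic series are illustrative assumptions, not the authors' data): the lag maximizing I(X_t; Y_{t+lag}) estimates the characteristic lead time between the two series.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of mutual information (in nats) between two series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y
    nz = pxy > 0                            # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def lagged_mi(x, y, max_lag):
    """I(x_t; y_{t+lag}) for lag = 0 .. max_lag."""
    return [mutual_information(x[:len(x) - lag], y[lag:]) for lag in range(max_lag + 1)]

# Synthetic driver/response pair where the response lags the driver by 7 steps
rng = np.random.default_rng(0)
driver = rng.standard_normal(5000)
response = np.roll(driver, 7) + 0.2 * rng.standard_normal(5000)

mi = lagged_mi(driver, response, max_lag=15)
best_lag = int(np.argmax(mi))
```

Unlike the cross-correlation function, this statistic also registers nonlinear dependence, which is the abstract's motivation for preferring it.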

  11. Elementary metabolite units (EMU): a novel framework for modeling isotopic distributions.

    PubMed

    Antoniewicz, Maciek R; Kelleher, Joanne K; Stephanopoulos, Gregory

    2007-01-01

    Metabolic flux analysis (MFA) has emerged as a tool of great significance for metabolic engineering and mammalian physiology. An important limitation of MFA, as carried out via stable isotope labeling and GC/MS and nuclear magnetic resonance (NMR) measurements, is the large number of isotopomer or cumomer equations that need to be solved, especially when multiple isotopic tracers are used for the labeling of the system. This restriction reduces the ability of MFA to fully utilize the power of multiple isotopic tracers in elucidating the physiology of realistic situations comprising complex bioreaction networks. Here, we present a novel framework for the modeling of isotopic labeling systems that significantly reduces the number of system variables without any loss of information. The elementary metabolite unit (EMU) framework is based on a highly efficient decomposition method that identifies the minimum amount of information needed to simulate isotopic labeling within a reaction network using the knowledge of atomic transitions occurring in the network reactions. The functional units generated by the decomposition algorithm, called EMUs, form the new basis for generating system equations that describe the relationship between fluxes and stable isotope measurements. Isotopomer abundances simulated using the EMU framework are identical to those obtained using the isotopomer and cumomer methods, but require significantly less computation time. For a typical (13)C-labeling system the total number of equations that needs to be solved is reduced by an order of magnitude (100s EMUs vs. 1000s isotopomers). As such, the EMU framework is most efficient for the analysis of labeling by multiple isotopic tracers. For example, analysis of the gluconeogenesis pathway with (2)H, (13)C, and (18)O tracers requires only 354 EMUs, compared to more than two million isotopomers. PMID:17088092

  12. Mapping Spatial Variability of Soil Moisture in a Semi-distributed Hydrologic Modelling Framework

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    The Soil Moisture and Runoff simulation Toolkit (SMART) is a computationally efficient semi-distributed hydrological modelling framework developed for water balance simulations at a catchment scale. The modelling framework is based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs) and distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. HRUs are delineated in each first order sub-basin based on topographic and geomorphic analysis of the entire catchment. A 2-d distributed hydrological model based on Richards' equation performs water balance simulations across a series of ECSs formulated by aggregating topographic and physiographic properties of part or all of the first order sub-basins. Delineation of ECSs has the advantage of reducing computational time while maintaining reasonable accuracy in simulated fluxes and states. While HRU-level soil moisture is well approximated in the ECS formulation compared to the distributed modelling approaches, spatial variability of soil moisture within a given HRU inside an ECS is ignored. In this study, we developed a disaggregation scheme for soil moisture distribution within every ECS formulated in a first order sub-basin. The statistical disaggregation scheme is developed based on soil moisture simulations of the Baldry sub-catchment, Australia using the integrated land surface-groundwater model, ParFlow.CLM. ParFlow is a variably saturated flow model that solves the 3D Richards' equation for the sub-surface and it is coupled to the Common Land Model (CLM). The disaggregation scheme preserves the mean sub-basin soil moisture and maintains temporal correlation of simulated daily soil moisture. Our preliminary results illustrate that the spatial disaggregation scheme can approximate the spatially distributed soil moisture field produced by ParFlow.CLM at 60 m resolution. In addition, the
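    The two stated properties of such a scheme, preserving the area-weighted mean and maintaining temporal correlation, can be illustrated with a hypothetical sketch: each HRU's series is the ECS mean plus a scaled anomaly, with static weights (assumed here; in the study they would be fitted to ParFlow.CLM output) constrained so the weighted mean is recovered exactly.

```python
import numpy as np

def disaggregate(theta_mean, anomaly):
    """theta_mean: (n_t,) ECS-mean soil moisture; anomaly: (n_hru,) scaling factors.

    If sum(area_frac * anomaly) == 0, the area-weighted mean is preserved
    at every time step; each HRU series is a linear transform of the mean,
    so temporal correlation is maintained by construction.
    """
    return theta_mean[None, :] + np.outer(anomaly, theta_mean - theta_mean.mean())

# Illustrative seasonal ECS-mean series and three HRUs
t = np.arange(365)
theta_mean = 0.25 + 0.05 * np.sin(2 * np.pi * t / 365)
area_frac = np.array([0.5, 0.3, 0.2])
anomaly = np.array([0.4, -0.2, -0.7])          # 0.5*0.4 + 0.3*(-0.2) + 0.2*(-0.7) = 0

theta_hru = disaggregate(theta_mean, anomaly)
recovered = area_frac @ theta_hru               # area-weighted mean per day
```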

  13. Distribution-enhanced homogenization framework and model for heterogeneous elasto-plastic problems

    NASA Astrophysics Data System (ADS)

    Alleman, Coleman; Luscher, D. J.; Bronkhorst, Curt; Ghosh, Somnath

    2015-12-01

    Multi-scale computational models offer tractable means to simulate sufficiently large spatial domains comprised of heterogeneous materials by resolving material behavior at different scales and communicating across these scales. Within the framework of computational multi-scale analyses, hierarchical models enable unidirectional transfer of information from lower to higher scales, usually in the form of effective material properties. Determining explicit forms for the macroscale constitutive relations for complex microstructures and nonlinear processes generally requires numerical homogenization of the microscopic response. Conventional low-order homogenization uses results of simulations of representative microstructural domains to construct appropriate expressions for effective macroscale constitutive parameters written as a function of the microstructural characterization. This paper proposes an alternative novel approach, introduced as the distribution-enhanced homogenization framework or DEHF, in which the macroscale constitutive relations are formulated in a series expansion based on the microscale constitutive relations and moments of arbitrary order of the microscale field variables. The framework does not make any a priori assumption on the macroscale constitutive behavior being represented by a homogeneous effective medium theory. Instead, the evolution of macroscale variables is governed by the moments of microscale distributions of evolving field variables. This approach demonstrates excellent accuracy in representing the microscale fields through their distributions. An approximate characterization of the microscale heterogeneity is accounted for explicitly in the macroscale constitutive behavior. Increasing the order of this approximation results in increased fidelity of the macroscale approximation of the microscale constitutive behavior. 
By including higher-order moments of the microscale fields in the macroscale problem, micromechanical analyses do

  14. The Earth System Modeling Framework and Earth System Curator: Software Components as Building Blocks of Community

    NASA Astrophysics Data System (ADS)

    Deluca, C.; Balaji, V.; da Silva, A.; Dunlap, R.; Hill, C.; Mark, L.; Mechoso, C. R.; Middleton, D.; Nikonov, S.; Rugaber, S.; Suarez, M.

    2006-05-01

    The Earth System Modeling Framework (ESMF) is an established U.S. initiative to develop high performance common modeling infrastructure for climate and weather models. ESMF is the technical foundation for the NASA Modeling, Analysis, and Prediction (MAP) Climate Variability and Change program and the DoD Battlespace Environments Institute (BEI). It has been incorporated into the Community Climate System Model (CCSM), the Weather Research and Forecast (WRF) Model, NOAA NCEP and GFDL models, Army, Navy, and Air Force models, and many others. The new, NSF-funded Earth System Curator is a related database and toolkit that will store information about model configurations, prepare models for execution, and run them locally or in a distributed fashion. The key concept that underlies both ESMF and the Earth System Curator is that of software components. Components are software units that are "composable", meaning they can be combined to form coupled applications. These components may be representations of physical domains, such as atmospheres or oceans; processes within particular domains such as atmospheric radiation or chemistry; or computational functions, such as data assimilation or I/O. ESMF provides interfaces, an architecture, and tools for structuring components hierarchically to form complex, coupled modeling applications. The Earth System Curator will enable modelers to describe, archive, search, compose, and run ESMF and similar components. Together these projects encourage a new paradigm for modeling: one in which the community can draw from a federation of many interoperable components in order to create and deploy applications. The goal is to enable a network of collaborations and new scientific opportunities for the Earth modeling community.

  15. Fully explicit nonlinear optics model in a particle-in-cell framework

    SciTech Connect

    Gordon, D.F.; Helle, M.H.; Peñano, J.R.

    2013-10-01

    A numerical technique which incorporates the nonlinear optics of anisotropic crystals into a particle-in-cell framework is described. The model is useful for simulating interactions between crystals, ultra-short laser pulses, intense relativistic electron bunches, plasmas, or any combination thereof. The frequency content of the incident and scattered radiation is limited only by the resolution of the spatial and temporal grid. A numerical stability analysis indicates that the Courant condition is more stringent than in the vacuum case. Numerical experiments are carried out illustrating the electro-optic effect, soliton propagation, and the generation of fields in a crystal by a relativistic electron bunch.

  16. A hierarchical framework for the multiscale modeling of microstructure evolution in heterogeneous materials.

    SciTech Connect

    Luscher, Darby J.

    2010-04-01

    All materials are heterogeneous at various scales of observation. The influence of material heterogeneity on nonuniform response and microstructure evolution can have profound impact on continuum thermomechanical response at macroscopic “engineering” scales. In many cases, it is necessary to treat this behavior as a multiscale process thus integrating the physical understanding of material behavior at various physical (length and time) scales in order to more accurately predict the thermomechanical response of materials as their microstructure evolves. The intent of the dissertation is to provide a formal framework for multiscale hierarchical homogenization to be used in developing constitutive models.

  17. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates the heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework. PMID:26543899
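    A multistage Markov decision problem of this general shape can be sketched with value iteration on a toy model (the states, actions, transition probabilities, and rewards below are invented for illustration, not taken from the authors' negotiation framework):

```python
# Toy negotiation MDP solved by value iteration; all numbers are illustrative.
STATES = ["conflict", "negotiating", "agreed"]
ACTIONS = ["concede", "hold"]
GAMMA = 0.9

# P[s][a] = list of (probability, next_state, reward)
P = {
    "conflict": {
        "concede": [(0.8, "negotiating", 0.0), (0.2, "conflict", -1.0)],
        "hold":    [(0.3, "negotiating", 0.0), (0.7, "conflict", -1.0)],
    },
    "negotiating": {
        "concede": [(0.9, "agreed", 5.0), (0.1, "conflict", -2.0)],
        "hold":    [(0.4, "agreed", 8.0), (0.6, "conflict", -2.0)],
    },
    "agreed": {a: [(1.0, "agreed", 0.0)] for a in ACTIONS},  # absorbing state
}

def q_value(s, a, V):
    """Expected discounted return of taking action a in state s."""
    return sum(p * (r + GAMMA * V[s2]) for p, s2, r in P[s][a])

def value_iteration(tol=1e-8):
    V = {s: 0.0 for s in STATES}
    while True:
        V_new = {s: max(q_value(s, a, V) for a in ACTIONS) for s in STATES}
        if max(abs(V_new[s] - V[s]) for s in STATES) < tol:
            return V_new
        V = V_new

V = value_iteration()
policy = {s: max(ACTIONS, key=lambda a: q_value(s, a, V)) for s in STATES}
```

The per-stage heuristic decision in the abstract plays the role of the greedy policy extraction at the end, computed from past state information rather than a full model.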

  18. A stochastic multiscale framework for modeling flow through random heterogeneous porous media

    SciTech Connect

    Ganapathysubramanian, B.; Zabaras, N.

    2009-02-01

    Flow through porous media is ubiquitous, occurring from large geological scales down to the microscopic scales. Several critical engineering phenomena like contaminant spread, nuclear waste disposal and oil recovery rely on accurate analysis and prediction of these multiscale phenomena. Such analysis is complicated by inherent uncertainties as well as the limited information available to characterize the system. Any realistic modeling of these transport phenomena has to resolve two key issues: (i) the multi-length scale variations in permeability that these systems exhibit, and (ii) the inherently limited information available to quantify these property variations that necessitates posing these phenomena as stochastic processes. A stochastic variational multiscale formulation is developed to incorporate uncertain multiscale features. A stochastic analogue to a mixed multiscale finite element framework is used to formulate the physical stochastic multiscale process. Recent developments in linear and non-linear model reduction techniques are used to convert the limited information available about the permeability variation into a viable stochastic input model. An adaptive sparse grid collocation strategy is used to efficiently solve the resulting stochastic partial differential equations (SPDEs). The framework is applied to analyze flow through random heterogeneous media when only limited statistics about the permeability variation are given.

  19. Dynamics of immature mAb glycoform secretion during CHO cell culture: An integrated modelling framework.

    PubMed

    Jimenez Del Val, Ioscani; Fan, Yuzhou; Weilguny, Dietmar

    2016-05-01

    Ensuring consistent glycosylation-associated quality of therapeutic monoclonal antibodies (mAbs) has become a priority in pharmaceutical bioprocessing given that the distribution and composition of the carbohydrates (glycans) bound to these molecules determines their therapeutic efficacy and immunogenicity. However, the interaction between bioprocess conditions, cellular metabolism and the intracellular process of glycosylation remains to be fully understood. To gain further insight into these interactions, we present a novel integrated modelling platform that links dynamic variations in mAb glycosylation with cellular secretory capacity. Two alternative mechanistic representations of how mAb specific productivity (qp) influences glycosylation are compared. In the first, mAb glycosylation is modulated by the linear velocity with which secretory cargo traverses the Golgi apparatus. In the second, glycosylation is influenced by variations in Golgi volume. Within our modelling framework, both mechanisms accurately reproduce experimentally observed dynamic changes in mAb glycosylation. In addition, an optimisation-based strategy has been developed to estimate the concentration of glycosylation enzymes required to minimise mAb glycoform variability. Our results suggest that the availability of glycosylation machinery relative to cellular secretory capacity may play a crucial role in mAb glycosylation. In the future, the modelling framework presented here may aid in selecting and engineering cell lines that ensure consistent mAb glycosylation. PMID:26743760

  1. A Computational Framework for 3D Mechanical Modeling of Plant Morphogenesis with Cellular Resolution

    PubMed Central

    Gilles, Benjamin; Hamant, Olivier; Boudaoud, Arezki; Traas, Jan; Godin, Christophe

    2015-01-01

    The link between genetic regulation and the definition of form and size during morphogenesis remains largely an open question in both plant and animal biology. This is partially due to the complexity of the process, involving extensive molecular networks, multiple feedbacks between different scales of organization and physical forces operating at multiple levels. Here we present a conceptual and modeling framework aimed at generating an integrated understanding of morphogenesis in plants. This framework is based on the biophysical properties of plant cells, which are under high internal turgor pressure, and are prevented from bursting because of the presence of a rigid cell wall. To control cell growth, the underlying molecular networks must interfere locally with the elastic and/or plastic extensibility of this cell wall. We present a model in the form of a three dimensional (3D) virtual tissue, where growth depends on the local modulation of wall mechanical properties and turgor pressure. The model shows how forces generated by turgor-pressure can act both cell autonomously and non-cell autonomously to drive growth in different directions. We use simulations to explore lateral organ formation at the shoot apical meristem. Although different scenarios lead to similar shape changes, they are not equivalent and lead to different, testable predictions regarding the mechanical and geometrical properties of the growing lateral organs. Using flower development as an example, we further show how a limited number of gene activities can explain the complex shape changes that accompany organ outgrowth. PMID:25569615

  3. Quantifying Fluxes of Chemical and Biological Species in Great Lakes Watersheds: A Reactive Transport Modeling Framework

    NASA Astrophysics Data System (ADS)

    Niu, J.; Phanikumar, M. S.

    2012-12-01

    Understanding and quantifying the interactions between hydro-climatic processes and the fate and transport of aquatic pollutants, and the resultant threats to human and ecosystem health, is a high-priority research area in many parts of the world. In the Great Lakes region, harmful algal blooms, increased beach closures due to microbiological pollution, and drinking-water-related issues have continued to be causes for concern in recent years, highlighting the need for accurate transport models. In this presentation we describe the development of a watershed-scale, multi-component reactive transport modeling framework to describe fluxes of nutrients and bacteria exported to the Great Lakes. We describe an operator-splitting strategy combined with a particle transport modeling approach with reactions to describe transport in different hydrologic units, with interactions between domains. The algorithms are tested using analytical solutions (where available), data from plot-scale experiments, and monitoring data from watersheds in the Great Lakes region.
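The operator-splitting strategy mentioned here can be sketched generically: each time step, a transport operator and a reaction operator are applied in sequence. The fragment below is an illustrative 1D toy (first-order upwind advection plus first-order decay, a common stand-in for bacterial die-off), not the authors' watershed framework; all function names and parameters are assumptions for illustration:

```python
import numpy as np

def advect_upwind(c, u, dx, dt):
    """First-order upwind advection step (assumes u > 0 and CFL = u*dt/dx <= 1)."""
    cn = c.copy()
    cn[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])
    return cn

def react_decay(c, k, dt):
    """Exact solution of the first-order decay reaction dc/dt = -k*c."""
    return c * np.exp(-k * dt)

def step(c, u, k, dx, dt):
    """One operator-split step: transport first, then reaction."""
    return react_decay(advect_upwind(c, u, dx, dt), k, dt)
```

Splitting lets each sub-problem use its own best solver (here an analytic reaction update), which is the main appeal of the approach for multi-component systems.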

  4. A modeling and experiment framework for the emergency management in AHC transmission.

    PubMed

    Chen, Bin; Ge, Yuanzheng; Zhang, Laobing; Zhang, Yongzheng; Zhong, Ziming; Liu, Xiaocheng

    2014-01-01

    Emergency management is crucial for finding effective ways to minimize or even eliminate the damage of emergent events, but quantitative computational methods for studying such events are still lacking. Compartmental models, such as the susceptible-infected-recovered (SIR) model of epidemic transmission, ignore many details that influence how emergent events spread. In this paper, we first propose an agent-based modeling and experiment framework to model the real world together with the emergent events. The model of the real world is called an artificial society, which is composed of an agent model, an agent activity model, and an environment model, and it employs finite state automata (FSA) as its modeling paradigm. An artificial campus, on which a series of experiments is conducted to analyze the key factors of acute hemorrhagic conjunctivitis (AHC) transmission, is then constructed to illustrate how our method works for emergency management. Intervention measures, and optimal configurations of them (such as the isolation period), are also derived for emergency management through the evaluations in these experiments. PMID:24693330
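The compartmental SIR baseline that this agent-based framework refines can be stated in a few lines. The sketch below is the standard textbook SIR model with illustrative (not paper-specific) parameters, integrated with forward Euler on normalized population fractions:

```python
def sir_step(S, I, R, beta, gamma, dt):
    """One forward-Euler step of the SIR model.
    beta: transmission rate; gamma: recovery rate."""
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    return S + dt * dS, I + dt * dI, R + dt * dR

def simulate(S0=0.99, I0=0.01, beta=0.4, gamma=0.1, dt=0.1, days=160):
    """Run an SIR epidemic; returns final (S, I, R) fractions."""
    S, I, R = S0, I0, 0.0
    for _ in range(int(days / dt)):
        S, I, R = sir_step(S, I, R, beta, gamma, dt)
    return S, I, R
```

Because the derivatives sum to zero, S + I + R is conserved exactly under Euler stepping. The agent-based approach in the abstract replaces these aggregate compartments with individual agents and FSA state transitions, which is what allows interventions such as an isolation period to be represented per individual.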

  5. From terrestrial to aquatic fluxes: Integrating stream dynamics within a dynamic global vegetation modeling framework

    NASA Astrophysics Data System (ADS)

    Hoy, Jerad; Poulter, Benjam