Science.gov

Sample records for quark-parton model framework

  1. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    NASA Astrophysics Data System (ADS)

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 (up to ≈7 GeV2) and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.
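
    For orientation, the "high-energy factorization ansatz" invoked here is, at leading order in the quark-parton model, usually written as a product of a parton distribution and a fragmentation function; a hedged sketch in standard notation (not taken verbatim from the paper):

    $$\frac{d\sigma^{\pi^\pm}}{dx\,dz} \;\propto\; \sum_q e_q^2\, q(x)\, D_q^{\pi^\pm}(z), \qquad \langle P_t^2\rangle \;\simeq\; z^2\langle k_t^2\rangle + \langle p_t^2\rangle,$$

    where q(x) are the quark distributions, D the fragmentation functions, and the second (Gaussian-width) relation is the common approximation connecting the measured pion transverse momentum to the intrinsic quark width ⟨k_t²⟩ and the fragmentation width ⟨p_t²⟩ compared in the abstract.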

  2. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    DOE PAGES Beta

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; et al

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  3. Fermi-Dirac distributions for quark partons

    NASA Astrophysics Data System (ADS)

    Bourrely, C.; Buccella, F.; Miele, G.; Migliore, G.; Soffer, J.; Tibullo, V.

    1994-09-01

    We propose to use Fermi-Dirac distributions for quark and antiquark partons. It allows a fair description of the x-dependence of the very recent NMC data on the proton and neutron structure functions F2p(x) and F2n(x) at Q2 = 4 GeV2, as well as the CCFR antiquark distribution x q̄(x). We show that one can also use a corresponding Bose-Einstein expression to describe consistently the gluon distribution. The Pauli exclusion principle, which has been identified to explain the flavor asymmetry of the light-quark sea of the proton, is advocated to guide us for making a simple construction of the polarized parton distributions. We predict the spin dependent structure functions g1p(x) and g1n(x) in good agreement with EMC and SLAC data. The quark distributions involve some parameters whose values support well the hypothesis that the violation of the quark parton model sum rules is a consequence of the Pauli principle.
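
    A minimal sketch of the type of ansatz described (the specific parameterization used by the authors may differ): quark distributions of Fermi-Dirac shape in x and a Bose-Einstein shape for the gluon,

    $$x\,q(x) \;\propto\; \frac{A_q\,x^{b}}{\exp\!\big[(x-\tilde{x}_q)/\bar{x}\big]+1}, \qquad x\,g(x) \;\propto\; \frac{A_g\,x^{b_g}}{\exp\!\big(x/\bar{x}\big)-1},$$

    where the x̃_q play the role of flavour- and helicity-dependent "thermodynamic potentials" and x̄ that of a common "temperature"; the prefactors are slowly varying weight functions. The symbols here are assumptions for illustration only.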

  4. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    SciTech Connect

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W2 > 4 GeV2 and range in four-momentum transfer squared 2 < Q2 < 4 (GeV/c)2, and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, Pt2 < 0.2 (GeV/c)2. The invariant mass that goes undetected, Mx or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and Pt2 dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π+ and π-) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  5. Pion and kaon valence-quark parton distribution functions.

    SciTech Connect

    Nguyen, T.; Bashir, A.; Roberts, C. D.; Tandy, P. C.

    2011-06-16

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  6. Pion and kaon valence-quark parton distribution functions

    SciTech Connect

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-15

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  7. Pion and kaon valence-quark parton distribution functions

    NASA Astrophysics Data System (ADS)

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-01

    A rainbow-ladder truncation of QCD’s Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion’s u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  8. Geologic Framework Model (GFM2000)

    SciTech Connect

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. The grid spacing used in the

  9. Dicyanometallates as Model Extended Frameworks

    PubMed Central

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  10. Dicyanometallates as Model Extended Frameworks.

    PubMed

    Hill, Joshua A; Thompson, Amber L; Goodwin, Andrew L

    2016-05-11

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic-organic analogues of conventional ceramics, such as Ruddlesden-Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  11. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiv...

  12. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  13. Sequentially Executed Model Evaluation Framework

    Energy Science and Technology Software Center (ESTSC)

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  14. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  15. Sequentially Executed Model Evaluation Framework

    Energy Science and Technology Software Center (ESTSC)

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  16. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
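
    To illustrate the driver/controller pattern described above, here is a generic sketch only; the class names and API are assumptions for illustration and not the actual SeMe or CANARY-EDS interface:

    ```python
    from abc import ABC, abstractmethod

    class Driver(ABC):
        """Generic driver interface; input, model, and output drivers implement step()."""
        @abstractmethod
        def step(self, t, message):
            """Consume the upstream message for step t and return a message downstream."""

    class MovingAverageModel(Driver):
        """Toy sequential model: each evaluation combines prior results with new data."""
        def __init__(self, window=3):
            self.window, self.history = window, []

        def step(self, t, message):
            self.history = (self.history + [message])[-self.window:]
            return sum(self.history) / len(self.history)

    class BatchController:
        """Steps input -> model -> output drivers through a discrete (e.g. time) domain."""
        def __init__(self, source, model, sink):
            self.source, self.model, self.sink = source, model, sink

        def run(self, steps):
            for t in steps:
                value = self.source(t)               # input driver
                result = self.model.step(t, value)   # model driver
                self.sink(t, result)                 # output driver

    # usage: smooth a synthetic time series in batch mode
    controller = BatchController(source=lambda t: float(t % 5),
                                 model=MovingAverageModel(window=3),
                                 sink=lambda t, y: print(t, round(y, 2)))
    controller.run(range(10))
    ```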

  17. Pion valence-quark parton distribution function

    NASA Astrophysics Data System (ADS)

    Chang, Lei; Thomas, Anthony W.

    2015-10-01

    Within the Dyson-Schwinger equation formulation of QCD, a rainbow ladder truncation is used to calculate the pion valence-quark distribution function (PDF). The gap equation is renormalized at a typical hadronic scale, of order 0.5 GeV, which is also set as the default initial scale for the pion PDF. We implement a corrected leading-order expression for the PDF which ensures that the valence-quarks carry all of the pion's light-front momentum at the initial scale. The scaling behavior of the pion PDF at a typical partonic scale of order 5.2 GeV is found to be (1 - x)^ν, with ν ≃ 1.6, as x approaches one.

  18. Geologic Framework Model Analysis Model Report

    SciTech Connect

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  19. Reducing the invasiveness of modelling frameworks

    NASA Astrophysics Data System (ADS)

    Donchyts, G.; Baart, F.

    2010-12-01

    There are several modelling frameworks available that allow environmental models to exchange data with other models. Many efforts have been made in the past years promoting solutions aimed at integrating different numerical models with each other as well as at simplifying the way to set them up, entering the data, and running them. While the development of many modelling frameworks concentrated on the interoperability of different model engines, several standards were introduced, such as ESMF, OMS and OpenMI. One of the issues with applying modelling frameworks is invasiveness: the more the model has to know about the framework, the more intrusive it is. Another issue when applying modelling frameworks is that many environmental models are written in a procedural style in FORTRAN, which is one of the few languages that does not have a proper interface with other programming languages. Most modelling frameworks are written in object-oriented languages like Java/C#, and the modelling framework ESMF, written in FORTRAN, is also object oriented. In this research we show how the application of domain driven, object oriented development techniques to environmental models can reduce the invasiveness of modelling frameworks. Our approach is based on four different steps: 1) application of OO techniques and reflection to the existing model to allow introspection; 2) programming language interoperability between the model, written in a procedural programming language, and the modelling framework, written in an object-oriented programming language; 3) domain mapping between data types used by the model and other components being integrated; 4) connecting models using the framework (wrapper). We compare coupling of an existing model as-is with the same model adapted using the four-step approach. We connect both versions of the models using two different integrated modelling frameworks. As an example of a model we use the coastal morphological model XBeach. By adapting this model it allows for
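
    As a loose illustration of step 4 (hiding a procedural model engine behind an object-oriented wrapper that a framework can drive), a hypothetical sketch follows; the routine names are stand-ins and neither the real XBeach interface nor the authors' wrapper is reproduced here:

    ```python
    # Hypothetical stand-ins for procedural model routines (in reality these would be
    # Fortran subroutines reached through an interoperability layer).
    _state = {"t": 0.0, "bed_level": 0.0}

    def xb_init(config):                     # assumed name, illustrative only
        _state.update(t=0.0, bed_level=config.get("initial_bed_level", 0.0))

    def xb_step(dt):                         # assumed name, illustrative only
        _state["t"] += dt
        _state["bed_level"] -= 0.001 * dt    # toy morphological change

    class ModelWrapper:
        """Object-oriented facade that a modelling framework can introspect and drive."""
        def initialize(self, config):
            xb_init(config)

        def update(self, dt):
            xb_step(dt)

        def get_value(self, name):
            return _state[name]

    # a framework-side coupling loop would then only see the wrapper interface
    model = ModelWrapper()
    model.initialize({"initial_bed_level": 2.0})
    for _ in range(10):
        model.update(dt=60.0)
    print(model.get_value("bed_level"))
    ```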

  20. ATMOSPHERIC HEALTH EFFECTS FRAMEWORK (AHEF) MODEL

    EPA Science Inventory

    The Atmospheric and Health Effects Framework (AHEF) is used to assess the global impacts of substitutes for ozone-depleting substances (ODS). The AHEF is a series of FORTRAN modeling modules that collectively form a simulation framework for (a) translating ODS production into emi...

  1. The Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2011-01-01

    The G-DINA ("generalized deterministic inputs, noisy and gate") model is a generalization of the DINA model with more relaxed assumptions. In its saturated form, the G-DINA model is equivalent to other general models for cognitive diagnosis based on alternative link functions. When appropriate constraints are applied, several commonly used…

  2. Knowledge Encapsulation Framework for Collaborative Social Modeling

    SciTech Connect

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  3. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. The modeling information can also be exported to semantic web languages such

  4. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  5. Modelling Diffusion of a Personalized Learning Framework

    ERIC Educational Resources Information Center

    Karmeshu; Raman, Raghu; Nedungadi, Prema

    2012-01-01

    A new modelling approach for diffusion of personalized learning as an educational process innovation in social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…

  6. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, typically several assumptions such as stationarity are made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that can immediately and sensibly be applied to measured signals. We demonstrate its performance in a simulation study. In experiments of transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understand and diagnose numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.

  7. DANA: distributed numerical and adaptive modelling framework.

    PubMed

    Rougier, Nicolas P; Fix, Jérémy

    2012-01-01

    DANA is a python framework ( http://dana.loria.fr ) whose computational paradigm is grounded on the notion of a unit that is essentially a set of time dependent values varying under the influence of other units via adaptive weighted connections. The evolution of a unit's value is defined by a set of differential equations expressed in standard mathematical notation, which greatly eases their definition. The units are organized into groups that form a model. Each unit can be connected to any other unit (including itself) using a weighted connection. The DANA framework offers a set of core objects needed to design and run such models. The modeler only has to define the equations of a unit as well as the equations governing the training of the connections. The simulation is completely transparent to the modeler and is handled by DANA. This allows DANA to be used for a wide range of numerical and distributed models as long as they fit the proposed framework (e.g. cellular automata, reaction-diffusion system, decentralized neural networks, recurrent neural networks, kernel-based image processing, etc.). PMID:22994650
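
    A minimal sketch of the computational paradigm described, with unit values evolving under weighted connections via a user-supplied differential equation; this is illustrative Python only, not the actual DANA API:

    ```python
    import numpy as np

    # One "group" of units whose values V evolve as dV/dt = -V + W @ V.  The equation
    # and the (here static) weights are placeholders for the modeller-supplied
    # equations and learning rules described in the abstract.
    rng = np.random.default_rng(0)
    n = 5
    V = rng.random(n)                       # unit values
    W = 0.1 * rng.standard_normal((n, n))   # weighted connections between units

    dt = 0.01
    for _ in range(1000):                   # explicit Euler integration
        V = V + dt * (-V + W @ V)

    print(np.round(V, 4))
    ```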

  8. An evaluation framework for participatory modelling

    NASA Astrophysics Data System (ADS)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  9. A framework for benchmarking land models

    SciTech Connect

    Luo, Yiqi; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, Philippe; Dalmonech, D.; Fisher, J.B.; Fisher, R.; Friedlingstein, P.; Hibbard, Kathleen A.; Hoffman, F. M.; Huntzinger, Deborah; Jones, C.; Koven, C.; Lawrence, David M.; Li, D.J.; Mahecha, M.; Niu, S.L.; Norby, Richard J.; Piao, S.L.; Qi, X.; Peylin, P.; Prentice, I.C.; Riley, William; Reichstein, M.; Schwalm, C.; Wang, Y.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-09

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedbacks to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performance and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
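
    As an illustration of the kind of scoring metric such a framework might combine across processes, here is a hypothetical sketch (not the metric proposed in the paper); the threshold and weighting choices are assumptions:

    ```python
    import numpy as np

    def benchmark_score(model_output, benchmark, threshold):
        """Map the RMSE against a benchmark onto a 0-1 skill score.

        `threshold` plays the role of an a priori acceptable mismatch; a score near 1
        means the model-benchmark mismatch is well within that tolerance.
        """
        rmse = np.sqrt(np.mean((np.asarray(model_output, float) -
                                np.asarray(benchmark, float)) ** 2))
        return float(np.clip(1.0 - rmse / threshold, 0.0, 1.0))

    # combine per-process scores (e.g. productivity and energy fluxes) into one number
    scores = {"gpp": benchmark_score([1.1, 2.0, 2.9], [1.0, 2.2, 3.0], threshold=0.5),
              "latent_heat": benchmark_score([80, 95, 70], [85, 90, 75], threshold=20)}
    overall = sum(scores.values()) / len(scores)
    print(scores, round(overall, 2))
    ```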

  10. A Framework for Considering Comprehensibility in Modeling.

    PubMed

    Gleicher, Michael

    2016-06-01

    Comprehensibility in modeling is the ability of stakeholders to understand relevant aspects of the modeling process. In this article, we provide a framework to help guide exploration of the space of comprehensibility challenges. We consider facets organized around key questions: Who is comprehending? Why are they trying to comprehend? Where in the process are they trying to comprehend? How can we help them comprehend? How do we measure their comprehension? With each facet we consider the broad range of options. We discuss why taking a broad view of comprehensibility in modeling is useful in identifying challenges and opportunities for solutions. PMID:27441712

  11. A framework for modeling rail transport vulnerability

    SciTech Connect

    Peterson, Steven K; Church, Richard L.

    2008-01-01

    Railroads represent one of the most efficient methods of long-haul transport for bulk commodities, from coal to agricultural products. Over the past fifty years, the rail network has contracted while tonnage has increased. Service, geographically, has been abandoned along short haul routes and increased along major long haul routes, resulting in a network that is more streamlined. The current rail network may be very vulnerable to disruptions, like the failure of a trestle. This paper proposes a framework to model rail network vulnerability and gives an application of this modeling framework in analyzing rail network vulnerability for the State of Washington. It concludes with a number of policy related issues that need to be addressed in order to identify, plan, and mitigate the risks associated with the sudden loss of a bridge or trestle.

  12. Density Estimation Framework for Model Error Assessment

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Liu, Z.; Najm, H. N.; Safta, C.; VanBloemenWaanders, B.; Michelsen, H. A.; Bambha, R.

    2014-12-01

    In this work we highlight the importance of model error assessment in physical model calibration studies. Conventional calibration methods often assume the model is perfect and account for data noise only. Consequently, the estimated parameters typically have biased values that implicitly compensate for model deficiencies. Moreover, improving the amount and the quality of data may not improve the parameter estimates since the model discrepancy is not accounted for. In state-of-the-art methods model discrepancy is explicitly accounted for by enhancing the physical model with a synthetic statistical additive term, which allows appropriate parameter estimates. However, these statistical additive terms do not increase the predictive capability of the model because they are tuned for particular output observables and may even violate physical constraints. We introduce a framework in which model errors are captured by allowing variability in specific model components and parameterizations for the purpose of achieving meaningful predictions that are both consistent with the data spread and appropriately disambiguate model and data errors. Here we cast model parameters as random variables, embedding the calibration problem within a density estimation framework. Further, we calibrate for the parameters of the joint input density. The likelihood function for the associated inverse problem is degenerate, therefore we use Approximate Bayesian Computation (ABC) to build prediction-constraining likelihoods and illustrate the strengths of the method on synthetic cases. We also apply the ABC-enhanced density estimation to the TransCom 3 CO2 intercomparison study (Gurney, K. R., et al., Tellus, 55B, pp. 555-579, 2003) and calibrate 15 transport models for regional carbon sources and sinks given atmospheric CO2 concentration measurements.
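
    A compact sketch of the rejection flavour of Approximate Bayesian Computation referred to here (the generic algorithm, not the authors' specific prediction-constraining implementation; the toy problem and names are assumptions):

    ```python
    import numpy as np

    def abc_rejection(simulate, prior_sample, observed, distance, epsilon, n_draws=20000):
        """Keep parameter draws whose simulated data fall within epsilon of the observations."""
        accepted = []
        for _ in range(n_draws):
            theta = prior_sample()              # draw parameters from the prior
            simulated = simulate(theta)         # run the model (or a surrogate)
            if distance(simulated, observed) <= epsilon:
                accepted.append(theta)
        return np.array(accepted)               # approximate posterior samples

    # toy usage: infer the mean of a Gaussian with known unit variance
    rng = np.random.default_rng(1)
    observed = rng.normal(3.0, 1.0, size=50)
    posterior = abc_rejection(
        simulate=lambda mu: rng.normal(mu, 1.0, size=50),
        prior_sample=lambda: rng.uniform(0.0, 10.0),
        observed=observed,
        distance=lambda a, b: abs(a.mean() - b.mean()),
        epsilon=0.1,
    )
    print(len(posterior), posterior.mean() if len(posterior) else None)
    ```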

  13. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
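
    The entropic-inference step described can be sketched as the standard constrained maximization (notation assumed, not taken from the paper):

    $$\max_{\{p_i\}} \; S=-\sum_i p_i\ln p_i \quad\text{subject to}\quad \sum_i p_i=1,\;\; \sum_i p_i A_i^{(k)}=\langle A^{(k)}\rangle \;\;\Rightarrow\;\; p_i \propto \exp\Big(-\sum_k \lambda_k A_i^{(k)}\Big),$$

    where the Lagrange multipliers λ_k enforce the expected-value constraints; in the economic reading of the abstract, these multipliers play the role of endogenously generated prices.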

  14. Concordance: A Framework for Managing Model Integrity

    NASA Astrophysics Data System (ADS)

    Rose, Louis M.; Kolovos, Dimitrios S.; Drivalos, Nicholas; Williams, James R.; Paige, Richard F.; Polack, Fiona A. C.; Fernandes, Kiran J.

    A change to a software development artefact, such as source code or documentation, can affect the integrity of others. Many contemporary software development environments provide tools that automatically manage (detect, report and reconcile) integrity. For instance, incremental background compilation can reconcile object code with changing source code and report calls to a method that are inconsistent with its definition. Although models are increasingly first-class citizens in software development, contemporary development environments are less able to automatically detect, manage and reconcile the integrity of models than the integrity of other types of artefact. In this paper, we discuss the scalability and efficiency problems faced when managing model integrity for two categories of change that occur in MDE. We present a framework to support the incremental management of model integrity, evaluating the efficiency of the proposed approach atop Eclipse and EMF.

  15. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  16. A framework for multi-scale modelling

    PubMed Central

    Chopard, B.; Borgdorff, Joris; Hoekstra, A. G.

    2014-01-01

    We review a methodology to design, implement and execute multi-scale and multi-science numerical simulations. We identify important ingredients of multi-scale modelling and give a precise definition of them. Our framework assumes that a multi-scale model can be formulated in terms of a collection of coupled single-scale submodels. With concepts such as the scale separation map, the generic submodel execution loop (SEL) and the coupling templates, one can define a multi-scale modelling language which is a bridge between the application design and the computer implementation. Our approach has been successfully applied to an increasing number of applications from different fields of science and technology. PMID:24982249

  17. A learning framework for catchment erosion modelling

    NASA Astrophysics Data System (ADS)

    Freer, J. E.; Quinton, J.

    2006-12-01

    Erosion modelling at the catchment scale, like many other disciplines that model environmental signals, is not an exact science. We are limited by our incomplete perceptual understanding of relevant processes; in formulating and simplifying these perceptions into conceptual models; and by our ability to collect data at the right resolution and spatial scale to drive and evaluate our models effectively. The challenge is how to develop models which take into account our difficulties in describing processes, parameterising equations and demonstrating that they perform within acceptable limits. In this talk we will: examine how limited data has been used to develop algorithms applied across the world, and how this may lead to one source of prediction error; discuss the use of uncertainty analysis techniques for describing the possible suite of model predictions that give acceptable responses; explore what field observations can tell us about model performance, and how these might be used to constrain uncertainties in model predictions or in some cases contribute towards these uncertainties; and consider how we might learn from our data to produce models with an appropriate degree of complexity. We hope that the talk will begin a debate about our ability to capture the essence of erosional processes and quantities for storm responses through data and modelling that includes characterising the appropriate level of uncertainties. We will use examples from the literature as well as from our own observations and modelling initiatives. We hope to generate some lively discussion about the limits of our observations to both inform and to evaluate, to consider what the appropriate level of complexity should be for catchment scale erosion modelling and to consider ways to develop a learning framework for all erosion scientists to engage in.

  18. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  19. A Smallholder Socio-hydrological Modelling Framework

    NASA Astrophysics Data System (ADS)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of total farmland and belong to the poorest quartile yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of 6 main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, higher intra-annual variability in rainfall or variability in agricultural prices. It allows us to study sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve the debt of farmers are enough, and what the value is of investing in local storages that can buffer intra-annual variability in rainfall and of strengthening the safety nets, either by creating opportunities for alternative sources of income or by crop diversification.
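
    A deliberately simplified sketch of the kind of coupled state update such a framework implies; the variable names, coefficients, and adaptation rules here are illustrative assumptions, not the authors' model:

    ```python
    def step(state, rainfall, crop_price):
        """One annual update of a toy smallholder system (illustrative rules only)."""
        s = dict(state)
        s["storage"] = 0.5 * s["storage"] + rainfall      # local water storage
        harvest = min(s["storage"], 1.0) * 10.0           # water-limited yield
        s["capital"] += harvest * crop_price - s["expenditure"]
        if s["capital"] < 0:                              # rule-based adaptation:
            s["livestock"] = max(0, s["livestock"] - 1)   # sell livestock,
            s["capital"] += 5.0                           # cover part of the shortfall,
            s["expenditure"] *= 0.9                       # and cut expenditures
        return s

    state = {"storage": 1.0, "capital": 10.0, "livestock": 3, "expenditure": 8.0}
    for rain, price in zip([0.8, 0.3, 1.2], [1.0, 0.7, 1.1]):
        state = step(state, rain, price)
    print(state)
    ```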

  20. An Exploratory Investigation on the Invasiveness of Environmental Modeling Frameworks

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper provides initial results of an exploratory investigation on the invasiveness of environmental modeling frameworks. Invasiveness is defined as the coupling between application (i.e., model) and framework code used to implement the model. By comparing the implementation of an environmenta...

  1. Improving the physics models in the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Toth, G.; Fang, F.; Frazin, R. A.; Gombosi, T. I.; Ilie, R.; Liemohn, M. W.; Manchester, W. B.; Meng, X.; Pawlowski, D. J.; Ridley, A. J.; Sokolov, I.; van der Holst, B.; Vichare, G.; Yigit, E.; Yu, Y.; Buzulukova, N.; Fok, M. H.; Glocer, A.; Jordanova, V. K.; Welling, D. T.; Zaharia, S. G.

    2010-12-01

    The success of physics based space weather forecasting depends on several factors: we need sufficient amount and quality of timely observational data, we have to understand the physics of the Sun-Earth system well enough, we need sophisticated computational models, and the models have to run faster than real time on the available computational resources. This presentation will focus on a single ingredient, the recent improvements of the mathematical and numerical models in the Space Weather Modeling Framework. We have developed a new physics based CME initiation code using flux emergence from the convection zone solving the equations of radiative magnetohydrodynamics (MHD). Our new lower corona and solar corona models use electron heat conduction, Alfven wave heating, and boundary conditions based on solar tomography. We can obtain a physically consistent solar wind model from the surface of the Sun all the way to the L1 point without artificially changing the polytropic index. The global magnetosphere model can now solve the multi-ion MHD equations and take into account the oxygen outflow from the polar wind model. We have also added the options of solving for Hall MHD and anisotropic pressure. Several new inner magnetosphere models have been added to the framework: CRCM, HEIDI and RAM-SCB. These new models resolve the pitch angle distribution of the trapped particles. The upper atmosphere model GITM has been improved by including a self-consistent equatorial electrodynamics and the effects of solar flares. This presentation will very briefly describe the developments and highlight some results obtained with the improved and new models.

  2. Fermi-Dirac statistics plus liquid description of quark partons

    NASA Astrophysics Data System (ADS)

    Buccella, F.; Miele, G.; Migliore, G.; Tibullo, V.

    1995-12-01

    A previous approach with Fermi-Dirac distributions for fermion partons is here improved to comply with the expected low x behaviour of structure functions. We are so able to get a fair description of the unpolarized and polarized structure functions of the nucleons as well as of neutrino data. We cannot reach definite conclusions, but confirm our suspicion of a relationship between the defects in Gottfried and spin sum rules.

  3. PARCC Model Content Frameworks: Mathematics--Grades 3-11

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    As part of its proposal to the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) committed to developing model content frameworks for mathematics to serve as a bridge between the Common Core State Standards and the PARCC assessments. The PARCC Model Content Frameworks were developed through a…

  4. Critical Thinking: Frameworks and Models for Teaching

    ERIC Educational Resources Information Center

    Fahim, Mansoor; Eslamdoost, Samaneh

    2014-01-01

    Developing critical thinking since the educational revolution gave rise to flourishing movements toward embedding critical thinking (CT henceforth) stimulating classroom activities in educational settings. Nevertheless the process faced with complications such as teachability potentiality, lack of practical frameworks concerning actualization of…

  5. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  6. A modeling framework for resource-user-infrastructure systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.; Qubbaj, M.; Anderies, J. M.; Aggarwal, R.; Janssen, M.

    2012-12-01

    A compact modeling framework is developed to supplement a conceptual framework of coupled natural-human systems. The framework consists of four components: resource (R), users (U), public infrastructure (PI), and public infrastructure providers (PIP), the last two of which have not been adequately addressed in many existing modeling studies. The modeling approach employed here is a set of replicator equations describing the dynamical frequencies of social strategies (of U and PIP), whose payoffs are explicit and dynamical functions of biophysical components (R and PI). Model development and preliminary results from specific implementation will be reported and discussed.
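
    The replicator dynamics referred to take the standard form (a sketch; in the paper the payoffs additionally depend on the biophysical state, indicated here by R and PI):

    $$\dot{x}_i = x_i\,\big[\pi_i(\mathbf{x},R,PI) - \bar{\pi}(\mathbf{x},R,PI)\big], \qquad \bar{\pi}=\sum_j x_j\,\pi_j,$$

    where x_i is the frequency of social strategy i among users or infrastructure providers and π_i its payoff, so strategies with above-average payoff grow in frequency.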

  7. Field-theoretical description of deep inelastic scattering

    SciTech Connect

    Geyer, B.; Robaschik, D.; Wieczorek, E.

    1980-01-01

    The most important theoretical notions concerning deep inelastic scattering are reviewed. Topics discussed are the model-independent approach, which is based on the general principles of quantum field theory, the application of quantum chromodynamics to deep inelastic scattering, approaches based on the quark-parton model, the light cone algebra, and conformal invariance, and also investigations in the framework of perturbation theory.

  8. A traceability framework for diagnostics of global land models

    NASA Astrophysics Data System (ADS)

    Luo, Yiqi; Xia, Jianyang; Liang, Junyi; Jiang, Lifen; Shi, Zheng; KC, Manoj; Hararuk, Oleksandra; Rafique, Rashid; Wang, Ying-Ping

    2015-04-01

    The biggest impediment to model diagnostics and improvement at present is model intractability. The more processes incorporated, the more difficult it becomes to understand or evaluate model behavior. As a result, uncertainty in predictions among global land models cannot be easily diagnosed and attributed to its sources. We have recently developed an approach to analytically decompose a complex land model into traceable components based on mutually independent properties of modeled core biogeochemical processes. As all global land carbon models share those common properties, this traceability framework is applicable to all of them to improve their tractability. Indeed, we have applied the traceability framework to improve model diagnostics in several aspects. First, we have applied the framework to the Australian Community Atmosphere Biosphere Land Exchange (CABLE) model and the Community Land Model version 3.5 (CLM3.5) to identify the sources of the differences between these models. The major causes of their different predictions were found to be parameter settings related to carbon input and baseline residence times. Second, we have used the framework to diagnose the impacts of adding nitrogen processes to CABLE on its carbon simulation. Adding nitrogen processes not only reduces net primary production but also shortens residence times in the CABLE model. Third, the framework helps isolate components of CLM3.5 for data assimilation. Data assimilation with global land models has been computationally extremely difficult. By isolating traceable components, we have improved the parameterization of CLM3.4 related to soil organic decomposition, microbial kinetics and carbon use efficiency, and litter decomposition. Further, we are currently developing the traceability framework to analyze transient simulations of models that were involved in the Coupled Model Intercomparison Project Phase 5 (CMIP5) to improve our understanding of the parameter space of global carbon models. This

  9. A framework for modeling human evolution.

    PubMed

    Gintis, Herbert

    2016-01-01

    Culture-led gene-culture coevolution is a framework within which substantive explanations of human evolution must be located. It is not itself an explanation. Explanations depend on concrete historical evolutionary factors such as the control of fire, collective child-rearing, lethal weapon technology, altruistic cooperation and punishment, and the mastery of complex collaboration protocols leading to an effective division of social labor. PMID:27561218

  10. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  11. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    NASA Astrophysics Data System (ADS)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~ 50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as: additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines, increased spatial resolution, and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration to a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.

  12. A Computational Framework for Realistic Retina Modeling.

    PubMed

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas. PMID:27354192

  13. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  14. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

    We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model which includes the interaction among the financial assets, reflecting the market structure. Our study is on the cutting edge in the sense that it takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of the European call option with short-time memory noise.

  15. Towards a consistent modeling framework across scales

    NASA Astrophysics Data System (ADS)

    Jagers, B.

    2013-12-01

    The morphodynamic evolution of river-delta-coastal systems may be studied in detail to predict local, short-term changes or at a more aggregated level to indicate the net large scale, long-term effect. The whole spectrum of spatial and temporal scales needs to be considered for environmental impact studies. Usually this implies setting up a number of different models for different scales. Since the various models often use codes that have been independently developed by different researchers and include different formulations, it may be difficult to arrive at a consistent set of modeling results. This is one of the reasons why Deltares has taken on an effort to develop a consistent suite of model components that can be applied over a wide range of scales. The heart of this suite is formed by a flexible mesh flow component that supports mixed 1D-2D-3D domains, an equally flexible transport component with an expandable library of water quality and ecological processes, and a library of sediment transport and morphology routines that can be linked directly to the flow component or used as part of the process library. We will present the latest developments with a focus on the status of the sediment transport and morphology component for running consistent 1D, 2D and 3D models.

  16. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, using an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts under a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  17. Evolutionary Framework for Lepidoptera Model Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    “Model systems” are specific organisms upon which detailed studies have been conducted examining a fundamental biological question. If the studies are robust, their results can be extrapolated among an array of organisms that possess features in common with the subject organism. The true power of...

  18. Theoretical Tinnitus Framework: A Neurofunctional Model

    PubMed Central

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C. B.; Sani, Siamak S.; Ekhtiari, Hamed; Sanchez, Tanit G.

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the “sourceless” sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  19. Theoretical Tinnitus Framework: A Neurofunctional Model.

    PubMed

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  20. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
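
    To make the combination of SEQUAL-style criteria with a multicriteria score concrete, here is a rough sketch; the criteria, weights, ratings, and the simple weighted-sum aggregation are all illustrative assumptions rather than the method used in the paper:

      # Illustrative only: rank candidate process modelling languages by a
      # weighted sum of quality-criterion ratings (criteria and numbers invented).
      criteria_weights = {"expressiveness": 0.4, "comprehensibility": 0.35, "tool_support": 0.25}
      ratings = {
          "BPMN":   {"expressiveness": 8, "comprehensibility": 7, "tool_support": 9},
          "EPC":    {"expressiveness": 6, "comprehensibility": 8, "tool_support": 6},
          "UML AD": {"expressiveness": 7, "comprehensibility": 6, "tool_support": 8},
      }

      def weighted_score(language):
          return sum(w * ratings[language][c] for c, w in criteria_weights.items())

      scores = {lang: round(weighted_score(lang), 2) for lang in ratings}
      print(scores, "-> best:", max(scores, key=scores.get))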

  1. Mathematical models for biodegradation of chlorinated solvents. 1: Model framework

    SciTech Connect

    Zhang, X.; Banerji, S.; Bajpai, R.

    1996-12-31

    Complete mineralization of chlorinated solvents by microbial action has been demonstrated under aerobic as well as anaerobic conditions. In most cases, it is believed that the biodegradation is initiated by broad-specificity enzymes involved in metabolism of a primary substrate. Under aerobic conditions, some of the primary carbon and energy substrates are methane, propane, toluene, phenol, and ammonia; under anaerobic conditions, glucose, sucrose, acetate, propionate, isopropanol, methanol, and even natural organics act as the carbon source. Published biochemical studies suggest that the limiting step is often the initial part of the biodegradation pathway within the microbial system. For aerobic systems, the limiting step is thought to be the reaction catalyzed by mono- and dioxygenases, which are induced by most primary substrates, although some constitutive strains have been reported. Other critical features of the biodegradative pathway include: (1) activity losses of critical enzyme(s) through the action of metabolic byproducts, (2) energetic needs of contaminant biodegradation which must be met by catabolism of the primary substrates, (3) changes in metabolic patterns in mixed cultures found in nature depending on the availability of electron acceptors, and (4) the associated accumulation and disappearance of metabolic intermediates. Often, the contaminant pool itself consists of several chlorinated solvents with separate and interactive biochemical needs. The existing models address some of the issues mentioned above, but their ability to successfully predict the biological fate of chlorinated solvents in nature remains severely limited. Limiting step(s), inactivation of critical enzymes, recovery action, energetics, and a framework for multiple degradative pathways will be presented as a comprehensive model. 91 refs.
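
    The rate-limiting oxygenase step and the dependence on a primary substrate suggest Monod-type growth with cometabolic degradation of the solvent; the sketch below assumes such kinetics purely for illustration (none of the rate forms or parameter values come from this record):

      # Illustrative cometabolism sketch: primary substrate S supports biomass X,
      # which cometabolically degrades the chlorinated solvent C. All kinetics and
      # parameters are assumptions for illustration.
      def rates(S, C, X, mu_max=0.3, Ks=5.0, Y=0.5, kc=0.05, Kc=1.0, b=0.02):
          growth = mu_max * S / (Ks + S) * X      # Monod growth on the primary substrate
          dS = -growth / Y                        # substrate consumed for growth
          dC = -kc * C / (Kc + C) * X             # cometabolic solvent degradation
          dX = growth - b * X                     # biomass growth minus decay
          return dS, dC, dX

      S, C, X, dt = 50.0, 2.0, 1.0, 0.01
      for _ in range(20000):                      # simple explicit Euler integration
          dS, dC, dX = rates(S, C, X)
          S, C, X = max(S + dt * dS, 0.0), max(C + dt * dC, 0.0), X + dt * dX
      print(f"S={S:.2f}, C={C:.3f}, X={X:.2f}")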

  2. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.

  3. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention on the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and are applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  4. A Model Framework for Course Materials Construction. Third Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model framework for course materials construction is presented as an aid to Coast Guard course writers and coordinators, curriculum developers, and instructors who must modify a course or draft a new one. The model assumes that the instructor or other designated person has: (1) completed a task analysis which identifies the competencies, skills…

  5. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  6. Characteristics and Conceptual Framework of the Easy-Play Model

    ERIC Educational Resources Information Center

    Lu, Chunlei; Steele, Kyle

    2014-01-01

    The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…

  7. A National Modeling Framework for Water Management Decisions

    NASA Astrophysics Data System (ADS)

    Bales, J. D.; Cline, D. W.; Pietrowsky, R.

    2013-12-01

    The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU), "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management," to collaborate in activities that are supportive of their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporation of new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out their respective missions. The systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned and would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on science advances; modern technology in coupling model components and visualizing results; and a coupled atmospheric-hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for the organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of the North American Water Program, to contribute to the goals of the Program, and to be informed by the science and approaches developed as a part of the Program.

  8. Design of single object model of software reuse framework

    NASA Astrophysics Data System (ADS)

    Yan, Liu

    2011-12-01

    To fully realize the reuse potential of a software reuse framework, this paper analyzes in detail the single object model mentioned in the article "The overall design of software reuse framework" and distills it into three modes: an add/delete/modify mode, a check mode, and an integrated search/scroll/display mode. Each of the three modes corresponds to its own interface design template, class design, and database design concept. This modelling approach helps developers organize their thinking and speeds up development; even newcomers can complete development tasks easily.
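
    Read charitably, the three modes amount to reusable interface templates along the following lines; this is a hypothetical sketch, since the paper's actual class and database designs are not reproduced in this record:

      # Hypothetical Python rendering of the three single-object modes described above.
      class EditMode:                 # add / delete / modify mode
          def add(self, obj): ...
          def delete(self, obj_id): ...
          def modify(self, obj_id, **fields): ...

      class CheckMode:                # check (validation / review) mode
          def check(self, obj_id): ...

      class BrowseMode:               # integrated search / scroll / display mode
          def search(self, **criteria): ...
          def scroll(self, page, page_size=20): ...
          def display(self, obj_id): ...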

  9. A software engineering perspective on environmental modeling framework design: The object modeling system

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  10. A Type-Theoretic Framework for Certified Model Transformations

    NASA Astrophysics Data System (ADS)

    Calegari, Daniel; Luna, Carlos; Szasz, Nora; Tasistro, Álvaro

    We present a framework based on the Calculus of Inductive Constructions (CIC) and its associated tool, the Coq proof assistant, to allow certification of model transformations in the context of Model-Driven Engineering (MDE). The approach is based on a semi-automatic translation process from metamodels, models and transformations of the MDE technical space into types, propositions and functions of the CIC technical space. We describe this translation and illustrate its use in a standard case study.

  11. 3-D HYDRODYNAMIC MODELING IN A GEOSPATIAL FRAMEWORK

    SciTech Connect

    Bollinger, J; Alfred Garrett, A; Larry Koffman, L; David Hayes, D

    2006-08-24

    3-D hydrodynamic models are used by the Savannah River National Laboratory (SRNL) to simulate the transport of thermal and radionuclide discharges in coastal estuary systems. Development of such models requires accurate bathymetry, coastline, and boundary condition data in conjunction with the ability to rapidly discretize model domains and interpolate the required geospatial data onto the domain. To facilitate rapid and accurate hydrodynamic model development, SRNL has developed a pre- and post-processor application in a geospatial framework to automate the creation of models using existing data. This automated capability allows development of very detailed models to maximize exploitation of available surface water radionuclide sample data and thermal imagery.

  12. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    PubMed

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising to study in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown on a total of 25 models of patients. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery. PMID:26715210

  13. A computational framework for a database of terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

    Most terrestrial biosphere models consist of a set of coupled ordinary first-order differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models, using as an example the process of soil organic matter decomposition. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models have to be fed into a database consisting of simple text files with a common structure. Then they are read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, and internal and external fluxes. SymPy, a Python library for symbolic mathematics, also helps calculate the Jacobian matrix at given steady states, where available, and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots if appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
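
    As a concrete, deliberately tiny illustration of the kind of symbolic analysis described (the two-pool structure and symbol names are invented for the example and are not drawn from the database itself):

      # Two-pool linear carbon model: fast pool x1 receives input u and feeds a
      # slow pool x2. Pools, transfer fraction and turnover rates are invented.
      import sympy as sp

      x1, x2, u, k1, k2, a21 = sp.symbols("x1 x2 u k1 k2 a21", positive=True)
      dx1 = u - k1 * x1                      # input minus turnover of the fast pool
      dx2 = a21 * k1 * x1 - k2 * x2          # transferred carbon minus slow-pool turnover

      f = sp.Matrix([dx1, dx2])
      J = f.jacobian(sp.Matrix([x1, x2]))    # Jacobian of the pool system
      steady = sp.solve([dx1, dx2], [x1, x2], dict=True)[0]

      print(J)                               # Matrix([[-k1, 0], [a21*k1, -k2]])
      print(J.eigenvals())                   # {-k1: 1, -k2: 1}
      print(steady)                          # {x1: u/k1, x2: a21*u/k2}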

  14. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    USGS Publications Warehouse

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.
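
    How a modest number of interchangeable components multiplies into many candidate structures can be illustrated by simple enumeration; the component and option names below are placeholders, not FUSE's actual decision set:

      # Illustrative enumeration of model structures from interchangeable components.
      from itertools import product

      components = {
          "upper_layer":    ["single_state", "tension_storage", "cascading_buckets"],
          "lower_layer":    ["fixed_capacity", "unlimited_linear", "unlimited_nonlinear"],
          "percolation":    ["field_capacity_excess", "lower_zone_demand"],
          "surface_runoff": ["saturated_area", "infiltration_excess"],
      }

      structures = list(product(*components.values()))
      print(len(structures), "candidate structures, e.g.:", structures[0])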

  15. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 1.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for grade one language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print awareness;…

  16. Language Arts Curriculum Framework: Sample Curriculum Model, Grade K.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for kindergarten language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print…

  17. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    ERIC Educational Resources Information Center

    Redish, Edward F.

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the…

  18. S-factor in the framework of the "shadow" model

    SciTech Connect

    Scalia, A.

    1995-02-05

    The S(E) factor is obtained in the framework of the "shadow" model. The analytical expression of the S(E) function is not compatible with an S-factor which is a slowly varying function of the energy. © 1995 American Institute of Physics.

  19. The BMW Model: A New Framework for Teaching Monetary Economics

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  20. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  1. Model framework for describing the dynamics of evolving networks

    NASA Astrophysics Data System (ADS)

    Tobochnik, Jan; Strandburg, Katherine; Csardi, Gabor; Erdi, Peter

    2007-03-01

    We present a model framework for describing the dynamics of evolving networks. In this framework the addition of edges is stochastically governed by some important intrinsic and structural properties of network vertices through an attractiveness function. We discuss the solution of the inverse problem: determining the attractiveness function from the network evolution data. We also present a number of example applications: we describe the US patent citation network using vertex degree, patent age and patent category variables, and we show how the time-dependent version of the method can be used to find and describe important changes in the internal dynamics. We also compare our results to scientific citation networks.
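
    A minimal sketch of the stochastic edge-addition rule driven by an attractiveness function follows; the functional form and the vertex properties used are assumptions for illustration, whereas the paper determines the attractiveness function from network evolution data:

      # Toy citation-style network growth: each new vertex attaches to existing
      # vertices with probability proportional to an attractiveness function of
      # degree and age. The functional form is an assumption for illustration.
      import random

      def attractiveness(degree, age, beta=1.0, tau=20.0):
          return (degree + 1) ** beta * 2.0 ** (-age / tau)   # rich-get-richer times aging

      def grow(n_vertices=200, edges_per_vertex=2, seed=1):
          random.seed(seed)
          degree, birth, edges = [1, 1], [0, 0], [(0, 1)]
          for t in range(2, n_vertices):
              weights = [attractiveness(degree[v], t - birth[v]) for v in range(t)]
              targets = set(random.choices(range(t), weights=weights, k=edges_per_vertex))
              for v in targets:
                  edges.append((t, v))
                  degree[v] += 1
              degree.append(len(targets))
              birth.append(t)
          return edges, degree

      edges, degree = grow()
      print(len(edges), "edges; maximum degree:", max(degree))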

  2. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  3. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    NASA Astrophysics Data System (ADS)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by corresponding link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
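
    A hedged sketch of the GMRF ingredient described above, building a sparse first-order precision matrix on a regular grid; the precision parameters are illustrative, and the INLA-based inference itself is not reproduced:

      # Sparse precision matrix of a first-order GMRF (conditional autoregressive
      # structure) on an n-by-n grid; kappa and tau are illustrative parameters.
      import numpy as np
      import scipy.sparse as sp

      def gmrf_precision(n=20, kappa=0.1, tau=1.0):
          """Q = tau * (kappa * I + graph Laplacian of the 4-neighbour grid)."""
          N = n * n
          idx = np.arange(N).reshape(n, n)
          rows, cols = [], []
          for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:
              rows += [a.ravel(), b.ravel()]
              cols += [b.ravel(), a.ravel()]
          rows, cols = np.concatenate(rows), np.concatenate(cols)
          adjacency = sp.coo_matrix((np.ones(rows.size), (rows, cols)), shape=(N, N))
          degree = sp.diags(np.asarray(adjacency.sum(axis=1)).ravel())
          laplacian = degree - adjacency
          return tau * (kappa * sp.identity(N) + laplacian)

      Q = gmrf_precision()
      print(Q.shape, "nonzeros:", Q.nnz)   # sparse, hence cheap to factorize for inference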

  4. Compendium of models from a gauge U(1) framework

    NASA Astrophysics Data System (ADS)

    Ma, Ernest

    2016-06-01

    A gauge U(1) framework was established in 2002 to extend the supersymmetric Standard Model. It has many possible realizations. Whereas all have the necessary and sufficient ingredients to explain the possible 750 GeV diphoton excess, observed recently by the ATLAS and CMS Collaborations at the Large Hadron Collider (LHC), they differ in other essential aspects. A compendium of such models is discussed.

  5. A framework for modeling uncertainty in regional climate change (Invited)

    NASA Astrophysics Data System (ADS)

    Monier, E.; Gao, X.; Scott, J. R.; Sokolov, A. P.; Schlosser, C. A.

    2013-12-01

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework are the emissions projections (using different climate policies), the climate system response (represented by different values of climate sensitivity and net aerosol forcing), natural variability (by perturbing initial conditions) and structural uncertainty (using different climate models). The modeling framework revolves around the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model with an intermediate complexity earth system model (with a two-dimensional zonal-mean atmosphere). Regional climate change over the United States is obtained through a two-pronged approach. First, we use the IGSM-CAM framework which links the IGSM to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Secondly, we use a pattern-scaling method that extends the IGSM zonal mean based on climate change patterns from various climate models. Results show that uncertainty in temperature changes is mainly driven by policy choices and the range of climate sensitivity considered. Meanwhile, the four sources of uncertainty contribute more equally to precipitation changes, with natural variability having a large impact in the first part of the 21st century. Overall, the choice of policy is the largest driver of uncertainty in future projections of climate change over the United States. In light of these results, we recommend that when investigating climate change impacts over specific regions, studies consider all four sources of uncertainty analyzed in this paper.

  6. An enhanced BSIM modeling framework for self-heating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.

  7. Possibilities: A framework for modeling students' deductive reasoning in physics

    NASA Astrophysics Data System (ADS)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  8. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions and model results. The basic principle is to incorporate measurement information into a model with the aim of improving model results by error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated into hydrological models. The assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (time & space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open-source standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data during runtime, thus facilitating the interactions between models and data sources. The interface is flexible enough so that models can interact even if the model is coded in a different language, represent
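
    To make the create/propagate/get/set/finalize cycle concrete, here is a hypothetical stub that mimics the described interaction pattern; it is not the actual OpenDA or OpenMI API, and the crude nudging update stands in for a proper Kalman-type assimilation step:

      # Hypothetical model wrapper illustrating the interaction cycle described
      # above (not the real OpenDA or OpenMI interfaces).
      class ToyHydroModel:
          def __init__(self, storage=10.0, recession=0.05):
              self.storage, self.recession = storage, recession

          def propagate(self, rainfall):
              self.storage += rainfall
              discharge = self.recession * self.storage
              self.storage -= discharge
              return discharge

          def get_state(self):
              return {"storage": self.storage}

          def set_state(self, state):
              self.storage = state["storage"]

          def finalize(self):
              pass

      def nudge(model, observed_discharge, gain=0.5):
          # Move storage so the modelled discharge shifts toward the observation.
          simulated = model.recession * model.storage
          model.storage += gain * (observed_discharge - simulated) / model.recession

      model = ToyHydroModel()
      for rain, observed in [(5.0, 0.9), (0.0, 0.7), (2.0, 0.8)]:
          model.propagate(rain)
          nudge(model, observed)
      model.finalize()
      print(model.get_state())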

  9. A new framework for an electrophotographic printer model

    NASA Astrophysics Data System (ADS)

    Colon-Lopez, Fermin A.

    Digital halftoning is a printing technology that creates the illusion of continuous tone images for printing devices such as electrophotographic printers that can only produce a limited number of tone levels. Digital halftoning works because the human visual system has limited spatial resolution which blurs the printed dots of the halftone image, creating the gray sensation of a continuous tone image. Because the printing process is imperfect it introduces distortions to the halftone image. The quality of the printed image depends, among other factors, on the complex interactions between the halftone image, the printer characteristics, the colorant, and the printing substrate. Printer models are used to assist in the development of new types of halftone algorithms that are designed to withstand the effects of printer distortions. For example, model-based halftone algorithms optimize the halftone image through an iterative process that integrates a printer model within the algorithm. The two main goals of a printer model are to provide accurate estimates of the tone and of the spatial characteristics of the printed halftone pattern. Various classes of printer models, from simple tone calibrations to complex mechanistic models, have been reported in the literature. Existing models have one or more of the following limiting factors: they only predict tone reproduction, they depend on the halftone pattern, they require complex calibrations or complex calculations, they are printer specific, they reproduce unrealistic dot structures, and they are unable to adapt responses to new data. The two research objectives of this dissertation are (1) to introduce a new framework for printer modeling and (2) to demonstrate the feasibility of such a framework in building an electrophotographic printer model. The proposed framework introduces the concept of modeling a printer as a texture transformation machine. The basic premise is that modeling the texture differences between the

  10. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  11. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing methods for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI-source cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we can construct relationship links among geographic features distributed across diverse VGI platforms by using linked data modeling methods, then deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to an RDF linked data model is presented to guarantee a consistent data representation model among different online social geographic data sources. We propose a mixed strategy that combines spatial distance similarity and feature name attribute similarity as the measure for comparing and matching geographic features in various VGI data sets. Our work also focuses on how to apply Markov logic networks to interlink the same entities across different VGI-based linked data sets. The automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a large geographic linked data network across loosely coupled VGI web sites. Experiments built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
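
    A minimal sketch of the mixed matching strategy (spatial proximity combined with name similarity); the weights, distance scale, and acceptance threshold are assumptions for illustration, not values from the paper:

      # Illustrative feature-matching score combining spatial proximity and
      # name similarity; weights, scale and threshold are assumed.
      import math
      from difflib import SequenceMatcher

      def haversine_km(lon1, lat1, lon2, lat2):
          lon1, lat1, lon2, lat2 = map(math.radians, (lon1, lat1, lon2, lat2))
          a = (math.sin((lat2 - lat1) / 2) ** 2
               + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
          return 6371.0 * 2 * math.asin(math.sqrt(a))

      def match_score(feat_a, feat_b, w_space=0.6, w_name=0.4, scale_km=0.5):
          d = haversine_km(*feat_a["lonlat"], *feat_b["lonlat"])
          spatial_sim = math.exp(-d / scale_km)                  # 1 when co-located
          name_sim = SequenceMatcher(None, feat_a["name"].lower(),
                                     feat_b["name"].lower()).ratio()
          return w_space * spatial_sim + w_name * name_sim

      osm = {"name": "Central Station", "lonlat": (114.057, 22.543)}
      wiki = {"name": "Central Railway Station", "lonlat": (114.058, 22.544)}
      print(round(match_score(osm, wiki), 3))   # treat as the same feature above ~0.8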

  12. An Intercomparison of 2-D Models Within a Common Framework

    NASA Technical Reports Server (NTRS)

    Weisenstein, Debra K.; Ko, Malcolm K. W.; Scott, Courtney J.; Jackman, Charles H.; Fleming, Eric L.; Considine, David B.; Kinnison, Douglas E.; Connell, Peter S.; Rotman, Douglas A.; Bhartia, P. K. (Technical Monitor)

    2002-01-01

    A model intercomparison among the Atmospheric and Environmental Research (AER) 2-D model, the Goddard Space Flight Center (GSFC) 2-D model, and the Lawrence Livermore National Laboratory 2-D model allows us to separate differences due to model transport from those due to the model's chemical formulation. This is accomplished by constructing two hybrid models incorporating the transport parameters of the GSFC and LLNL models within the AER model framework. By comparing the results from the native models (e.g., AER and GSFC) with those from the hybrid model (e.g., AER chemistry with GSFC transport), differences due to chemistry and transport can be identified. For the analysis, we examined an inert tracer whose emission pattern is based on emission from a High Speed Civil Transport (HSCT) fleet; distributions of trace species in the 2015 atmosphere; and the response of stratospheric ozone to an HSCT fleet. Differences in NO(y) in the upper stratosphere are found between models with identical transport, implying different model representations of atmospheric chemical processes. The response of O3 concentration to HSCT aircraft emissions differs in the models both from transport-dominated differences in the HSCT-induced perturbations of H2O and NO(y) and from differences in the model representations of O3 chemical processes. The model formulations of cold polar processes are found to be the most significant factor in creating large differences in the calculated ozone perturbations.

  13. A Structural Model Decomposition Framework for Systems Health Management

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
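
    A toy illustration of the general idea of extracting a local submodel from a global model follows: it computes the dependency closure of one measured output over a hypothetical (cascaded) three-tank dependency graph. This is only a sketch of the concept, not the structural decomposition algorithms developed in the paper.

```python
def submodel_for(output, depends_on):
    """Return the set of variables needed to compute `output`,
    given a map variable -> set of variables it directly depends on."""
    needed, stack = set(), [output]
    while stack:
        v = stack.pop()
        if v in needed:
            continue
        needed.add(v)
        stack.extend(depends_on.get(v, ()))
    return needed

# Hypothetical cascaded three-tank example: levels h1..h3, inflow u_in, pressure sensors p1/p3.
depends_on = {
    "h1": {"u_in"},
    "h2": {"h1"},
    "h3": {"h2"},
    "p1": {"h1"},
    "p3": {"h3"},
}
print(sorted(submodel_for("p1", depends_on)))   # small local submodel for the tank-1 residual
print(sorted(submodel_for("p3", depends_on)))   # the tank-3 residual needs the whole cascade
```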

  14. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and did not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  15. A unifying framework for marine ecological model comparison

    NASA Astrophysics Data System (ADS)

    Fennel, Wolfgang; Osborn, Thomas

    2005-05-01

    The complex network of the marine food chain, with all the details of the behavior of individuals and the interactions with physical processes, cannot be included in one generic model. Modelling requires simplification and idealization. The reduction of complex problems to simpler, but tractable, problems is guided by the questions being addressed. Consequently, a variety of different models have been developed with different choices of state variables, process formulations, and different degrees of physical control. In the last decade a multitude of studies were based on biogeochemical models, population models, and individual-based models. There are now models available that cover the full range from individual-based models, to population models, to biomass models, to combinations thereof. The biological model components are linked to physical models ranging from 1d water column models to full 3d general circulation models. This paper attempts to develop a unifying theoretical framework that elucidates the relationships among the different classes of models. The theory is based on state densities, which characterize individuals in an abstract phase space. Integration of the state densities over spatial or biological variables relates population densities, abundance or biomass to individuals.
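
    One way to write down the integration step mentioned in the last sentence is sketched below in generic notation (not necessarily the paper's own symbols): a state density over physical space and biological state is integrated to recover the familiar aggregate quantities.

```latex
% n(t, x, s): density of individuals over physical position x and biological state s
% (e.g. individual mass); m(s): mass of an individual in state s.
\begin{align}
  N(t)               &= \int_{V}\!\int_{S} n(t,\mathbf{x},\mathbf{s})\,\mathrm{d}\mathbf{s}\,\mathrm{d}\mathbf{x}
                        && \text{(total abundance)} \\
  B(t)               &= \int_{V}\!\int_{S} m(\mathbf{s})\, n(t,\mathbf{x},\mathbf{s})\,\mathrm{d}\mathbf{s}\,\mathrm{d}\mathbf{x}
                        && \text{(total biomass)} \\
  \rho(t,\mathbf{x}) &= \int_{S} n(t,\mathbf{x},\mathbf{s})\,\mathrm{d}\mathbf{s}
                        && \text{(population density in space)}
\end{align}
```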

  16. Common and Innovative Visuals: A sparsity modeling framework for video.

    PubMed

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-01

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework as CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model. PMID:24808407
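
    The common/innovative decomposition can be illustrated with the toy sketch below, which uses a pixel-wise median as a crude stand-in for the joint sparse estimation described in the abstract; the synthetic "scene" is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scene: a static background plus a small bright block that moves between frames.
background = rng.uniform(0.2, 0.4, size=(32, 32))
frames = []
for t in range(5):
    f = background.copy()
    f[4 + 3 * t : 8 + 3 * t, 10:14] += 0.5
    frames.append(f)
frames = np.stack(frames)

# "Common frame": visual content shared by all frames (median is a crude surrogate
# for the compressed-sensing estimate used in CIV).
common = np.median(frames, axis=0)

# "Innovative frames": what each frame adds on top of the common content; for a
# mostly static scene these residuals are sparse.
innovations = frames - common
active = (np.abs(innovations) > 0.1).mean(axis=(1, 2))
print("fraction of active pixels per innovative frame:", np.round(active, 3))
```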

  17. A framework for modeling contaminant impacts on reservoir water quality

    NASA Astrophysics Data System (ADS)

    Jeznach, Lillian C.; Jones, Christina; Matthews, Thomas; Tobiason, John E.; Ahlfeld, David P.

    2016-06-01

    This study presents a framework for using hydrodynamic and water quality models to understand the fate and transport of potential contaminants in a reservoir and to develop appropriate emergency response and remedial actions. In the event of an emergency situation, prior detailed modeling efforts and scenario evaluations allow for an understanding of contaminant plume behavior, including maximum concentrations that could occur at the drinking water intake and contaminant travel time to the intake. A case study assessment of the Wachusett Reservoir, a major drinking water supply for metropolitan Boston, MA, provides an example of an application of the framework and how hydrodynamic and water quality models can be used to quantitatively and scientifically guide management in response to varieties of contaminant scenarios. The model CE-QUAL-W2 was used to investigate the water quality impacts of several hypothetical contaminant scenarios, including hypothetical fecal coliform input from a sewage overflow as well as an accidental railway spill of ammonium nitrate. Scenarios investigated the impacts of decay rates, season, and inter-reservoir transfers on contaminant arrival times and concentrations at the drinking water intake. The modeling study highlights the importance of a rapid operational response by managers to contain a contaminant spill in order to minimize the mass of contaminant that enters the water column, based on modeled reservoir hydrodynamics. The development and use of hydrodynamic and water quality models for surface drinking water sources subject to the potential for contaminant entry can provide valuable guidance for making decisions about emergency response and remediation actions.

  18. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  19. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

    Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. Each element of this framework is modular so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of FRTM are available to simulate the brightness temperature of snow; two models are available to model the physical evolution of the snowpack and underlying soil, and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models, physical or statistical, can be accommodated, too. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.

  20. A constitutive model for magnetostriction based on thermodynamic framework

    NASA Astrophysics Data System (ADS)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling from the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rates of the magnetostrictive strain and the magnetization are derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is able to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.

  1. Hybrid automata as a unifying framework for modeling excitable cells.

    PubMed

    Ye, P; Entcheva, E; Smolka, S A; True, M R; Grosu, R

    2006-01-01

    We propose hybrid automata (HA) as a unifying framework for computational models of excitable cells. HA, which combine discrete transition graphs with continuous dynamics, can be naturally used to obtain a piecewise, possibly linear, approximation of a nonlinear excitable-cell model. We first show how HA can be used to efficiently capture the action-potential morphology--as well as reproduce typical excitable-cell characteristics such as refractoriness and restitution--of the dynamic Luo-Rudy model of a guinea-pig ventricular myocyte. We then recast two well-known computational models, Biktashev's and Fenton-Karma, as HA without any loss of expressiveness. Given that HA possess an intuitive graphical representation and are supported by a rich mathematical theory and numerous analysis tools, we argue that they are well positioned as a computational model for biological processes. PMID:17947070
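
    A minimal hybrid-automaton sketch in the spirit described above is given below: four discrete modes, each with linear voltage dynamics, and guard conditions that switch modes. The thresholds, rates, and stimulus are invented for illustration and do not correspond to the Luo-Rudy, Biktashev, or Fenton-Karma parameters.

```python
# Piecewise-linear excitable-cell toy: dv/dt = a*v + b in each mode, plus guards.
MODES = {
    # mode:      (a,     b)
    "rest":      (-0.5,   0.0),
    "upstroke":  ( 0.0, 300.0),
    "plateau":   ( 0.0, -20.0),
    "recovery":  (-2.0,   0.0),
}

def step(mode, v, stim, dt=1e-3):
    a, b = MODES[mode]
    v += dt * (a * v + b + stim)
    # Guard conditions trigger discrete mode transitions.
    if mode == "rest" and v > 10.0:
        mode = "upstroke"
    elif mode == "upstroke" and v > 100.0:
        mode = "plateau"
    elif mode == "plateau" and v < 80.0:
        mode = "recovery"
    elif mode == "recovery" and v < 1.0:
        mode = "rest"
    return mode, v

mode, v, trace = "rest", 0.0, []
for k in range(6000):
    stim = 50.0 if k < 500 else 0.0      # brief stimulus current
    mode, v = step(mode, v, stim)
    trace.append(v)
print(mode, round(max(trace), 1))        # back at rest after one action-potential-like excursion
```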

  2. Velo: A Knowledge Management Framework for Modeling and Simulation

    SciTech Connect

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  3. Development of a distributed air pollutant dry deposition modeling framework.

    PubMed

    Hirabayashi, Satoshi; Kroll, Charles N; Nowak, David J

    2012-12-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO(2)), sulfur dioxide (SO(2)), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. PMID:22858662
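
    The per-cell bookkeeping behind such a system reduces to the standard dry-deposition relation (flux = deposition velocity times concentration); the sketch below uses invented values for the deposition velocity, concentration, cell size, and tree cover, and is not the i-Tree Eco calculation itself.

```python
# Hourly dry deposition of NO2 to the tree canopy of one 30 m grid cell.
HOURS = 24
vd_no2 = 0.004                  # deposition velocity [m/s] (illustrative)
conc_no2 = [40e-9] * HOURS      # ambient NO2 concentration [kg/m^3] (~40 ug/m^3, illustrative)
tree_cover = 0.27               # canopy fraction of the cell (illustrative)
cell_area = 30.0 * 30.0         # [m^2]

# F = Vd * C, accumulated over the day and scaled by canopy-covered area.
flux_per_m2 = sum(vd_no2 * c * 3600.0 for c in conc_no2)     # kg m^-2 day^-1
removal = flux_per_m2 * cell_area * tree_cover               # kg day^-1 for this cell
print(f"NO2 removed by trees in the cell: {removal * 1e3:.2f} g/day")
```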

  4. The ontology model of FrontCRM framework

    NASA Astrophysics Data System (ADS)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is rather on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes are possible. The change agenda can then be directed to individual departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM was developed as a framework to guide the identification of CRM-related business processes based on a strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to find CRM software features related to those practices.

  5. A General Framework for Multiphysics Modeling Based on Numerical Averaging

    NASA Astrophysics Data System (ADS)

    Lunati, I.; Tomin, P.

    2014-12-01

    In recent years, multiphysics (hybrid) modeling has attracted increasing attention as a tool to bridge the gap between pore-scale processes and a continuum description at the meter-scale (laboratory scale). This approach is particularly appealing for complex nonlinear processes, such as multiphase flow, reactive transport, density-driven instabilities, and geomechanical coupling. We present a general framework that can be applied to all these classes of problems. The method is based on ideas from the Multiscale Finite-Volume method (MsFV), which was originally developed for Darcy-scale applications. Recently, we have reformulated MsFV starting with a local-global splitting, which allows us to retain the original degree of coupling for the local problems and to use spatiotemporal adaptive strategies. The new framework is based on the simple idea that different characteristic temporal scales are inherited from different spatial scales, and the global and the local problems are solved with different temporal resolutions. The global (coarse-scale) problem is constructed based on a numerical volume-averaging paradigm, and a continuum (Darcy-scale) description is obtained by introducing additional simplifications (e.g., by assuming that pressure is the only independent variable at the coarse scale, we recover an extended Darcy's law). We demonstrate that it is possible to adaptively and dynamically couple the Darcy-scale and the pore-scale descriptions of multiphase flow in a single conceptual and computational framework. Pore-scale problems are solved only in the active front region where fluid distribution changes with time. In the rest of the domain, only a coarse description is employed. This framework can be applied to other important problems such as reactive transport and crack propagation. As it is based on a numerical upscaling paradigm, our method can be used to explore the limits of validity of macroscopic models and to illuminate the meaning of

  6. PyCatch: catchment modelling in the PCRaster framework

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Lana-Renault, Noemí; Schmitz, Oliver

    2015-04-01

    PCRaster is an open source software framework for the construction and execution of stochastic, spatio-temporal, forward models. It provides a large number of spatial operations on raster maps, with an emphasis on operations that are capable of transporting material (water, sediment) over a drainage network. These operations have been written in C++ and are provided to the model builder as Python functions. Models are constructed by combining these functions in a Python script. To ease implementation of models that use time steps and Monte Carlo iterations, the software comes with a Python framework providing control flow for temporal modelling and Monte Carlo simulation, including options for Bayesian data assimilation (Ensemble Kalman Filter, Particle Filter). A sophisticated visualization tool is provided capable of visualizing, animating, and exploring stochastic, spatio-temporal input or model output data. PCRaster is used for the construction of, for instance, hydrological models (hillslope to global scale), land use change models, and geomorphological models. It is still being improved upon, for instance by adding under-the-hood functionality for executing models on multiple CPU cores, and by adding components for agent-based and network simulation. The software runs in MS Windows and Linux and is available at http://www.pcraster.eu. We provide an extensive set of online course materials (partly available free of charge). Using the PCRaster software framework, we recently developed the PyCatch model components for hydrological modelling and land degradation modelling at catchment scale. The PyCatch components run at time steps of seconds to weeks, and grid cell sizes of approximately 1-100 m, which can be selected depending on the case study for which PyCatch is used. Hydrological components currently implemented include classes for simulation of incoming solar radiation, evapotranspiration (Penman-Monteith), surface storage, infiltration (Green and Ampt
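
    The "combine map operations in a Python script, driven by a temporal framework" pattern can be imitated without PCRaster as below; plain numbers stand in for raster maps, and the initial()/dynamic() structure mirrors, but does not reproduce, the control flow that the PCRaster framework provides.

```python
# Schematic imitation of the dynamic-modelling pattern: an initial() section run
# once and a dynamic() section run every time step (real PCRaster would supply
# raster maps and spatial operators such as routing over a drainage network).
class ToyCatchmentModel:
    def initial(self):
        self.storage = 0.0      # lumped surface store [mm]
        self.k = 0.1            # linear-reservoir outflow coefficient [1/step]

    def dynamic(self, rain):
        infiltration = min(rain, 2.0)          # crude capacity-limited infiltration
        self.storage += rain - infiltration
        discharge = self.k * self.storage      # linear-reservoir outflow
        self.storage -= discharge
        return discharge

def run(model, forcing):
    """Minimal stand-in for the framework's time-step driver."""
    model.initial()
    return [model.dynamic(r) for r in forcing]

rain = [0, 5, 12, 3, 0, 0, 8, 0, 0, 0]         # mm per time step (made up)
print([round(q, 2) for q in run(ToyCatchmentModel(), rain)])
```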

  7. Generalized framework for context-specific metabolic model extraction methods

    PubMed Central

    Robaina Estévez, Semidán; Nikoloski, Zoran

    2014-01-01

    Genome-scale metabolic models (GEMs) are increasingly applied to investigate the physiology not only of simple prokaryotes, but also eukaryotes, such as plants, characterized with compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence has indicated that only a subset of these reactions is active in a given context, including: developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella and provides the means to better understand their functioning, highlight similarities and differences, and to help users in selecting a most suitable method for an application. PMID:25285097
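
    The shared idea behind these extraction methods, keeping only the reactions supported by context-specific data, can be caricatured as below; the reactions, expression values, and threshold are invented, and real methods (e.g. GIMME- or iMAT-like formulations) solve optimization problems rather than applying a simple cutoff.

```python
# Keep the reactions of a genome-scale model whose genes clear an expression threshold.
reactions = {
    "HEX1":  ["gene_hk1"],
    "PFK":   ["gene_pfk"],
    "PDH":   ["gene_pdha1"],
    "ACONT": ["gene_aco2"],
}
expression = {"gene_hk1": 120.0, "gene_pfk": 95.0, "gene_pdha1": 4.0, "gene_aco2": 60.0}

def extract_context_model(reactions, expression, threshold=10.0):
    keep = {}
    for rxn, genes in reactions.items():
        # A reaction stays active if any associated gene is expressed above threshold.
        if any(expression.get(g, 0.0) >= threshold for g in genes):
            keep[rxn] = genes
    return keep

print(sorted(extract_context_model(reactions, expression)))   # PDH is dropped in this context
```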

  8. Modeling air pollution in the Tracking and Analysis Framework (TAF)

    SciTech Connect

    Shannon, J.D.

    1998-12-31

    The Tracking and Analysis Framework (TAF) is a set of interactive computer models for integrated assessment of the Acid Rain Provisions (Title IV) of the 1990 Clean Air Act Amendments. TAF is designed to execute in minutes on a personal computer, thereby making it feasible for a researcher or policy analyst to examine quickly the effects of alternate modeling assumptions or policy scenarios. Because the development of TAF involves researchers in many different disciplines, TAF has been given a modular structure. In most cases, the modules contain reduced-form models that are based on more complete models exercised off-line. The structure of TAF as of December 1996 is shown. The Atmospheric Pathways Module produces estimates for regional air pollution variables.

  9. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    NASA Astrophysics Data System (ADS)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices. It puts emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model which intends to coordinate the development of application-oriented services and protocols in a consistent and modular way. It enables the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are, moreover, implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  10. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.

  11. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    PubMed Central

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a method for in-season forecasts of epidemics using a semiparametric empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to some other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
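
    The curve-generation idea ("modified versions of data from previous seasons") can be sketched as below; the historical curves, transformation ranges, and summary are invented, and the actual framework weights such candidates against current surveillance data to form a posterior.

```python
import random

# Candidate epidemic curves built by shifting, rescaling, and adding noise to past seasons.
history = [
    [1, 2, 4, 8, 12, 9, 5, 2, 1],
    [1, 1, 3, 6, 10, 14, 10, 4, 2],
]

def candidate_curve(rng):
    base = list(rng.choice(history))
    shift = rng.randint(-1, 1)                  # peak a week earlier or later
    scale = rng.uniform(0.7, 1.3)               # milder or stronger season
    shifted = base[-shift:] + base[:-shift] if shift else base
    return [max(0.0, scale * v + rng.gauss(0.0, 0.3)) for v in shifted]

rng = random.Random(1)
curves = [candidate_curve(rng) for _ in range(1000)]
peaks = sorted(max(c) for c in curves)
print("median candidate peak height:", round(peaks[len(peaks) // 2], 1))
```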

  12. A visual interface for the SUPERFLEX hydrological modelling framework

    NASA Astrophysics Data System (ADS)

    Gao, H.; Fenicia, F.; Kavetski, D.; Savenije, H. H. G.

    2012-04-01

    The SUPERFLEX framework is a modular modelling system for conceptual hydrological modelling at the catchment scale. This work reports the development of a visual interface for the SUPERFLEX model. The interface aims to enhance communication between hydrologic experimentalists and modelers, in particular bridging the gap between soft field data and the modeler's knowledge. In collaboration with field experimentalists, modelers can visually and intuitively hypothesize different model architectures and combinations of reservoirs, select from a library of constitutive functions to describe the relationship between a reservoir's storage and discharge, specify the shape of lag functions and, finally, set parameter values. The software helps hydrologists take advantage of any existing insights into the study site, translate them into a conceptual hydrological model and implement it within a computationally robust algorithm. This tool also helps challenge and contrast competing paradigms such as the "uniqueness of place" vs "one model fits all". Using this interface, hydrologists can test different hypotheses and model representations, and step by step build a deeper understanding of the watershed of interest.
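
    A single building block of the kind the interface lets a user assemble, one reservoir with a power-law storage-discharge function routed through a short lag function, is sketched below; the parameter values and forcing are invented and the code is not part of SUPERFLEX.

```python
def simulate(precip, k=0.05, alpha=1.5, lag=(0.5, 0.3, 0.2)):
    """One conceptual reservoir: Q = k * S**alpha, routed through a lag function."""
    storage, outflow = 0.0, []
    for p in precip:
        storage += p
        q = min(k * storage ** alpha, storage)   # constitutive storage-discharge relation
        storage -= q
        outflow.append(q)
    # Convolve the reservoir outflow with the lag weights (routing delay).
    routed = [sum(outflow[t - i] * w for i, w in enumerate(lag) if t - i >= 0)
              for t in range(len(outflow))]
    return routed

rain = [0, 10, 20, 5, 0, 0, 0, 0]                # mm per time step (made up)
print([round(q, 2) for q in simulate(rain)])
```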

  13. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

    Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. Methods The openEHR project has developed a clinical model-driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel. Moreover, C# and Java implementations have been developed as references. While scripting languages have become more popular in recent years because of their higher development efficiency and speed, they had not been used in openEHR implementations. From 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems, which is in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework, which can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  14. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  15. Modeling QCD for Hadron Physics

    NASA Astrophysics Data System (ADS)

    Tandy, P. C.

    2011-10-01

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons and in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective in which the quark condensate is contained within hadrons and not the vacuum is mentioned. The valence quark parton distributions, in the pion and kaon, as measured in the Drell Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  16. Service-Oriented Approach to Coupling Earth System Models and Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Saint, K. D.; Ercan, M. B.; Briley, L. J.; Murphy, S.; You, H.; DeLuca, C.; Rood, R. B.

    2012-12-01

    Modeling water systems often requires coupling models across traditional Earth science disciplinary boundaries. While there has been significant effort within various Earth science disciplines (e.g., atmospheric science, hydrology, and Earth surface dynamics) to create models and, more recently, modeling frameworks, there has been less work on methods for coupling across discipline-specific models and modeling frameworks. We present work investigating one possible method for coupling across discipline-specific Earth system models and modeling frameworks: service-oriented architectures. In a service-oriented architecture, models act as distinct units or components within a system and are designed to pass well-defined messages to consumers of the service. While the approach offers the potential to couple heterogeneous computational models by allowing a high degree of autonomy across models of the Earth system, there are significant scientific and technical challenges to be addressed when coupling models designed for different communities and built for different modeling frameworks. We have addressed some of these challenges through a case study where we coupled a hydrologic model compliant with the OpenMI standard with an atmospheric model compliant with the ESMF standard. In this case study, the two models were coupled through data exchanges of boundary conditions enabled by exposing the atmospheric model as a web service. A discussion of the technical and scientific challenges, some that we have addressed and others that remain open, will be presented, including differences in computer architectures, data semantics, and spatial scales between the coupled models.

  17. LQCD workflow execution framework: Models, provenance and fault-tolerance

    NASA Astrophysics Data System (ADS)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic that enables a dynamic management design that reduces the manual administrative workload and increases cluster productivity.

  18. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  19. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  20. A Graph Based Framework to Model Virus Integration Sites.

    PubMed

    Fronza, Raffaele; Vasciaveo, Alessandro; Benso, Alfredo; Schmidt, Manfred

    2016-01-01

    With next-generation sequencing, thousands of virus and viral vector integration genome targets are now under investigation to uncover specific integration preferences and to define clusters of integration, termed common integration sites (CIS), that may make it possible to assess gene therapy safety or to detect disease-related genomic features such as oncogenes. Here, we addressed the challenges of: 1) defining the notion of CIS on graph models, 2) demonstrating that the structure of CIS falls into the category of scale-free networks, and 3) showing that our network approach analyzes CIS dynamically in an integrated systems biology framework using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset. PMID:27257470

  1. A Framework and Model for Evaluating Clinical Decision Support Architectures

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

    In this paper, we develop a four-phase model for evaluating architectures for clinical decision support that focuses on: defining a set of desirable features for a decision support architecture; building a proof-of-concept prototype; demonstrating that the architecture is useful by showing that it can be integrated with existing decision support systems; and comparing its coverage to that of other architectures. We apply this framework to several well-known decision support architectures, including Arden Syntax, GLIF, SEBASTIAN, and SAGE. PMID:18462999

  2. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, which cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for the large-scale simulation (~10^8 sites) of complex collective behavior of numerous cells (~10^6).
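
    The "Monte Carlo lattice update" ingredient of the CPM can be illustrated by the minimal serial sketch below (adhesion energy only, no volume constraint, invented temperature and lattice size); the MPI/OpenMP parallelization described in the paper is not reproduced.

```python
import math
import random

L, T = 20, 1.0
random.seed(0)
# Each lattice site carries a cell id; the energy counts mismatched neighbour pairs.
lattice = [[random.randint(0, 3) for _ in range(L)] for _ in range(L)]

def neighbours(x, y):
    return [((x + dx) % L, (y + dy) % L) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def local_energy(x, y, cell_id):
    return sum(1 for nx, ny in neighbours(x, y) if lattice[nx][ny] != cell_id)

for _ in range(50000):                                # Monte Carlo copy attempts
    x, y = random.randrange(L), random.randrange(L)
    nx, ny = random.choice(neighbours(x, y))
    old, new = lattice[x][y], lattice[nx][ny]
    if old == new:
        continue
    d_e = local_energy(x, y, new) - local_energy(x, y, old)
    if d_e <= 0 or random.random() < math.exp(-d_e / T):   # Metropolis acceptance
        lattice[x][y] = new

boundary = sum(local_energy(x, y, lattice[x][y]) for x in range(L) for y in range(L)) // 2
print("boundary length proxy after relaxation:", boundary)
```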

  3. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  4. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    PubMed

    Alford, Rebecca F; Koehler Leman, Julia; Weitzner, Brian D; Duran, Amanda M; Tilley, Drew C; Elazar, Assaf; Gray, Jeffrey J

    2015-09-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  5. A unifying modeling framework for highly multivariate disease mapping.

    PubMed

    Botella-Rocamora, P; Martinez-Beneito, M A; Banerjee, S

    2015-04-30

    Multivariate disease mapping refers to the joint mapping of multiple diseases from regionally aggregated data and continues to be the subject of considerable attention for biostatisticians and spatial epidemiologists. The key issue is to map multiple diseases accounting for any correlations among themselves. Recently, Martinez-Beneito (2013) provided a unifying framework for multivariate disease mapping. While attractive in that it colligates a variety of existing statistical models for mapping multiple diseases, this and other existing approaches are computationally burdensome and preclude the multivariate analysis of moderate to large numbers of diseases. Here, we propose an alternative reformulation that accrues substantial computational benefits enabling the joint mapping of tens of diseases. Furthermore, the approach subsumes almost all existing classes of multivariate disease mapping models and offers substantial insight into the properties of statistical disease mapping models. PMID:25645551

  6. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    SciTech Connect

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work that is being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  7. The Application of Architecture Frameworks to Modelling Exploration Operations Costs

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2006-01-01

    Developments in architectural frameworks and system-of-systems thinking have provided useful constructs for systems engineering. DoDAF concepts, language, and formalisms, in particular, provide a natural way of conceptualizing an operations cost model applicable to NASA's space exploration vision. Not all DoDAF products have meaning or apply to a DoDAF inspired operations cost model, but this paper describes how such DoDAF concepts as nodes, systems, and operational activities relate to the development of a model to estimate exploration operations costs. The paper discusses the specific implementation to the Mission Operations Directorate (MOD) operational functions/activities currently being developed and presents an overview of how this powerful representation can apply to robotic space missions as well.

  8. Using the Mead model as a framework for nursing care.

    PubMed

    Edwards, S L

    1992-12-01

    A model of nursing has no valid purpose unless it serves nurses to help make their nursing better (Fawcett, 1989). The Mead model formed the basis for nursing care of Jason, a young patient who sustained a head injury, a puncture wound and lacerations to his face, in the study presented here. Examination of the Mead Model of nursing is followed by an account of why this model was used in preference to others as a framework for Jason's care. Three components of his nursing care--wound care, communication, involvement of relatives--are discussed in relation to both the model and current knowledge. It was concluded that as a structured way of planning and giving care, the Mead model lacks adequate guidelines. A less experienced nurse using the Mead model may overlook certain aspects of care, an experienced nurse may use his/her knowledge to give high standard care using research-based information. However, models need to be tested so they may be rejected or modified as guidelines for care in this case in the United Kingdom, within a welfare-orientated society. PMID:1483020

  9. Sensor management using a new framework for observation modeling

    NASA Astrophysics Data System (ADS)

    Kolba, Mark P.; Collins, Leslie M.

    2009-05-01

    In previous work, a sensor management framework has been developed that manages a suite of sensors in a search for static targets within a grid of cells. This framework has been studied for binary, non-binary, and correlated sensor observations, and the sensor manager was found to outperform a direct search technique with each of these different types of observations. Uncertainty modeling for both binary and non-binary observations has also been studied. In this paper, a new observation model is introduced that is motivated by the physics of static target detection problems such as landmine detection and unexploded ordnance (UXO) discrimination. The new observation model naturally accommodates correlated sensor observations and models both the correlation that occurs between observations made by different sensors and the correlation that occurs between observations made by the same sensor. Uncertainty modeling is also implicitly incorporated into the observation model because the underlying parameters of the target and clutter cells are allowed to vary and are not assumed to be constant across target cells and across clutter cells. Sensor management is then performed by maximizing the expected information gain that is made with each new sensor observation. The performance of the sensor manager is examined through performance evaluation with real data from the UXO discrimination application. It is demonstrated that the sensor manager is able to provide comparable detection performance to a direct search strategy using fewer sensor observations than direct search. It is also demonstrated that the sensor manager is able to ignore features that are uninformative to the discrimination problem.
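
    The "maximize expected information gain" selection rule can be sketched for a single cell and two hypothetical binary sensors as below; the detection and false-alarm probabilities are invented, and the correlated, non-binary observation model of the paper is not reproduced.

```python
import math

sensors = {"EMI": (0.90, 0.30), "magnetometer": (0.70, 0.10)}   # (P(alarm|target), P(alarm|clutter))

def entropy(p):
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def posterior(prior, pd, pfa, alarm):
    # Bayes update of the target probability given an alarm / no-alarm observation.
    lt = pd if alarm else 1.0 - pd
    lc = pfa if alarm else 1.0 - pfa
    return prior * lt / (prior * lt + (1.0 - prior) * lc)

def expected_gain(prior, pd, pfa):
    p_alarm = prior * pd + (1.0 - prior) * pfa
    h_after = (p_alarm * entropy(posterior(prior, pd, pfa, True))
               + (1.0 - p_alarm) * entropy(posterior(prior, pd, pfa, False)))
    return entropy(prior) - h_after            # expected entropy reduction

prior = 0.2
gains = {s: round(expected_gain(prior, *pars), 3) for s, pars in sensors.items()}
print(gains, "->", max(gains, key=gains.get))  # observe with the most informative sensor next
```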

  10. DEVELOP MULTI-STRESSOR, OPEN ARCHITECTURE MODELING FRAMEWORK FOR ECOLOGICAL EXPOSURE FROM SITE TO WATERSHED SCALE

    EPA Science Inventory

    A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...

  11. Receptor modeling application framework for particle source apportionment.

    PubMed

    Watson, John G; Zhu, Tan; Chow, Judith C; Engelbrecht, Johann; Fujita, Eric M; Wilson, William E

    2002-12-01

    Receptor models infer contributions from particulate matter (PM) source types using multivariate measurements of particle chemical and physical properties. Receptor models complement source models that estimate concentrations from emissions inventories and transport meteorology. Enrichment factor, chemical mass balance, multiple linear regression, eigenvector, edge detection, neural network, aerosol evolution, and aerosol equilibrium models have all been used to solve particulate air quality problems, and more than 500 citations of their theory and application document these uses. While elements, ions, and carbons were often used to apportion TSP, PM10, and PM2.5 among many source types, many of these components have been reduced in source emissions such that more complex measurements of carbon fractions, specific organic compounds, single-particle characteristics, and isotopic abundances are now needed in source and receptor samples. Compliance monitoring networks are not usually designed to obtain data for the observables, locations, and time periods that allow receptor models to be applied. Measurements from existing networks can be used to form conceptual models that allow the needed monitoring network to be optimized. The framework for using receptor models to solve air quality problems consists of: (1) formulating a conceptual model; (2) identifying potential sources; (3) characterizing source emissions; (4) obtaining and analyzing ambient PM samples for major components and source markers; (5) confirming source types with multivariate receptor models; (6) quantifying source contributions with the chemical mass balance; (7) estimating profile changes and the limiting precursor gases for secondary aerosols; and (8) reconciling receptor modeling results with source models, emissions inventories, and receptor data analyses. PMID:12492167
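
    Step (6), quantifying source contributions with the chemical mass balance, amounts to solving C = F s for the contributions s given source profiles F; the sketch below uses invented profiles and ordinary least squares with a non-negativity clip in place of the effective-variance solution used in practice.

```python
import numpy as np

species = ["OC", "EC", "SO4", "Fe"]
F = np.array([               # columns: vehicle exhaust, coal combustion, soil dust (invented)
    [0.45, 0.10, 0.05],
    [0.25, 0.02, 0.01],
    [0.05, 0.55, 0.02],
    [0.01, 0.03, 0.30],
])
true_s = np.array([8.0, 5.0, 3.0])                                  # ug/m^3
C = F @ true_s + np.random.default_rng(0).normal(0.0, 0.05, size=len(species))

s, *_ = np.linalg.lstsq(F, C, rcond=None)     # least-squares fit of source contributions
s = np.clip(s, 0.0, None)                     # crude non-negativity constraint
print(dict(zip(["vehicles", "coal", "dust"], np.round(s, 2))))
```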

  12. Modeling the spectral solar irradiance in the SOTERIA Project Framework

    NASA Astrophysics Data System (ADS)

    Vieira, Luis Eduardo; Dudok de Wit, Thierry; Kretzschmar, Matthieu; Cessateur, Gaël

    The evolution of the radiative energy input is a key element in understanding the variability of the Earth's neutral and ionized atmospheric components. However, reliable observations are limited to the last few decades, when observations from above the Earth's atmosphere became possible. These observations have provided insights into the variability of the spectral solar irradiance on time scales from days to years, but there are still large uncertainties in the evolution on time scales from decades to centuries. Here we discuss the physics-based modeling of the ultraviolet solar irradiance under development in the Solar-Terrestrial Investigations and Archives (SOTERIA) project framework. In addition, we compare the modeled solar emission with the variability observed by the LYRA instrument onboard the Proba2 spacecraft.

  13. Modeling of Active Transmembrane Transport in a Mixture Theory Framework

    PubMed Central

    Ateshian, Gerard A.; Morrison, Barclay; Hung, Clark T.

    2010-01-01

    This study formulates governing equations for active transport across semi-permeable membranes within the framework of the theory of mixtures. In mixture theory, which models the interactions of any number of fluid and solid constituents, a supply term appears in the conservation of linear momentum to describe momentum exchanges among the constituents. In past applications, this momentum supply was used to model frictional interactions only, thereby describing passive transport processes. In this study, it is shown that active transport processes, which impart momentum to solutes or solvent, may also be incorporated in this term. By projecting the equation of conservation of linear momentum along the normal to the membrane, a jump condition is formulated for the mechano-electrochemical potential of fluid constituents which is generally applicable to nonequilibrium processes involving active transport. The resulting relations are simple and easy to use, and address an important need in the membrane transport literature. PMID:20213212
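
    For orientation, the standard mixture-theory momentum balance that the abstract builds on can be written as below (generic notation, not necessarily the paper's); the paper's point is that the momentum supply term may carry an active, pump-driven part in addition to frictional drag.

```latex
% Momentum balance for constituent \alpha (solid, solvent, or solute), with
% \hat{\mathbf{p}}^{\alpha} the momentum supplied by the other constituents.
\begin{equation}
  \rho^{\alpha} \frac{D^{\alpha}\mathbf{v}^{\alpha}}{Dt}
    = \operatorname{div}\boldsymbol{\sigma}^{\alpha}
      + \rho^{\alpha}\mathbf{b}^{\alpha}
      + \hat{\mathbf{p}}^{\alpha},
  \qquad
  \sum_{\alpha} \hat{\mathbf{p}}^{\alpha} = \mathbf{0},
  \qquad
  \hat{\mathbf{p}}^{\alpha}
    = \hat{\mathbf{p}}^{\alpha}_{\text{friction}}
      + \hat{\mathbf{p}}^{\alpha}_{\text{active}} .
\end{equation}
```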

  14. An epidemiological framework for modelling fungicide dynamics and control.

    PubMed

    Castle, Matthew D; Gilligan, Christopher A

    2012-01-01

    Defining appropriate policies for controlling the spread of fungal disease in agricultural landscapes requires appropriate theoretical models. Most existing models for the fungicidal control of plant diseases do not explicitly include the dynamics of the fungicide itself, nor do they consider the impact of infection occurring during the host growth phase. We introduce a modelling framework for fungicide application that allows us to consider how "explicit" modelling of fungicide dynamics affects the invasion and persistence of plant pathogens. Specifically, we show that "explicit" models exhibit bistability zones for values of the basic reproductive number (R0) less than one within which the invasion and persistence threshold depends on the initial infection levels. This is in contrast to classical models where invasion and persistence thresholds are solely dependent on R0. In addition if initial infection occurs during the growth phase then an additional "invasion zone" can exist for even smaller values of R0. Within this region the system will experience an epidemic that is not able to persist. We further show that ideal fungicides with high levels of effectiveness, low rates of application and low rates of decay lead to the existence of these bistability zones. The results are robust to the inclusion of demographic stochasticity. PMID:22899992
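
    A toy "explicit fungicide" model of the kind contrasted with classical ones above is sketched below: an SIR-type host with a decaying fungicide pool that suppresses transmission. Parameters, the dosing scheme, and the functional form are invented and do not include the host growth phase analysed in the paper.

```python
def simulate(beta=0.6, mu=0.1, decay=0.2, dose=1.0, eff=0.9,
             s0=0.99, i0=0.01, dt=0.01, t_end=100.0):
    """Forward-Euler integration of susceptible (s), infected (i) host tissue
    and fungicide concentration (f) with explicit fungicide dynamics."""
    s, i, f, t = s0, i0, dose, 0.0
    while t < t_end:
        infection = beta * s * i / (1.0 + eff * f)   # fungicide suppresses transmission
        s -= dt * infection
        i += dt * (infection - mu * i)
        f -= dt * decay * f                          # explicit fungicide decay
        t += dt
    return s, i, f

print(tuple(round(v, 3) for v in simulate()))            # with a fungicide dose
print(tuple(round(v, 3) for v in simulate(dose=0.0)))    # without fungicide
```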

  15. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using optimization ideas based on qualitative and quantitative features. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results about the selection of catalyst descriptors that match a target performance and the use of HTE data for refining fundamentals-based models will be discussed.
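
    The step of selecting physically admissible parameters from among multiple solutions to the estimation problem can be illustrated with a bounded least-squares fit; the rate law, data, and bounds below are placeholders, not the CO-oxidation models actually screened by the authors.

    ```python
    # Sketch: kinetic parameter estimation with physical (positivity) constraints.
    # The rate law, data, and bounds are illustrative placeholders.
    import numpy as np
    from scipy.optimize import least_squares

    p_co = np.array([0.1, 0.2, 0.5, 1.0, 2.0])            # CO partial pressure (arbitrary units)
    rate_obs = np.array([0.05, 0.08, 0.12, 0.13, 0.12])   # observed rate (arbitrary units)

    def residuals(theta):
        k, K = theta                                       # rate constant, adsorption constant
        rate_model = k * K * p_co / (1.0 + K * p_co) ** 2  # Langmuir-Hinshelwood-type form
        return rate_model - rate_obs

    # Physical constraint: both parameters must remain positive.
    fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([1e-8, 1e-8], [np.inf, np.inf]))
    print("estimated k, K:", fit.x)
    ```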

  16. A framework for quantifying net benefits of alternative prognostic models.

    PubMed

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G

    2012-01-30

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. PMID:21905066
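
    As a deliberately simplified illustration of expressing net benefit in life years (ignoring the time-to-event modelling, cross-validation, and competing-risk adjustments the authors describe), one can compare two risk models by the expected life years gained when treatment is allocated above a fixed risk threshold; every number below is invented.

    ```python
    # Toy sketch: life-years gained when treatment is allocated by a risk model.
    # All quantities (risks, relative risk reduction, life-years per event) are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    true_risk = rng.beta(2, 18, size=n)                            # "true" 10-year risk
    model_a = np.clip(true_risk + rng.normal(0, 0.02, n), 0, 1)    # more accurate model
    model_b = np.clip(true_risk + rng.normal(0, 0.06, n), 0, 1)    # noisier model

    threshold = 0.10           # treat if predicted risk >= 10%
    rrr = 0.25                 # assumed relative risk reduction from treatment
    ly_per_event = 5.0         # assumed life-years lost per event

    def net_benefit_life_years(predicted_risk):
        treated = predicted_risk >= threshold
        events_prevented = (true_risk * rrr * treated).sum()
        return events_prevented * ly_per_event / n                 # life-years gained per person

    print("model A:", net_benefit_life_years(model_a))
    print("model B:", net_benefit_life_years(model_b))
    ```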

  17. A Categorical Framework for Model Classification in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation have been among the first and most successful examples of triggering computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology lead to a rather mixed reputation of models in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study. We thus focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seem highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  18. Quasi-3D Multi-scale Modeling Framework Development

    NASA Astrophysics Data System (ADS)

    Arakawa, A.; Jung, J.

    2008-12-01

    When models are truncated in or near an energetically active range of the spectrum, model physics must be changed as the resolution changes. The model physics of GCMs and that of CRMs are, however, quite different from each other and at present there is no unified formulation of model physics that automatically provides transition between these model physics. The Quasi-3D (Q3D) Multi-scale Modeling Framework (MMF) is an attempt to bridge this gap. Like the recently proposed Heterogeneous Multiscale Method (HMM) (E and Engquist 2003), MMF combines a macroscopic model, GCM, and a microscopic model, CRM. Unlike traditional multiscale methods such as the multi-grid and adaptive mesh refinement techniques, HMM and MMF are for solving multi-physics problems. They share the common objective "to design combined macroscopic-microscopic computational methods that are much more efficient than solving the full microscopic model and at the same time give the information we need" (E et al. 2008). The question is then how to meet this objective in practice, which can be highly problem dependent. In HMM, the efficiency is typically gained by localization of the microscale problem. Following the pioneering work by Grabowski and Smolarkiewicz (1999) and Grabowski (2001), MMF takes advantage of the fact that 2D CRMs are reasonably successful in simulating deep clouds. In this approach, the efficiency is gained by sacrificing the three-dimensionality of cloud-scale motion. It also "localizes" the algorithm by embedding a CRM in each GCM grid box using cyclic boundary conditions. The Q3D MMF is an attempt to reduce the expense due to these constraints by partially including the cloud-scale 3D effects and extending the CRM beyond individual GCM grid boxes. As currently formulated, the Q3D MMF is a 4D estimation/prediction framework that combines a GCM with a 3D anelastic cloud-resolving vector vorticity equation model (VVM) applied to a network of horizontal grids. The network

  19. Proposed framework for thermomechanical life modeling of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. Some experimental data exist for assessing the plausibility of the proposed

  20. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  1. Assessment of solution uncertainties in single-column modeling frameworks

    SciTech Connect

    Hack, J.J.; Pedretti, J.A.

    2000-01-15

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  2. Modelling grain growth in the framework of Rational Extended Thermodynamics

    NASA Astrophysics Data System (ADS)

    Kertsch, Lukas; Helm, Dirk

    2016-05-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena.
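
    For reference, the classical growth law of Hillert (1965) that the abstract describes its model as extending is usually written in the textbook form below (notation here is generic, not necessarily that of the authors): the radius $R_i$ of an individual grain grows or shrinks according to how it compares with the critical radius $R_{\mathrm{cr}}$.

    ```latex
    % Classical Hillert grain-growth law (textbook form, generic notation):
    % M is the grain-boundary mobility and \gamma the grain-boundary energy.
    \[
      \frac{\mathrm{d}R_i}{\mathrm{d}t} \;=\; M\,\gamma\left(\frac{1}{R_{\mathrm{cr}}} - \frac{1}{R_i}\right)
    \]
    ```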

  3. Sensor models and a framework for sensor management

    NASA Astrophysics Data System (ADS)

    Gaskell, Alex P.; Probert, Penelope J.

    1993-08-01

    We describe the use of Bayesian belief networks and decision theoretic principles for sensor management in multi-sensor systems. This framework provides a way of representing sensory data and choosing actions under uncertainty. The work considers how to distribute functionality between sensors and the controller. Use is made of logical sensors based on complementary physical sensors to provide information at the task level of abstraction represented within the network. We are applying these methods in the area of low level planning in mobile robotics. A key feature of the work is the development of quantified models to represent diverse sensors, in particular the sonar array and infra-red triangulation sensors we use on our AGV. We need to develop a model which can handle these very different sensors but provides a common interface to the sensor management process. We do this by quantifying the uncertainty through probabilistic models of the sensors, taking into account their physical characteristics and interaction with the expected environment. Modelling the sensor characteristics to an appropriate level of detail has the advantage of giving more accurate and robust mapping between the physical and logical sensor, as well as a better understanding of environmental dependency and its limitations. We describe a model of a sonar array, which explicitly takes into account features such as beam-width and ranging errors, and its integration into the sensor management process.
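
    A probabilistic range-sensor model of the kind described, with explicit beam-width and ranging-error terms, could be sketched as below; the Gaussian forms and parameter values are assumptions for illustration, not the authors' calibrated sonar model.

    ```python
    # Sketch: likelihood of a measured sonar range given a true range and bearing,
    # with beam-width attenuation and Gaussian ranging error (illustrative forms only).
    import numpy as np

    def sonar_likelihood(measured_range, true_range, bearing_rad,
                         beam_halfwidth_rad=0.2, range_sigma=0.05):
        # Sensitivity falls off away from the beam axis: simple Gaussian roll-off in bearing.
        beam_weight = np.exp(-0.5 * (bearing_rad / beam_halfwidth_rad) ** 2)
        # Ranging error modelled as zero-mean Gaussian about the true range.
        err = measured_range - true_range
        range_pdf = np.exp(-0.5 * (err / range_sigma) ** 2) / (range_sigma * np.sqrt(2.0 * np.pi))
        return beam_weight * range_pdf

    print(sonar_likelihood(measured_range=1.02, true_range=1.00, bearing_rad=0.05))
    ```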

  4. Extreme Precipitation in a Multi-Scale Modeling Framework

    NASA Astrophysics Data System (ADS)

    Phillips, M.; Denning, S.; Arabi, M.

    2015-12-01

    Extreme precipitation events are characterized by infrequent but large-magnitude accumulations that generally occur on scales below those resolved by the typical Global Climate Model. The Multi-scale Modeling Framework allows information about precipitation on these scales to be simulated for long periods of time without the large computational resources required for the use of a full cloud-permitting model. The Community Earth System Model was run for 30 years in both its MMF and GCM modes, and the annual maximum series of 24-hour precipitation accumulations were used to estimate the parameters of statistical distributions. The annual maxima generated from model output were then fit to a Generalized Extreme Value distribution and evaluated against observations. These results indicate that the MMF produces extreme precipitation with a statistical distribution that closely resembles that of observations, motivating the continued use of the MMF for analysis of extreme precipitation, and shows an improvement over the traditional GCM. The improvement in statistical distributions of annual maxima is greatest in regions that are dominated by convective precipitation, where the small-scale information provided by the MMF heavily influences precipitation processes.
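
    The statistical step described here, fitting a Generalized Extreme Value distribution to an annual-maximum series of 24-hour precipitation, can be sketched as follows; the sample data are synthetic stand-ins rather than CESM output.

    ```python
    # Sketch: fit a GEV distribution to an annual-maximum precipitation series and
    # estimate a return level.  The "annual maxima" below are synthetic.
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    annual_max_mm = genextreme.rvs(c=-0.1, loc=40.0, scale=10.0, size=30, random_state=rng)

    shape, loc, scale = genextreme.fit(annual_max_mm)
    # 100-year return level: the quantile with annual exceedance probability 1/100.
    return_level_100yr = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
    print(f"GEV shape={shape:.2f}, loc={loc:.1f} mm, scale={scale:.1f} mm")
    print(f"estimated 100-year 24-h precipitation: {return_level_100yr:.1f} mm")
    ```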

  5. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithms for fertilization and P leaching, contributed the largest output uncertainties. In comparison, the initialization of inorganic P in the soil layer and the transformation algorithm between P pools are less influential on the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to the NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.

  6. A Data Driven Framework for Integrating Regional Climate Models

    NASA Astrophysics Data System (ADS)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate-sensitive issues of concern to decision-makers and policy planners at a regional level. Deciding how to allocate scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analysis of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high

  7. LAMMPS framework for dynamic bonding and an application modeling DNA

    NASA Astrophysics Data System (ADS)

    Svaneborg, Carsten

    2012-08-01

    We have extended the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) to support directional bonds and dynamic bonding. The framework supports stochastic formation of new bonds, breakage of existing bonds, and conversion between bond types. Bond formation can be controlled to limit the maximal functionality of a bead with respect to various bond types. Concomitant with the bond dynamics, angular and dihedral interactions are dynamically introduced between newly connected triplets and quartets of beads, where the interaction type is determined from the local pattern of bead and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework. Catalogue identifier: AEME_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEME_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 2 243 491 No. of bytes in distributed program, including test data, etc.: 771 Distribution format: tar.gz Programming language: C++ Computer: Single and multiple core servers Operating system: Linux/Unix/Windows Has the code been vectorized or parallelized?: Yes. The code has been parallelized by the use of MPI directives. RAM: 1 Gb Classification: 16.11, 16.12 Nature of problem: Simulating coarse-grain models capable of chemistry e.g. DNA hybridization dynamics. Solution method: Extending LAMMPS to handle dynamic bonding and directional bonds. Unusual features: Allows bonds to be created and broken while angular and dihedral interactions are kept consistent. Additional comments: The distribution file for this program is approximately 36 Mbytes and therefore is not delivered directly

  8. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  9. Legitimising data-driven models: exemplification of a new data-driven mechanistic modelling framework

    NASA Astrophysics Data System (ADS)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-07-01

    In this paper the difficult problem of how to legitimise data-driven hydrological models is addressed using an example of a simple artificial neural network modelling problem. Many data-driven models in hydrology have been criticised for their black-box characteristics, which prohibit adequate understanding of their mechanistic behaviour and restrict their wider heuristic value. In response, presented here is a new generic data-driven mechanistic modelling framework. The framework is significant because it incorporates an evaluation of the legitimacy of a data-driven model's internal modelling mechanism as a core element in the modelling process. The framework's value is demonstrated by two simple artificial neural network river forecasting scenarios. We develop a novel adaptation of first-order, partial-derivative relative sensitivity analysis to enable each model's mechanistic legitimacy to be evaluated within the framework. The results demonstrate the limitations of standard, goodness-of-fit validation procedures by highlighting how the internal mechanisms of complex models that produce the best-fit scores can have lower mechanistic legitimacy than simpler counterparts whose scores are only slightly inferior. Thus, our study directly tackles one of the key debates in data-driven hydrological modelling: is it acceptable for our ends (i.e. model fit) to justify our means (i.e. the numerical basis by which that fit is achieved)?
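
    One way to realise a first-order, partial-derivative relative sensitivity measure for a trained network is sketched below using central finite differences; the tiny fixed-weight network and the particular relative scaling are assumptions for illustration, not necessarily the adaptation developed by the authors.

    ```python
    # Sketch: first-order relative sensitivity of a model output to each input,
    # estimated by central finite differences on a stand-in one-hidden-layer network.
    import numpy as np

    W1 = np.array([[0.8, -0.4], [0.3, 0.9]])   # hidden-layer weights (illustrative)
    w2 = np.array([1.2, -0.7])                 # output weights (illustrative)

    def model(x):
        return float(w2 @ np.tanh(W1 @ x))

    def relative_sensitivity(x, eps=1e-4):
        y = model(x)
        sens = np.zeros_like(x)
        for i in range(len(x)):
            xp, xm = x.copy(), x.copy()
            xp[i] += eps
            xm[i] -= eps
            dy_dx = (model(xp) - model(xm)) / (2.0 * eps)
            sens[i] = dy_dx * x[i] / y         # relative (elasticity-style) scaling
        return sens

    print(relative_sensitivity(np.array([0.5, 1.0])))
    ```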

  10. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.

  11. A hierarchical modeling framework for multiple observer transect surveys.

    PubMed

    Conn, Paul B; Laake, Jeffrey L; Johnson, Devin S

    2012-01-01

    Ecologists often use multiple observer transect surveys to census animal populations. In addition to animal counts, these surveys produce sequences of detections and non-detections for each observer. When combined with additional data (i.e. covariates such as distance from the transect line), these sequences provide the additional information to estimate absolute abundance when detectability on the transect line is less than one. Although existing analysis approaches for such data have proven extremely useful, they have some limitations. For instance, it is difficult to extrapolate from observed areas to unobserved areas unless a rigorous sampling design is adhered to; it is also difficult to share information across spatial and temporal domains or to accommodate habitat-abundance relationships. In this paper, we introduce a hierarchical modeling framework for multiple observer line transects that removes these limitations. In particular, abundance intensities can be modeled as a function of habitat covariates, making it easier to extrapolate to unsampled areas. Our approach relies on a complete data representation of the state space, where unobserved animals and their covariates are modeled using a reversible jump Markov chain Monte Carlo algorithm. Observer detections are modeled via a bivariate normal distribution on the probit scale, with dependence induced by a distance-dependent correlation parameter. We illustrate performance of our approach with simulated data and on a known population of golf tees. In both cases, we show that our hierarchical modeling approach yields accurate inference about abundance and related parameters. In addition, we obtain accurate inference about population-level covariates (e.g. group size). We recommend that ecologists consider using hierarchical models when analyzing multiple-observer transect data, especially when it is difficult to rigorously follow pre-specified sampling designs. We provide a new R package, hierarchical
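
    The dependent double-observer detection model described (a bivariate normal on the probit scale with a distance-dependent correlation) can be sketched as follows; the linear predictors, coefficients, and the form of the correlation function are illustrative assumptions, not estimates from the paper.

    ```python
    # Sketch: joint detection probability for two observers under a bivariate probit
    # model with distance-dependent correlation (all coefficients illustrative).
    import numpy as np
    from scipy.stats import multivariate_normal, norm

    def detection_probs(distance, b0=1.0, b1=-1.5, rho_max=0.6):
        eta1 = b0 + b1 * distance            # probit-scale linear predictor, observer 1
        eta2 = b0 + b1 * distance            # observer 2 (same form for simplicity)
        rho = rho_max * np.exp(-distance)    # dependence assumed to decay with distance
        cov = np.array([[1.0, rho], [rho, 1.0]])
        # P(both detect) = P(Z1 < eta1, Z2 < eta2) for correlated standard normal latents.
        p_both = multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([eta1, eta2])
        p_obs1, p_obs2 = norm.cdf(eta1), norm.cdf(eta2)
        return p_both, p_obs1, p_obs2

    print(detection_probs(distance=0.3))
    ```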

  12. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing by uncovering models' metadata through BMI functions. After a BMI-enabled model is serviced, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI as well as providing a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. By using the revised EMELI, an example will be presented of integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
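
    The control pattern of driving a BMI-enabled component, whether local or behind a service, follows the standard BMI function names (initialize, update, get_value, finalize). The sketch below wraps these calls behind a hypothetical HTTP endpoint; the HttpBmiClient class, its endpoint layout, the config file name, and the variable name are illustrative stand-ins, not the actual service interface used in this work.

    ```python
    # Sketch: driving a BMI-enabled model behind a hypothetical HTTP service.
    # Only the BMI function names follow the BMI convention; the endpoints, config
    # file, and variable name are illustrative placeholders.
    import requests

    class HttpBmiClient:
        def __init__(self, base_url):
            self.base_url = base_url.rstrip("/")

        def initialize(self, config_file):
            requests.post(f"{self.base_url}/initialize", json={"config_file": config_file})

        def update(self):
            requests.post(f"{self.base_url}/update")

        def get_value(self, name):
            return requests.get(f"{self.base_url}/get_value", params={"name": name}).json()

        def finalize(self):
            requests.post(f"{self.base_url}/finalize")

    model = HttpBmiClient("http://localhost:8080/topoflow")      # hypothetical service URL
    model.initialize("example_basin.cfg")                        # hypothetical config file
    for _ in range(10):
        model.update()
    discharge = model.get_value("channel_water__discharge")      # illustrative variable name
    model.finalize()
    ```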

  13. The Pretzelosity Distribution Function and Intrinsic Motion of the Constituents in Nucleon

    SciTech Connect

    Efremov, A. V.; Teryaev, O. V.; Schweitzer, P.; Zavada, P.

    2009-08-04

    The pretzelosity distribution function $h_{1T}^{\perp}$ is studied in a covariant quark-parton model which describes the structure of the nucleon in terms of 3D quark intrinsic motion. This relativistic model framework supports the relation between helicity, transversity and pretzelosity observed in other relativistic models without assuming SU(6) spin-flavor symmetry. Numerical results and predictions for SIDIS experiments are presented.
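
    The relation among helicity, transversity and pretzelosity referred to above is commonly quoted, for each flavour and in terms of the first transverse moment of pretzelosity, in the generic form below; normalisation conventions for the moment differ between papers, so this should be read as the schematic model relation rather than the exact expression of this work.

    ```latex
    % Generic model relation between helicity g_1, transversity h_1, and the first
    % transverse moment of pretzelosity (conventions for the (1)-moment vary).
    \[
      h_{1T}^{\perp(1)\,q}(x) \;=\; g_1^{\,q}(x) \;-\; h_1^{\,q}(x)
    \]
    ```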

  14. a Framework for AN Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to grow enormously in the coming years, with an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, in the geospatial and IT arena as well as in political discussion and legislation, Open Source solutions, open data proliferation, and the use of open standards have increasing significance. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, and ASPRS, and by software vendors, e.g., Esri, Oracle, and Red Hat. They focus on different fields of expertise and have different levels and modes of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse body-of-knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and substantially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was involved in two ways. An online survey about the relevance of Open Source was performed and evaluated with 105

  15. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for internal modelling under the Risk-Based Capital framework make use of data in the form of a run-off triangle. The present research will instead extract, from a group of $n$ customers, the historical data for the sum insured $s_i$ of the $i$-th customer together with the amount paid $y_{ij}$ and the amount $a_{ij}$ reported but not yet paid in the $j$-th development year, for $j = 1, 2, 3, 4, 5, 6$. We model the future value $(y_{i,j+1}, a_{i,j+1})$ as dependent on the present-year value $(y_{ij}, a_{ij})$ and the sum insured $s_i$ via a conditional distribution which is derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on the distribution for the aggregate claim liabilities is found to provide good coverage of the observed aggregate claim liabilities.

  16. Investigating GPDs in the framework of the double distribution model

    NASA Astrophysics Data System (ADS)

    Nazari, F.; Mirjalili, A.

    2016-06-01

    In this paper, we construct the generalized parton distribution (GPD) in terms of the kinematical variables $x$, $\xi$, $t$, using the double distribution model. By employing these functions, we can extract quantities which make it possible to gain a three-dimensional insight into the nucleon structure at the parton level. The main objective of GPDs is to combine and generalize the concepts of ordinary parton distributions and form factors. They also provide an exclusive framework to describe the nucleons in terms of quarks and gluons. Here, we first calculate, in the Double Distribution model, the GPD based on the usual parton distributions arising from the GRV and CTEQ phenomenological models. Obtaining the quark and gluon angular momenta from the GPD, we would be able to calculate the scattering observables which are related to spin asymmetries of the produced quarkonium. These quantities are represented by $A_N$ and $A_{LS}$. We also calculate the Pauli and Dirac form factors in deeply virtual Compton scattering. Finally, in order to compare our results with the existing experimental data, we use the difference of the polarized cross-sections for an initial longitudinal leptonic beam and unpolarized target particles ($\Delta\sigma_{LU}$). In all cases, our obtained results are in good agreement with the available experimental data.
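
    For orientation, the double-distribution construction referred to here typically represents a GPD as a projection of a double distribution $F(\beta, \alpha, t)$ onto the line $x = \beta + \alpha\xi$ (a possible D-term contribution is omitted in this sketch):

    ```latex
    % Standard double-distribution representation of a GPD (D-term omitted).
    \[
      H(x,\xi,t) \;=\; \int_{-1}^{1}\!\mathrm{d}\beta \int_{-1+|\beta|}^{\,1-|\beta|}\!\mathrm{d}\alpha\;
      \delta\!\left(x-\beta-\alpha\xi\right)\, F(\beta,\alpha,t)
    \]
    ```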

  17. Evolution of Climate Science Modelling Language within international standards frameworks

    NASA Astrophysics Data System (ADS)

    Lowe, Dominic; Woolf, Andrew

    2010-05-01

    The Climate Science Modelling Language (CSML) was originally developed as part of the NERC Data Grid (NDG) project in the UK. It was one of the first Geography Markup Language (GML) application schemas describing complex feature types for the metocean domain. CSML feature types can be used to describe typical climate products such as model runs or atmospheric profiles. CSML has been successfully used within NDG to provide harmonised access to a number of different data sources. For example, meteorological observations held in heterogeneous databases by the British Atmospheric Data Centre (BADC) and Centre for Ecology and Hydrology (CEH) were served uniformly as CSML features via Web Feature Service. CSML has now been substantially revised to harmonise it with the latest developments in OGC and ISO conceptual modelling for geographic information. In particular, CSML is now aligned with the near-final ISO 19156 Observations & Measurements (O&M) standard. CSML combines the O&M concept of 'sampling features' together with an observation result based on the coverage model (ISO 19123). This general pattern is specialised for particular data types of interest, classified on the basis of sampling geometry and topology. In parallel work, the OGC Met Ocean Domain Working Group has established a conceptual modelling activity. This is a cross-organisational effort aimed at reaching consensus on a common core data model that could be re-used in a number of met-related application areas: operational meteorology, aviation meteorology, climate studies, and the research community. It is significant to note that this group has also identified sampling geometry and topology as a key classification axis for data types. Using the Model Driven Architecture (MDA) approach as adopted by INSPIRE we demonstrate how the CSML application schema is derived from a formal UML conceptual model based on the ISO TC211 framework. By employing MDA tools which map consistently between UML and GML we

  18. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    SciTech Connect

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO₂ capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO₂ separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO₂ capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO₂ capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO₂ capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO₂ capture systems have been integrated into the IECM-cs, along with models to estimate CO₂ transport and storage costs. The CO₂ control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO₂ control. The integrated model is applied to

  19. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  20. Models of Recognition, Repetition Priming, and Fluency: Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  1. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost-effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study, it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands-off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
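
    The licensing rules being compared can be illustrated with a toy daily abstraction calculator combining a hands-off flow with a rising-block rule; the thresholds, block boundaries, and rates below are invented for illustration and are not the River Itchen licence values.

    ```python
    # Sketch: permitted daily abstraction under a "hands-off flow" plus "rising block"
    # rule.  All thresholds and block rates are invented for illustration only.
    def permitted_abstraction(flow_m3s, hands_off_flow=2.0,
                              blocks=((3.0, 0.2), (5.0, 0.5), (float("inf"), 0.8))):
        """Return the permitted abstraction rate (m3/s) for a given river flow (m3/s).

        No abstraction is allowed at or below the hands-off flow; above it, the
        permitted rate rises in blocks as the river flow increases."""
        if flow_m3s <= hands_off_flow:
            return 0.0
        for upper_flow, rate in blocks:
            if flow_m3s <= upper_flow:
                return min(rate, flow_m3s - hands_off_flow)
        return blocks[-1][1]

    for q in (1.5, 2.5, 4.0, 8.0):
        print(f"flow {q} m3/s -> abstract {permitted_abstraction(q)} m3/s")
    ```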

  2. A modeling framework for potential induced degradation in PV modules

    NASA Astrophysics Data System (ADS)

    Bermel, Peter; Asadpour, Reza; Zhou, Chao; Alam, Muhammad A.

    2015-09-01

    Major sources of performance degradation and failure in glass-encapsulated PV modules include moisture-induced gridline corrosion, potential-induced degradation (PID) of the cell, and stress-induced busbar delamination. Recent studies have shown that PV modules operating in damp heat at -600 V are vulnerable to large amounts of degradation, potentially losing up to 90% of the original power output within 200 hours. To improve module reliability and restore power production in the presence of PID and other failure mechanisms, a fundamental rethinking of accelerated testing is needed. This in turn will require an improved understanding of technology choices made early in development that impact failures later. In this work, we present an integrated approach of modeling, characterization, and validation to address these problems. A hierarchical modeling framework will allow us to clarify the mechanisms of corrosion, PID, and delamination. We will employ a physics-based compact model of the cell, topology of the electrode interconnection, geometry of the packaging stack, and environmental operating conditions to predict the current, voltage, temperature, and stress distributions in PV modules correlated with the acceleration of specific degradation modes. A self-consistent solution will capture the essential complexity of the technology-specific acceleration of PID and other degradation mechanisms as a function of illumination, ambient temperature, and relative humidity. Initial results from our model include specific lifetime predictions suitable for direct comparison with indoor and outdoor experiments, which are qualitatively validated by prior work. This approach could play a significant role in developing novel accelerated lifetime tests.

  3. A trajectory generation framework for modeling spacecraft entry in MDAO

    NASA Astrophysics Data System (ADS)

    D'Souza, Sarah N.; Sarigul-Klijn, Nesrin

    2016-04-01

    In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.

  4. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved speed of the simulation compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We mainly focused on readily, commercially available particle sizes of 2 nm and 20 nm particles, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected has a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05 for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles

  5. Robust, Adaptive Functional Regression in Functional Mixed Model Framework

    PubMed Central

    Zhu, Hongxiao; Brown, Philip J.; Morris, Jeffrey S.

    2012-01-01

    Functional data are increasingly encountered in scientific studies, and their high dimensionality and complexity lead to many analytical challenges. Various methods for functional data analysis have been developed, including functional response regression methods that involve regression of a functional response on univariate/multivariate predictors with nonparametrically represented functional coefficients. In existing methods, however, the functional regression can be sensitive to outlying curves and outlying regions of curves, so is not robust. In this paper, we introduce a new Bayesian method, robust functional mixed models (R-FMM), for performing robust functional regression within the general functional mixed model framework, which includes multiple continuous or categorical predictors and random effect functions accommodating potential between-function correlation induced by the experimental design. The underlying model involves a hierarchical scale mixture model for the fixed effects, random effect and residual error functions. These modeling assumptions across curves result in robust nonparametric estimators of the fixed and random effect functions which down-weight outlying curves and regions of curves, and produce statistics that can be used to flag global and local outliers. These assumptions also lead to distributions across wavelet coefficients that have outstanding sparsity and adaptive shrinkage properties, with great flexibility for the data to determine the sparsity and the heaviness of the tails. Together with the down-weighting of outliers, these within-curve properties lead to fixed and random effect function estimates that appear in our simulations to be remarkably adaptive in their ability to remove spurious features yet retain true features of the functions. We have developed general code to implement this fully Bayesian method that is automatic, requiring the user to only provide the functional data and design matrices. It is efficient

  6. Digital Moon: A three-dimensional framework for lunar modeling

    NASA Astrophysics Data System (ADS)

    Paige, D. A.; Elphic, R. C.; Foote, E. J.; Meeker, S. R.; Siegler, M. A.; Vasavada, A. R.

    2009-12-01

    The Moon has a complex three-dimensional shape with significant large-scale and small-scale topographic relief. The Moon’s topography largely controls the distribution of incident solar radiation, as well as the scattered solar and infrared radiation fields. Topography also affects the Moon’s interaction with the space environment, its magnetic field, and the propagation of seismic waves. As more extensive and detailed lunar datasets become available, there is an increasing need to interpret and compare them with the results of physical models in a fully three-dimensional context. We have developed a three-dimensional framework for lunar modeling we call the Digital Moon. The goal of this work is to enable high fidelity physical modeling and visualization of the Moon in a parallel computing environment. The surface of the Moon is described by a continuous triangular mesh of arbitrary shape and spatial scale. For regions of limited geographic extent, it is convenient to employ meshes on a rectilinear grid. However for global-scale modeling, we employ a continuous geodesic gridding scheme (Teanby, 2008). Each element in the mesh surface is allowed to have a unique set of physical properties. Photon and particle interactions between mesh elements are modeled using efficient ray tracing algorithms. Heat, mass, photon and particle transfer within each mesh element are modeled in one dimension. Each compute node is assigned a portion of the mesh and collective interactions between elements are handled through network interfaces. We have used the model to calculate lunar surface and subsurface temperatures that can be compared directly with radiometric temperatures measured by the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter. The model includes realistic surface photometric functions based on goniometric measurements of lunar soil samples (Foote and Paige, 2009), and one-dimensional thermal models based on lunar remote sensing and Apollo

  7. Modelling competition and dispersal in a statistical phylogeographic framework.

    PubMed

    Ranjard, Louis; Welch, David; Paturel, Marie; Guindon, Stéphane

    2014-09-01

    Competition between organisms influences the processes governing the colonization of new habitats. As a consequence, species or populations arriving first at a suitable location may prevent secondary colonization. Although adaptation to environmental variables (e.g., temperature, altitude, etc.) is essential, the presence or absence of certain species at a particular location often depends on whether or not competing species co-occur. For example, competition is thought to play an important role in structuring mammalian community assembly. It can also explain spatial patterns of low genetic diversity following rapid colonization events or the "progression rule" displayed by phylogenies of species found on archipelagos. Despite the potential of competition to maintain populations in isolation, past quantitative analyses have largely ignored it because of the difficulty in designing adequate methods for assessing its impact. We present here a new model that integrates competition and dispersal into a Bayesian phylogeographic framework. Extensive simulations and analysis of real data show that our approach clearly outperforms the traditional Mantel test for detecting correlation between genetic and geographic distances. But most importantly, we demonstrate that competition can be detected with high sensitivity and specificity from the phylogenetic analysis of genetic variation in space. PMID:24929898

  8. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    USGS Publications Warehouse

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  9. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    - NASA's Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates.
    - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates.
    - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights.
    Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  10. 3D Geological Framework Models as a Teaching Aid for Geoscience

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Ward, E.; Geological ModelsTeaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts, as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three-dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience, including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of geosciences because they:
    ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions, which is a notoriously difficult geospatial skill to acquire.
    ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation.
    ● can be used either to teach geosciences to complete beginners or to add to experienced students' body of knowledge (whatever point that may be at).
    Models could therefore be packaged as a complete educational journey or students and tutors can select certain areas of the model

  11. Simulation-optimization framework for multi-season hybrid stochastic models

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K. P.

    2011-07-01

    A novel simulation-optimization framework is proposed that enables the automation of the hybrid stochastic modeling process for synthetic generation of multi-season streamflows. This framework aims to minimize the drudgery, judgment and subjectivity involved in the selection of the most appropriate hybrid stochastic model. It consists of a multi-objective optimization model as the driver and the hybrid multi-season stochastic streamflow generation model, the hybrid matched block bootstrap (HMABB), as the simulation engine. For the estimation of the hybrid model parameters, the proposed framework employs objective functions that aim to minimize the overall errors in the preservation of storage capacities at various demand levels, unlike the traditional approaches that are simulation based. Moreover, this framework yields a number of competent hybrid stochastic models in a single run of the simulation-optimization framework. The efficacy of the proposed simulation-optimization framework is brought out through application to two monthly streamflow data sets from the USA of varying sample sizes that exhibit multi-modality and a complex dependence structure. The results show that the hybrid models obtained from the proposed framework are able to preserve the statistical characteristics as well as the storage characteristics better than the simulation-based HMABB model, while minimizing the manual effort and the subjectivity involved in the modeling process. The proposed framework can be easily extended to model multi-site multi-season streamflow data.
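
    As an illustration of the driver/engine split described above, the toy loop below couples a placeholder stochastic generator (standing in for HMABB) to a storage-capacity objective and a crude random search (standing in for the multi-objective optimizer). All functions and numbers are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      observed = rng.gamma(2.0, 50.0, size=(50, 12))        # synthetic stand-in for a monthly flow record

      def storage_capacity(flows, demand_fraction):
          """Sequent-peak storage needed to meet a constant demand level."""
          demand, deficit, worst = demand_fraction * flows.mean(), 0.0, 0.0
          for q in flows.ravel():
              deficit = max(0.0, deficit + demand - q)
              worst = max(worst, deficit)
          return worst

      def simulate(params, n_years=50):
          """Placeholder generator (the real engine is the hybrid matched block bootstrap)."""
          shape, scale = params
          return rng.gamma(shape, scale, size=(n_years, 12))

      def objective(params, demands=(0.5, 0.7, 0.9)):
          """Overall error in preserved storage capacities across demand levels."""
          synth = simulate(params)
          return sum(abs(storage_capacity(synth, d) - storage_capacity(observed, d)) for d in demands)

      candidates = [rng.uniform([1.0, 20.0], [4.0, 80.0]) for _ in range(200)]
      best = min(candidates, key=objective)                 # stand-in for the multi-objective driver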

  12. Legitimising neural network river forecasting models: a new data-driven mechanistic modelling framework

    NASA Astrophysics Data System (ADS)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-01-01

    In this paper we address the difficult problem of gaining an internal, mechanistic understanding of a neural network river forecasting (NNRF) model. Neural network models in hydrology have long been criticised for their black-box character, which prohibits adequate understanding of their modelling mechanisms and has limited their broad acceptance by hydrologists. In response, we here present a new, data-driven mechanistic modelling (DDMM) framework that incorporates an evaluation of the legitimacy of a neural network's internal modelling mechanism as a core element in the model development process. The framework is exemplified for two NNRF modelling scenarios, and uses a novel adaptation of first-order, partial-derivative relative sensitivity analysis methods as the means by which each model's mechanistic legitimacy is explored. The results demonstrate the limitations of standard, goodness-of-fit validation procedures applied by NNRF modellers, by highlighting how the internal mechanisms of complex models that produce the best fit scores can have much lower legitimacy than simpler counterparts whose scores are only slightly inferior. The study emphasises the urgent need for better mechanistic understanding of neural network-based hydrological models and the further development of methods for elucidating their mechanisms.
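
    The core diagnostic — a first-order, partial-derivative relative sensitivity of the model output to each input — can be sketched generically as below. The toy network and input values are placeholders for a trained NNRF model, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(1)
      W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)   # toy "trained" NNRF weights
      W2, b2 = rng.normal(size=4), 0.1

      def nnrf(x):
          """Toy one-hidden-layer network mapping 3 inputs (e.g. lagged rainfall/flow) to flow."""
          return float(W2 @ np.tanh(W1 @ x + b1) + b2)

      def relative_sensitivities(model, x, eps=1e-4):
          """RS_i = (dy/dx_i) * (x_i / y), estimated by central finite differences."""
          y = model(x)
          rs = []
          for i in range(len(x)):
              xp, xm = x.copy(), x.copy()
              xp[i] += eps
              xm[i] -= eps
              dydx = (model(xp) - model(xm)) / (2 * eps)
              rs.append(dydx * x[i] / y)
          return np.array(rs)

      x = np.array([12.0, 8.0, 35.0])          # example input vector (hypothetical units)
      print(relative_sensitivities(nnrf, x))   # inspected across the hydrograph to judge legitimacy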

  13. Devising a New Model-Driven Framework for Developing GUI for Enterprise Applications

    NASA Astrophysics Data System (ADS)

    Akiki, Pierre

    The main goal of this chapter is to demonstrate the design and development of a GUI framework that is model driven and is not directly linked to one presentation technology or any specific presentation subsystem of a certain programming language. This framework will allow us to create graphical user interfaces that are not only dynamically customizable but also multilingual. To demonstrate this new concept, we design in this chapter a new framework called Customizable Enterprise Data Administrator (CEDAR). Additionally, we build a prototype of this framework and a technology-dependent engine that transforms the output of our framework into a known presentation technology.

  14. Subsurface and Surface Characterization using an Information Framework Model

    NASA Astrophysics Data System (ADS)

    Samuel-Ojo, Olusola

    Groundwater plays a critical dual role as a reservoir of fresh water for human consumption and as a cause of the most severe problems when dealing with construction works below the water table. This is why it is critical to monitor groundwater recharge, distribution, and discharge on a continuous basis. The conventional method of monitoring groundwater employs a network of sparsely distributed monitoring wells and it is laborious, expensive, and intrusive. The problem of sparse data and undersampling reduces the accuracy of sampled survey data giving rise to poor interpretation. This dissertation addresses this problem by investigating groundwater-deformation response in order to augment the conventional method. A blend of three research methods was employed, namely design science research, geological methods, and geophysical methods, to examine whether persistent scatterer interferometry, a remote sensing technique, might augment conventional groundwater monitoring. Observation data (including phase information for displacement deformation from permanent scatterer interferometric synthetic aperture radar and depth to groundwater data) was obtained from the Water District, Santa Clara Valley, California. An information framework model was built and applied, and then evaluated. Data was preprocessed and decomposed into five components or parts: trend, seasonality, low frequency, high frequency and octave bandwidth. Digital elevation models of observed and predicted hydraulic head were produced, illustrating the piezometric or potentiometric surface. The potentiometric surface characterizes the regional aquifer of the valley showing areal variation of rate of percolation, velocity and permeability, and completely defines flow direction, advising characteristics and design levels. The findings show a geologic forcing phenomenon which explains in part the long-term deformation behavior of the valley, characterized by poroelastic, viscoelastic, elastoplastic and
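
    A hypothetical illustration of the kind of decomposition mentioned above (trend, seasonality, and lower/higher-frequency parts of the residual), applied to a synthetic monthly series, is sketched below; it is not the dissertation's processing chain.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.arange(240)                                   # 20 years of monthly samples
      series = 0.02 * t + np.sin(2 * np.pi * t / 12) + 0.3 * rng.normal(size=t.size)

      trend = np.convolve(series, np.ones(13) / 13, mode="same")          # centred moving average
      detrended = series - trend
      seasonal = np.tile([detrended[m::12].mean() for m in range(12)], t.size // 12)
      residual = detrended - seasonal

      spec = np.fft.rfft(residual)
      freqs = np.fft.rfftfreq(t.size, d=1.0)               # cycles per month
      low = np.fft.irfft(np.where(freqs < 1 / 24, spec, 0), n=t.size)     # periods longer than 2 years
      high = residual - low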

  15. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background: Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results: The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard
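
    The general metamodel-plus-zooming idea can be sketched as below: simulate a design of parameter points, fit a cheap statistical metamodel, use it to look up candidates close to the measured data, and shrink the parameter box around them. The simulator and quadratic metamodel here are toy stand-ins for the cardiac mechanics model and the multivariate metamodelling used in the paper.

      import numpy as np

      rng = np.random.default_rng(3)
      measured = 1.7                                        # hypothetical measured output

      def simulator(p):
          """Placeholder deterministic model: two parameters -> one output."""
          return p[0] * np.exp(-p[1]) + 0.1 * p[0] * p[1]

      def features(P):
          """Quadratic metamodel basis (intercept, linear, squared and interaction terms)."""
          return np.column_stack([np.ones(len(P)), P, P**2, P.prod(axis=1)])

      lo, hi = np.array([0.1, 0.1]), np.array([5.0, 3.0])   # initial parameter box
      for it in range(4):
          design = rng.uniform(lo, hi, size=(60, 2))        # new experimental design in the current box
          outputs = np.array([simulator(p) for p in design])
          coef, *_ = np.linalg.lstsq(features(design), outputs, rcond=None)   # metamodel fit
          candidates = rng.uniform(lo, hi, size=(5000, 2))
          pred = features(candidates) @ coef                # cheap metamodel "look-up"
          keep = candidates[np.abs(pred - measured) < 0.05]
          if len(keep) > 1:
              lo, hi = keep.min(axis=0), keep.max(axis=0)   # zoom toward the data-compatible region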

  16. Linking Tectonics and Surface Processes through SNAC-CHILD Coupling: Preliminary Results Towards Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Choi, E.; Kelbert, A.; Peckham, S. D.

    2014-12-01

    We demonstrate that code coupling can be an efficient and flexible method for modeling complicated two-way interactions between tectonic and surface processes with SNAC-CHILD coupling as an example. SNAC is a deep earth process model (a geodynamic/tectonics model), built upon a scientific software framework called StGermain and also compatible with a model coupling framework called Pyre. CHILD is a popular surface process model (a landscape evolution model), interfaced to the CSDMS (Community Surface Dynamics Modeling System) modeling framework. We first present proof-of-concept but non-trivial results from a simplistic coupling scheme. We then report progress towards augmenting SNAC with a Basic Model Interface (BMI), a framework-agnostic standard interface developed by CSDMS that uses the CSDMS Standard Names as controlled vocabulary for model communication across domains. Newly interfaced to BMI, SNAC will be easily coupled with CHILD as well as other BMI-compatible models. In broader context, this work will test BMI as a general and easy-to-implement mechanism for sharing models between modeling frameworks and is a part of the NSF-funded EarthCube Building Blocks project, "Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks."

  17. A Framework of Operating Models for Interdisciplinary Research Programs in Clinical Service Organizations

    ERIC Educational Resources Information Center

    King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette

    2008-01-01

    A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…

  18. GULF OF MEXICO HYPOXIA MONITORING AND MODELING FRAMEWORK

    EPA Science Inventory

    The USEPA ORD in partnership with the Gulf of Mexico Program Office, the Office of Water and Regions 4 and 6 have developed and implemented plans for a framework that will help guide the science needed to address the hypoxia problem in the Gulf of Mexico. ORD's Gulf Ecology Divis...

  19. A unified framework for modeling landscape evolution by discrete flows

    NASA Astrophysics Data System (ADS)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts those in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.

  20. The Foundations of Learning Framework: A Model for School Readiness

    ERIC Educational Resources Information Center

    Sorrels, Barbara

    2012-01-01

    Since the National Education Goals Panel was convened in 1991, school readiness for all children has remained a high priority across our nation. The Foundations of Learning Framework is a tool to understand what it means for a child to be "ready." Preparation for educational success requires two key ingredients--relationships and play. In the…

  1. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    ERIC Educational Resources Information Center

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  2. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    PubMed

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to develop high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from a UML requirements analysis model so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve the traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department. PMID:23565356

  3. System modeling with the DISC framework: evidence from safety-critical domains.

    PubMed

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice. PMID:22317179

  4. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  5. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
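
    A minimal sketch of such a standardized interface is shown below, loosely modelled on the CSDMS Basic Model Interface; the method names and the toy coupler are illustrative only, and a full BMI also exposes grid, variable-type and time metadata.

      class StandardModelInterface:
          # --- control functions: the framework, not the model, drives the run ---
          def initialize(self, config_file: str) -> None: ...
          def update(self) -> None: ...                    # advance state variables one time step
          def finalize(self) -> None: ...

          # --- self-description functions: standardized metadata for mediation ---
          def get_input_var_names(self) -> tuple: ...
          def get_output_var_names(self) -> tuple: ...
          def get_var_units(self, name: str) -> str: ...
          def get_time_step(self) -> float: ...
          def get_value(self, name: str): ...
          def set_value(self, name: str, value) -> None: ...

      def couple(time_end, model_a, model_b, shared_var):
          """Toy coupler: concrete components implement the interface above; the framework
          queries metadata, mediates units/grids (omitted here) and exchanges one variable."""
          model_a.initialize("a.cfg")
          model_b.initialize("b.cfg")
          t = 0.0
          while t < time_end:
              model_a.update()
              model_b.set_value(shared_var, model_a.get_value(shared_var))
              model_b.update()
              t += model_a.get_time_step()
          model_a.finalize()
          model_b.finalize()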

  6. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    SciTech Connect

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  7. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    SciTech Connect

    David Muth

    2012-07-01

    Agricultural residues are the largest near-term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention, and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However, the current integrated model does not quantitatively assess the soil carbon and long-term crop yield impacts of residue removal. Furthermore, the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily nitrogen gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with

  8. Model Curriculum Standards, Program Framework, and Process Guide for Industrial and Technology Education in California.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento. Div. of Career-Vocational Education.

    This three-section document contains the model curriculum standards, program framework, and process guide that will assist schools in California in providing career-vocational education programs that are responsive to a world marketplace characterized by constantly changing technology. The standards and frameworks can be implemented to provide a…

  9. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    ERIC Educational Resources Information Center

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  10. Linear models of coregionalization for multivariate lattice data: a general framework for coregionalized multivariate CAR models.

    PubMed

    MacNab, Ying C

    2016-09-20

    We present a general coregionalization framework for developing coregionalized multivariate Gaussian conditional autoregressive (cMCAR) models for Bayesian analysis of multivariate lattice data in general and multivariate disease mapping data in particular. This framework is inclusive of cMCARs that facilitate flexible modelling of spatially structured symmetric or asymmetric cross-variable local interactions, allowing a wide range of separable or non-separable covariance structures, and symmetric or asymmetric cross-covariances, to be modelled. We present a brief overview of established univariate Gaussian conditional autoregressive (CAR) models for univariate lattice data and develop coregionalized multivariate extensions. Classes of cMCARs are presented by formulating precision structures. The resulting conditional properties of the multivariate spatial models are established, which cast new light on cMCARs with richly structured covariances and cross-covariances of different spatial ranges. The related methods are illustrated via an in-depth Bayesian analysis of a Minnesota county-level cancer data set. We also bring a new dimension to the traditional enterprise of Bayesian disease mapping: estimating and mapping covariances and cross-covariances of the underlying disease risks. Maps of covariances and cross-covariances bring to light spatial characterizations of the cMCARs and inform on spatial risk associations between areas and diseases. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27091685
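
    For orientation, one common linear-model-of-coregionalization construction of a multivariate CAR prior is sketched below in LaTeX; it illustrates the general idea only, since the paper's cMCAR classes are broader and also allow asymmetric cross-variable interactions.

      % u_j are independent proper univariate CAR fields over n areas, W is the adjacency
      % matrix, D = diag(neighbour counts), and A is a full-rank p x p coregionalization
      % matrix linking the p diseases (illustrative construction only).
      u_j \sim \mathcal{N}\!\big(0,\,[\tau_j (D - \rho_j W)]^{-1}\big), \quad j = 1,\dots,p,
      \qquad \phi = (A \otimes I_n)\, u ,
      \quad\Longrightarrow\quad
      Q_{\phi} = (A^{-\top} \otimes I_n)\,
                 \mathrm{blockdiag}\big(\tau_1 (D - \rho_1 W), \dots, \tau_p (D - \rho_p W)\big)\,
                 (A^{-1} \otimes I_n).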

  11. Assessing Students' Understandings of Biological Models and their Use in Science to Evaluate a Theoretical Framework

    NASA Astrophysics Data System (ADS)

    Grünkorn, Juliane; Belzen, Annette Upmeier zu; Krüger, Dirk

    2014-07-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation). Therefore, the purpose of this article is to present the results of an empirical evaluation of a conjoint theoretical framework. The theoretical framework integrates relevant research findings and comprises five aspects which are subdivided into three levels each: nature of models, multiple models, purpose of models, testing, and changing models. The study was conducted with a sample of 1,177 seventh to tenth graders (aged 11-19 years) using open-ended items. The data were analysed by identifying students' understandings of models (nature of models and multiple models) and their use in science (purpose of models, testing, and changing models), and comparing as well as assigning them to the content of the theoretical framework. A comprehensive category system of students' understandings was thus developed. Regarding the empirical evaluation, the students' understandings of the nature and the purpose of models were sufficiently described by the theoretical framework. Concerning the understandings of multiple, testing, and changing models, additional initial understandings (only one model possible, no testing of models, and no change of models) need to be considered. This conjoint and now empirically tested framework for students' understandings can provide a common basis for future science education research. Furthermore, evidence-based indications can be provided for teachers and their instructional practice.

  12. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  13. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  14. Temporo-spatial model construction using the MML and software framework.

    PubMed

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

    Development of complex temporo-spatial biological computational models can be a time consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries that may require considerable investment in time to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework, a modular XML/HDF5-based specification and set of toolkits, aims to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au. PMID:21947514

  15. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

    A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation, or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios from the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  16. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  17. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    SciTech Connect

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site-scale SZ flow model, the HFM

  18. A Framework for Multifaceted Evaluation of Student Models

    ERIC Educational Resources Information Center

    Huang, Yun; González-Brenes, José P.; Kumar, Rohit; Brusilovsky, Peter

    2015-01-01

    Latent variable models, such as the popular Knowledge Tracing method, are often used to enable adaptive tutoring systems to personalize education. However, finding optimal model parameters is usually a difficult non-convex optimization problem when considering latent variable models. Prior work has reported that latent variable models obtained…

  19. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of model changes from one to another, all functions of a search technique must be reimplemented, even if the same search technique is applied, because the model types differ. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
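
    The decoupling the abstract argues for can be pictured as below: the search technique only sees an abstract problem interface, so swapping the model type means writing a new adapter rather than reimplementing the search. This is an illustrative sketch, not the paper's framework; all names are hypothetical.

      import random
      from abc import ABC, abstractmethod

      class TestGenerationProblem(ABC):
          """Model-specific adapter: encodes a candidate test and scores it against the model."""
          @abstractmethod
          def random_candidate(self): ...
          @abstractmethod
          def neighbour(self, candidate): ...
          @abstractmethod
          def fitness(self, candidate) -> float: ...      # higher = closer to covering the target

      def hill_climb(problem: TestGenerationProblem, iterations=1000):
          """Generic search technique, reusable for any model type behind the interface."""
          best = problem.random_candidate()
          for _ in range(iterations):
              cand = problem.neighbour(best)
              if problem.fitness(cand) >= problem.fitness(best):
                  best = cand
          return best

      class NumericInputProblem(TestGenerationProblem):
          """Toy adapter: find an input whose squared value approaches a branch threshold."""
          def random_candidate(self): return random.uniform(-100, 100)
          def neighbour(self, c): return c + random.uniform(-1, 1)
          def fitness(self, c): return -abs(c * c - 42)   # branch-distance-style objective

      print(hill_climb(NumericInputProblem()))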

  20. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.

  1. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  2. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  3. Deep inelastic phenomena

    SciTech Connect

    Prescott, C.Y.

    1980-10-01

    Nucleon structure as seen in the context of deep inelastic scattering is discussed. The lectures begin with consideration of the quark-parton model. The model forms the basis of understanding lepton-nucleon inelastic scattering. As improved data on lepton-nucleon scattering at high energies became available, the quark-parton model failed to explain some crucial features of these data. At approximately the same time a candidate theory of strong interactions based on an SU(3) gauge theory of color was being discussed in the literature, and new ideas on the explanation of inelastic scattering data became popular. A new theory of strong interactions, now called quantum chromodynamics, provides a new framework for understanding the data, with a much stronger theoretical foundation, and seems to explain well the features of the data. The lectures conclude with a look at some recent experiments which provide new data at very high energies. These lectures are concerned primarily with charged lepton inelastic scattering and to a lesser extent with neutrino results. Furthermore, due to time and space limitations, topics such as final-state hadron studies and multi-muon production are omitted here. The lectures concentrate on the more central issues: the quark-parton model and the concepts of scaling and scale breaking, the ideas of quantum chromodynamics, the Q2 dependence of the structure functions, moments, and the important parameter R.
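
    For reference, the leading-order quark-parton-model relations behind the quantities listed at the end of the abstract (standard textbook forms) are:

      % Structure functions in the naive quark-parton model and the ratio R:
      F_2(x) = \sum_q e_q^2\, x\, \big[ q(x) + \bar{q}(x) \big], \qquad
      2 x F_1(x) = F_2(x) \quad \text{(Callan--Gross, spin-1/2 partons)},
      \qquad
      R \equiv \frac{\sigma_L}{\sigma_T}
             = \frac{(1 + 4 M^2 x^2 / Q^2)\, F_2(x,Q^2) - 2 x F_1(x,Q^2)}{2 x F_1(x,Q^2)} .

    Exact Bjorken scaling (no Q2 dependence of the structure functions) and a vanishing R at large Q2 are the parton-model limits; QCD modifies both, introducing the logarithmic Q2 dependence of the structure functions and their moments discussed in the lectures.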

  4. Landscape - Soilscape Modelling: Proposed framework for a model comparison benchmarking exercise, who wants to join?

    NASA Astrophysics Data System (ADS)

    Schoorl, Jeroen M.; Jetten, Victor G.; Coulthard, Thomas J.; Hancock, Greg R.; Renschler, Chris S.; Irvine, Brian J.; Cerdan, Olivier; Kirkby, Mike J.; (A) Veldkamp, Tom

    2014-05-01

    Current landscape - soilscape modelling frameworks are developed under a wide range of spatial and temporal resolutions and extents, from the so-called event-based models and soil erosion models to the landscape evolution models. In addition, these models are based on different assumptions, include variable and different process descriptions and produce different outcomes. Consequently, the models often need specific input data and their development and calibration is best linked to a specific area and local conditions. Model validation is often limited and restricted to the shorter time scales and single events. A first workshop on catchment-based modelling (six event-based models were challenged then) was organised in the late 1990s and the results led to some excellent discussions on predictive modelling and equifinality, and to a special issue in Catena. It is time for a similar exercise: new models have been made, older models have been updated, and judging from the literature there is a lot more experience in calibration/validation and reflections on processes observed in the field and how these should be simulated. In addition, there are new data sources, such as high resolution remote sensing (including DEMs), new pattern analysis and comparison techniques, and continuous developments and results in dating sediment archives and erosion rates. The main goal of this renewed exercise will be to come up with a benchmarking methodology for comparing and judging model behaviour, including the issues of upscaling and downscaling of results. Model comparison may lead to the development of new research questions and to a firmer understanding of different models' performance under different circumstances.

  5. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provides a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  6. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  7. A flexible and efficient multi-model framework in support of water management

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Tran Quoc, Quan; Willems, Patrick

    2016-05-01

    Flexible, fast and accurate water quantity models are essential tools in support of water management. Adjustable levels of model detail and the ability to handle varying spatial and temporal resolutions are requisite model characteristics to ensure that such models can be employed efficiently in various applications. This paper uses a newly developed flexible modelling framework that aims to generate such models. The framework incorporates several approaches to model catchment hydrology, rivers and floodplains, and the urban drainage system by lumping processes on different levels. To illustrate this framework, a case study of integrated hydrological-hydraulic modelling is elaborated for the Grote Nete catchment in Belgium. Three conceptual rainfall-runoff models (NAM, PDM and VHM) were implemented in a generalized model structure, allowing flexibility in the spatial resolution by means of an innovative disaggregation/aggregation procedure. They were linked to conceptual hydraulic models of the rivers in the catchment, which were developed by means of an advanced model structure identification and calibration procedure. The conceptual models manage to emulate the simulation results of a detailed full hydrodynamic model accurately. The models configured using the approaches of this framework are well-suited for many applications in water management due to their very short calculation time, interfacing possibilities and adjustable level of detail.
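
    A minimal sketch of the kind of lumped conceptual building block such frameworks chain together — a linear-reservoir store driven by a crude runoff coefficient and followed by simple routing — is given below. NAM, PDM, VHM and the conceptual hydraulic river models in the study are, of course, considerably richer; the numbers here are invented.

      import numpy as np

      def linear_reservoir(inflow, k, dt=1.0, s0=0.0):
          """Storage dS/dt = I - S/k with outflow Q = S/k (consistent time units assumed)."""
          storage, outflow = s0, []
          for i in inflow:
              storage += dt * (i - storage / k)
              outflow.append(storage / k)
          return np.array(outflow)

      rain = np.zeros(120)
      rain[10:20] = 5.0                                    # a 10-step storm pulse [mm/step]
      runoff = 0.4 * rain                                  # crude runoff coefficient (soil store omitted)
      river_flow = linear_reservoir(linear_reservoir(runoff, k=6.0), k=15.0)   # hillslope then channel routing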

  8. CONCEPTUAL FRAMEWORK FOR REGRESSION MODELING OF GROUND-WATER FLOW.

    USGS Publications Warehouse

    Cooley, Richard L.

    1985-01-01

    The author examines the uses of ground-water flow models and which classes of use require treatment of stochastic components. He then compares traditional and stochastic procedures for modeling actual (as distinguished from hypothetical) systems. Finally, he examines the conceptual basis and characteristics of the regression approach to modeling ground-water flow.

  9. An Interactive Reference Framework for Modeling a Dynamic Immune System

    PubMed Central

    Spitzer, Matthew H.; Gherardini, Pier Federico; Fragiadakis, Gabriela K.; Bhattacharya, Nupur; Yuan, Robert T.; Hotson, Andrew N.; Finck, Rachel; Carmi, Yaron; Zunder, Eli R.; Fantl, Wendy J.; Bendall, Sean C.; Engleman, Edgar G.; Nolan, Garry P.

    2015-01-01

    Immune cells function in an interacting hierarchy that coordinates activities of various cell types according to genetic and environmental contexts. We developed graphical approaches to construct an extensible immune reference map from mass cytometry data of cells from different organs, incorporating landmark cell populations as flags on the map to compare cells from distinct samples. The maps recapitulated canonical cellular phenotypes and revealed reproducible, tissue-specific deviations. The approach revealed influences of genetic variation and circadian rhythms on immune system structure, enabled direct comparisons of murine and human blood cell phenotypes, and even enabled archival fluorescence-based flow cytometry data to be mapped onto the reference framework. This foundational reference map provides a working definition of systemic immune organization to which new data can be integrated to reveal deviations driven by genetics, environment, or pathology. PMID:26160952

  10. BioASF: a framework for automatically generating executable pathway models specified in BioPAX

    PubMed Central

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K. Anton; Abeln, Sanne; Heringa, Jaap

    2016-01-01

    Motivation: Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. Results: To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. Availability and Implementation: The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl PMID:27307645

  11. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models; parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (i.e. high explosive, rocket propellant,...) is then presented using a directed graph model.
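
    The "response curve" approach, for example, can be pictured as tabulating the deterministic model once over its input range and interpolating it inside the probabilistic (Monte Carlo) model, instead of binning the parameter range. The sketch below uses a placeholder physics function and invented numbers, not the paper's models.

      import numpy as np

      def deterministic_model(drop_height_m):
          """Stand-in physics model: peak shock energy delivered to the item (arbitrary units)."""
          return 3.0 * drop_height_m ** 1.5

      heights = np.linspace(0.0, 5.0, 21)
      response_curve = deterministic_model(heights)        # precomputed once, reused many times

      rng = np.random.default_rng(4)
      sampled_heights = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)   # uncertain handling drop height
      energy = np.interp(sampled_heights, heights, response_curve)         # clamps beyond the table range
      p_reaction = np.mean(energy > 20.0)                  # probability an energetic threshold is exceeded
      print(f"P(reaction) ~ {p_reaction:.4f}")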

  12. Multiple-species analysis of point count data: A more parsimonious modelling framework

    USGS Publications Warehouse

    Alldredge, M.W.; Pollock, K.H.; Simons, T.R.; Shriner, S.A.

    2007-01-01

    1. Although population surveys often provide information on multiple species, these data are rarely analysed within a multiple-species framework despite the potential for more efficient estimation of population parameters. 2. We have developed a multiple-species modelling framework that uses similarities in capture/detection processes among species to model multiple species data more parsimoniously. We present examples of this approach applied to distance, time of detection and multiple observer sampling for avian point count data. 3. Models that included species as a covariate and individual species effects were generally selected as the best models for distance sampling, but group models without species effects performed best for the time of detection and multiple observer methods. Population estimates were more precise for no-species-effect models than for species-effect models, demonstrating the benefits of exploiting species' similarities when modelling multiple species data. Partial species-effect models and additive models were also useful because they modelled similarities among species while allowing for species differences. 4. Synthesis and applications. We recommend the adoption of multiple-species modelling because of its potential for improved population estimates. This framework will be particularly beneficial for modelling count data from rare species because information on the detection process can be 'borrowed' from more common species. The multiple-species modelling framework presented here is applicable to a wide range of sampling techniques and taxa. © 2007 The Authors.

  13. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

    Recently a smallholder socio-hydrological modelling framework was proposed and deployed to understand the underlying dynamics of the Agrarian Crisis in Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further expands the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as Paddy, Jowar and Soyabean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork 50 smallholders will be interviewed, in which the socio-hydrological assumptions on the hydrology and capital equations, and the corresponding closure relationships incorporated in the current model, will be put to the test. Besides testing these assumptions, the questionnaires will be used to better understand the hydrological reality of the farm holders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step guiding subsequent research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.

  14. Comparing droplet activation parameterisations against adiabatic parcel models using a novel inverse modelling framework

    NASA Astrophysics Data System (ADS)

    Partridge, Daniel; Morales, Ricardo; Stier, Philip

    2015-04-01

    Many previous studies have compared droplet activation parameterisations against adiabatic parcel models (e.g. Ghan et al., 2001). However, these comparisons have often covered only a limited number of parameter combinations based upon certain aerosol regimes. Recent studies (Morales et al., 2014) have used wider ranges when evaluating their parameterisations; however, no study has explored the full multi-dimensional parameter space that droplet activation would experience within a global climate model (GCM). It is important to be able to efficiently highlight regions of the entire multi-dimensional parameter space in which we can expect the largest discrepancy between parameterisation and cloud parcel models, in order to ascertain which regions simulated by a GCM can be expected to be a less accurate representation of the process of cloud droplet activation. This study provides a new, efficient, inverse modelling framework for comparing droplet activation parameterisations to more complex cloud parcel models. To achieve this we couple a Markov Chain Monte Carlo algorithm (Partridge et al., 2012) to two independent adiabatic cloud parcel models and four droplet activation parameterisations. This framework is computationally faster than a brute-force Monte Carlo simulation, and allows us to transparently highlight which parameterisation provides the closest representation across all aerosol physicochemical and meteorological environments. The parameterisations are demonstrated to perform well for a large proportion of possible parameter combinations; however, for certain key parameters, most notably the vertical velocity and accumulation-mode aerosol concentration, large discrepancies are highlighted. These discrepancies correspond to parameter combinations that result in very high/low simulated values of maximum supersaturation. By identifying parameter interactions or regimes within the multi-dimensional parameter space we hope to guide
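
    The record does not include the coupling code itself; the toy Metropolis sampler below only illustrates the general idea of inverse modelling for discrepancy hunting, with two invented stand-in functions playing the roles of the parcel model and the parameterisation, and invented parameter bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def reference_model(w, n):        # stand-in for an adiabatic parcel model
    return w / (w + 1e-4 * n)

def parameterisation(w, n):       # stand-in for a droplet activation parameterisation
    return w / (w + 1.3e-4 * n)

def discrepancy(theta):
    w, n = theta
    if not (0.01 <= w <= 10.0 and 10.0 <= n <= 5000.0):
        return 0.0                # reject proposals outside the admissible parameter space
    return (reference_model(w, n) - parameterisation(w, n)) ** 2

# Metropolis random walk whose target density is proportional to the squared discrepancy,
# so the chain concentrates where the two models disagree most.
theta = np.array([1.0, 500.0])    # (vertical velocity m/s, aerosol number cm^-3)
samples = []
for _ in range(50_000):
    proposal = theta + rng.normal(0.0, [0.2, 100.0])
    if rng.uniform() < discrepancy(proposal) / max(discrepancy(theta), 1e-30):
        theta = proposal
    samples.append(theta.copy())

print("mean of visited parameters (region of largest disagreement):",
      np.array(samples).mean(axis=0))
```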

  15. Integrated Bayesian network framework for modeling complex ecological issues.

    PubMed

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, and that embraces iterative development

  16. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    NASA Astrophysics Data System (ADS)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
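
    MERGANSER's actual Solr core, schema and field names are not given in this record, so the snippet below is purely a hypothetical illustration of the general pattern described: querying a Solr select handler with a Lucene-style query over HTTP from Python.

```python
import requests

# Hypothetical Solr core and field names -- the MERGANSER schema is not given in the record.
SOLR_URL = "http://localhost:8983/solr/occurrences/select"

params = {
    "q": 'species:"Haliaeetus leucocephalus"',   # Lucene-style field query
    "fq": "state:Vermont",                       # filter query narrowing the region of interest
    "rows": 100,
    "wt": "json",
}

response = requests.get(SOLR_URL, params=params, timeout=30)
response.raise_for_status()
docs = response.json()["response"]["docs"]
print(f"retrieved {len(docs)} occurrence records")
```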

  17. Towards a Framework for Modeling Space Systems Architectures

    NASA Technical Reports Server (NTRS)

    Shames, Peter; Skipper, Joseph

    2006-01-01

    Topics covered include: 1) Statement of the problem: a) Space system architecture is complex; b) Existing terrestrial approaches must be adapted for space; c) Need a common architecture methodology and information model; d) Need appropriate set of viewpoints. 2) Requirements on a space systems model. 3) Model Based Engineering and Design (MBED) project: a) Evaluated different methods; b) Adapted and utilized RASDS & RM-ODP; c) Identified useful set of viewpoints; d) Did actual model exchanges among selected subset of tools. 4) Lessons learned & future vision.

  18. Integration of the Radiation Belt Environment Model Into the Space Weather Modeling Framework

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Toth, G.; Fok, M.; Gombosi, T.; Liemohn, M.

    2009-01-01

    We have integrated the Fok radiation belt environment (RBE) model into the space weather modeling framework (SWMF). RBE is coupled to the global magnetohydrodynamics component (represented by the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme, BATS-R-US, code) and the Ionosphere Electrodynamics component of the SWMF, following initial results using the Weimer empirical model for the ionospheric potential. The radiation belt (RB) model solves the convection-diffusion equation of the plasma in the energy range of 10 keV to a few MeV. In stand-alone mode RBE uses Tsyganenko's empirical models for the magnetic field, and Weimer's empirical model for the ionospheric potential. In the SWMF the BATS-R-US model provides the time dependent magnetic field by efficiently tracing the closed magnetic field-lines and passing the geometrical and field strength information to RBE at a regular cadence. The ionosphere electrodynamics component uses a two-dimensional vertical potential solver to provide new potential maps to the RBE model at regular intervals. We discuss the coupling algorithm and show some preliminary results with the coupled code. We run our newly coupled model for periods of steady solar wind conditions and compare our results to the RB model using an empirical magnetic field and potential model. We also simulate the RB for an active time period and find that there are substantial differences in the RB model results when changing either the magnetic field or the electric field, including the creation of an outer belt enhancement via rapid inward transport on the time scale of tens of minutes.

  19. A general framework for modeling growth and division of mammalian cells

    PubMed Central

    2011-01-01

    Background Modeling the cell-division cycle has been practiced for many years. As time has progressed, this work has gone from understanding the basic principles to addressing distinct biological problems, e.g., the nature of the restriction point, how checkpoints operate, the nonlinear dynamics of the cell cycle, the effect of localization, etc. Most models consist of coupled ordinary differential equations developed by the researchers, restricted to deal with the interactions of a limited number of molecules. In the future, cell-cycle modeling--and indeed all modeling of complex biologic processes--will increase in scope and detail. Results A framework for modeling complex cell-biologic processes is proposed here. The framework is based on two constructs: one describing the entire lifecycle of a molecule and the second describing the basic cellular machinery. Use of these constructs allows complex models to be built in a straightforward manner that fosters rigor and completeness. To demonstrate the framework, an example model of the mammalian cell cycle is presented that consists of several hundred differential equations of simple mass action kinetics. The model calculates energy usage, amino acid and nucleotide usage, membrane transport, RNA synthesis and destruction, and protein synthesis and destruction for 33 proteins to give an in-depth look at the cell cycle. Conclusions The framework presented here addresses how to develop increasingly descriptive models of complex cell-biologic processes. The example model of cellular growth and division constructed with the framework demonstrates that large structured models can be created with the framework, and these models can generate non-trivial descriptions of cellular processes. Predictions from the example model include those at both the molecular level--e.g., Wee1 spontaneously reactivates--and at the system level--e.g., pathways for timing-critical processes must shut down redundant pathways. A future effort is
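
    To give a flavour of the mass-action formulation mentioned above (not the authors' several-hundred-equation model), a three-species toy network with invented rate constants can be integrated as follows.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mass-action network: 0 -> A (synthesis), A + B -> C (binding), A, B, C -> 0 (degradation).
k_syn, k_bind, k_deg = 1.0, 0.5, 0.1    # invented rate constants

def rhs(t, y):
    A, B, C = y
    v_bind = k_bind * A * B
    return [k_syn - v_bind - k_deg * A,
            -v_bind - k_deg * B,
            v_bind - k_deg * C]

sol = solve_ivp(rhs, (0.0, 50.0), y0=[0.0, 5.0, 0.0])
print("concentrations of A, B, C at t = 50:", sol.y[:, -1])
```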

  20. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters. PMID:11152205
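
    For readers unfamiliar with the model types being unified, the sketch below implements a transiently chaotic neuron update with decaying self-coupling in the spirit of Chen and Aihara's CSA, applied to an invented four-neuron "exactly one active" constraint; the parameter values are illustrative only and are not those used in the paper.

```python
import numpy as np

# Transiently chaotic neural network with decaying self-coupling, on an invented
# 4-neuron constraint: "exactly one neuron should be active".
n = 4
W = -2.0 * (np.ones((n, n)) - np.eye(n))    # mutual inhibition between all neuron pairs
I = np.ones(n)                              # external bias favouring activation

k, alpha, eps, I0 = 0.9, 0.015, 0.004, 0.65 # illustrative parameter choices only
beta, z = 0.001, 0.08                       # self-coupling decay rate and initial strength

rng = np.random.default_rng(1)
y = rng.uniform(-1.0, 1.0, n)               # internal states

def output(y):
    return 1.0 / (1.0 + np.exp(-np.clip(y / eps, -60.0, 60.0)))

for t in range(3000):
    x = output(y)
    y = k * y + alpha * (W @ x + I) - z * (x - I0)   # chaotic neuron update
    z *= 1.0 - beta                                  # chaos is annealed away over time

# A well-tuned run typically settles toward a feasible pattern (one neuron near 1).
print("final neuron outputs:", np.round(output(y), 2))
```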

  1. Developing an Interdisciplinary Curriculum Framework for Aquatic-Ecosystem Modeling

    ERIC Educational Resources Information Center

    Saito, Laurel; Segale, Heather M.; DeAngelis, Donald L.; Jenkins, Stephen H.

    2007-01-01

    This paper presents results from a July 2005 workshop and course aimed at developing an interdisciplinary course on modeling aquatic ecosystems that will provide the next generation of practitioners with critical skills for which formal training is presently lacking. Five different course models were evaluated: (1) fundamentals/general principles…

  2. An integrated hydrologic modeling framework for coupling SWAT with MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Soil and Water Assessment Tool (SWAT), MODFLOW, and Energy Balance based Evapotranspiration (EB_ET) models are extensively used to estimate different components of the hydrological cycle. Surface and subsurface hydrological processes are modeled in SWAT but limited to the extent of shallow aquif...

  3. KINEROS2 and the AGWA Modeling Framework 2013

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Kinematic Runoff and Erosion Model, KINEROS2, is a distributed, physically-based, event model describing the processes of interception, dynamic infiltration, surface runoff, and erosion from watersheds characterized by predominantly overland flow. The watershed is conceptualized as a cascade of...

  4. A Model Framework for Course Materials Construction (Second Edition).

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    Designed for use by Coast Guard course writers, curriculum developers, course coordinators, and instructors as a decision-support system, this publication presents a model that translates the Intraservices Procedures for Instructional Systems Development curriculum design model into materials usable by classroom teachers and students. Although…

  5. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  6. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros; thus, programming experience is not essential for replication or upgrading of the model. Unlike the existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and its easy extension and connection with other similar models. We propose a first-in-first-served approach for the simulation of servicing multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. PMID:26004999

  7. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  8. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    SciTech Connect

    JIANG, YI

    2007-01-16

    Cancer remains one of the leading causes of death from disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulation, cellular level dynamics and intercellular interactions, and extracellular level chemical dynamics. The intracellular level protein regulations and signaling pathways are described by Boolean networks. The cellular level growth and division dynamics, cellular adhesion and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data in both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
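
    As a minimal illustration of the intracellular Boolean-network layer described above (the nodes and rules below are invented for the example, not taken from the actual model), a synchronous update can be written as:

```python
# Minimal synchronous Boolean network; nodes and rules are invented for illustration.
rules = {
    "GrowthSignal": lambda s: s["GrowthSignal"],                  # treated as a fixed input
    "CyclinD":      lambda s: s["GrowthSignal"] and not s["p21"], # activated by growth, blocked by p21
    "p21":          lambda s: not s["GrowthSignal"],              # induced when growth signal is absent
}

state = {"GrowthSignal": True, "CyclinD": False, "p21": True}
for step in range(5):
    # synchronous update: every rule is evaluated against the previous state
    state = {node: bool(rule(state)) for node, rule in rules.items()}
    print(step, state)
```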

  9. A framework for multi-criteria assessment of model enhancements

    NASA Astrophysics Data System (ADS)

    Francke, Till; Foerster, Saskia; Brosinsky, Arlena; Delgado, José; Güntner, Andreas; López-Tarazón, José A.; Bronstert, Axel

    2016-04-01

    Modellers are often faced with unsatisfactory model performance for a specific setup of a hydrological model. In these cases, the modeller may try to improve the setup by addressing selected causes of the model errors (i.e. data errors, structural errors). This leads to adding certain "model enhancements" (MEs), e.g. climate data based on more monitoring stations, improved calibration data, or modifications in process formulations. However, deciding on which MEs to implement remains a matter of expert knowledge, guided by some sensitivity analysis at best. When multiple MEs have been implemented, a resulting improvement in model performance is not easily attributed, especially when considering different aspects of this improvement (e.g. better performance dynamics vs. reduced bias). In this study we present an approach for comparing the effect of multiple MEs in the face of multiple improvement aspects. A stepwise selection approach and structured plots help in addressing the multidimensionality of the problem. The approach is applied to a case study, which employs the meso-scale hydrosedimentological model WASA-SED for a sub-humid catchment. The results suggest that the effect of the MEs is quite diverse, with some MEs (e.g. augmented rainfall data) causing improvements for almost all aspects, while the effect of other MEs is restricted to a few aspects or even deteriorates some. These specific results may not be generalizable. However, we suggest that studies like this one may facilitate identifying the most promising MEs to implement.

  10. The Conceptual Framework of Factors Affecting Shared Mental Model

    ERIC Educational Resources Information Center

    Lee, Miyoung; Johnson, Tristan; Lee, Youngmin; O'Connor, Debra; Khalil, Mohammed

    2004-01-01

    Many researchers have paid attention to the potentiality and possibility of the shared mental model because it enables teammates to perform their job better by sharing team knowledge, skills, attitudes, dynamics and environments. Even though theoretical and experimental evidences provide a close relationship between the shared mental model and…

  11. The Relational-Cultural Model: A Framework for Group Process

    ERIC Educational Resources Information Center

    Comstock, Dana L.; Duffey, Thelma; St. George, Holly

    2002-01-01

    The relational-cultural model of psychotherapy has been evolving for the past 20 years. Within this model, difficult group dynamics are conceptualized as the playing out of the central relational paradox. This paradox recognizes that an individual may yearn for connection but, out of a sense of fear, simultaneously employ strategies that restrict…

  12. A general framework for application of prestrain to computational models of biological materials.

    PubMed

    Maas, Steve A; Erdemir, Ahmet; Halloran, Jason P; Weiss, Jeffrey A

    2016-08-01

    It is often important to include prestress in computational models of biological tissues. The prestress can represent residual stresses (stresses that exist after the tissue is excised from the body) or in situ stresses (stresses that exist in vivo, in the absence of loading). A prestressed reference configuration may also be needed when modeling the reference geometry of biological tissues in vivo. This research developed a general framework for representing prestress in finite element models of biological materials. It is assumed that the material is elastic, allowing the prestress to be represented via a prestrain. For prestrain fields that are not compatible with the reference geometry, the computational framework provides an iterative algorithm for updating the prestrain until equilibrium is satisfied. The iterative framework allows two different constraints to be enforced: eliminating distortion in order to address the incompatibility issue, and enforcing a specified in situ fiber strain field while allowing for distortion. The framework was implemented as a plugin in FEBio (www.febio.org), making it easy to maintain the software and to extend the framework if needed. Several examples illustrate the application and effectiveness of the approach, including the application of in situ strains to ligaments in the Open Knee model (simtk.org/home/openknee). A novel method for recovering the stress-free configuration from the prestrain deformation gradient is also presented. This general purpose theoretical and computational framework for applying prestrain will allow analysts to overcome the challenges in modeling this important aspect of biological tissue mechanics. PMID:27131609
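
    The record gives no algorithmic detail, so the following one-dimensional two-spring toy is only an analogy for the idea of iterating a prestrain until a prescribed in situ strain is achieved; it is not the FEBio plugin's algorithm, and all quantities are invented.

```python
# Toy 1-D analogy: two linear springs in series between rigid walls at x = 0 and x = 2.
# Spring 1 is given a prestretch lam_p (stress-free length 1/lam_p); we iterate lam_p
# until its equilibrium in-situ stretch matches a prescribed target value.
k1, k2 = 1.0, 1.0          # spring stiffnesses (arbitrary)
target_stretch = 1.05      # prescribed in-situ fiber stretch
lam_p = 1.0                # initial guess for the prestretch

for it in range(50):
    u = (k2 + k1 / lam_p) / (k1 + k2)   # equilibrium position of the internal node
    achieved = u * lam_p                # resulting in-situ stretch of spring 1
    if abs(achieved - target_stretch) < 1e-10:
        break
    lam_p *= target_stretch / achieved  # fixed-point update of the prestretch

print(f"converged in {it} iterations: lam_p = {lam_p:.6f}, in-situ stretch = {achieved:.6f}")
```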

  13. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    NASA Astrophysics Data System (ADS)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-05-01

    The mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  14. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGESBeta

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  15. a Spatio-Temporal Framework for Modeling Active Layer Thickness

    NASA Astrophysics Data System (ADS)

    Touyz, J.; Streletskiy, D. A.; Nelson, F. E.; Apanasovich, T. V.

    2015-07-01

    The Arctic is experiencing an unprecedented rate of environmental and climate change. The active layer (the uppermost layer of soil between the atmosphere and permafrost that freezes in winter and thaws in summer) is sensitive to both climatic and environmental changes, and plays an important role in the functioning, planning, and economic activities of Arctic human and natural ecosystems. This study develops a methodology for modeling and estimating spatial-temporal variations in active layer thickness (ALT) using data from several sites of the Circumpolar Active Layer Monitoring network, and demonstrates its use in spatial-temporal interpolation. The simplest model's stochastic component exhibits no spatial or spatio-temporal dependency and is referred to as the naïve model, against which we evaluate the performance of the other models, which assume that the stochastic component exhibits either spatial or spatio-temporal dependency. The methods used to fit the models are then discussed, along with point forecasting. We compare the predictive fit of the various models at key study sites located on the North Slope of Alaska and demonstrate the advantages of space-time models through a series of error statistics: mean squared error, and mean absolute and percent deviance from observed data. We find that the difference in performance between the spatio-temporal and the remaining models is significant for all three error statistics. The best stochastic spatio-temporal model increases predictive accuracy, compared to the naïve model, by 33.3%, 36.2% and 32.5% on average across the three error metrics at the key sites for a one-year hold-out period.
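
    As an illustration of the comparison step only (the numbers are synthetic, and the study's exact definition of percent deviance may differ), the three error statistics can be computed for a candidate space-time prediction and the naïve baseline like so:

```python
import numpy as np

def error_stats(obs, pred):
    resid = obs - pred
    return (np.mean(resid ** 2),                     # mean squared error
            np.mean(np.abs(resid)),                  # mean absolute deviance
            100.0 * np.mean(np.abs(resid) / obs))    # mean absolute percent deviance

rng = np.random.default_rng(42)
obs = rng.normal(55.0, 8.0, size=200)                    # synthetic ALT "observations" (cm)
naive = np.full_like(obs, obs.mean())                    # naive model: overall mean only
space_time = obs + rng.normal(0.0, 4.0, obs.size)        # stand-in for a space-time model fit

for name, pred in [("naive", naive), ("spatio-temporal", space_time)]:
    mse, mae, mpd = error_stats(obs, pred)
    print(f"{name:>16}: MSE = {mse:6.2f}   MAE = {mae:5.2f}   %dev = {mpd:5.2f}")
```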

  16. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...

  17. The Community Earth System Model: A Framework for Collaborative Research

    SciTech Connect

    Hurrell, Jim; Holland, Marika M.; Gent, Peter R.; Ghan, Steven J.; Kay, Jennifer; Kushner, P.; Lamarque, J.-F.; Large, William G.; Lawrence, David M.; Lindsay, Keith; Lipscomb, William; Long, Matthew; Mahowald, N.; Marsh, D.; Neale, Richard; Rasch, Philip J.; Vavrus, Steven J.; Vertenstein, Mariana; Bader, David C.; Collins, William D.; Hack, James; Kiehl, J. T.; Marshall, Shawn

    2013-09-30

    The Community Earth System Model (CESM) is a flexible and extensible community tool used to investigate a diverse set of earth system interactions across multiple time and space scales. This global coupled model is a natural evolution from its predecessor, the Community Climate System Model, following the incorporation of new earth system capabilities. These include the ability to simulate biogeochemical cycles, atmospheric chemistry, ice sheets, and a high-top atmosphere. These and other new model capabilities are enabling investigations into a wide range of pressing scientific questions, providing new predictive capabilities and increasing our collective knowledge about the behavior and interactions of the earth system. Simulations with numerous configurations of the CESM have been provided to the Coupled Model Intercomparison Project Phase 5 (CMIP5) and are being analyzed by the broader community of scientists. Additionally, the model source code and associated documentation are freely available to the scientific community to use for earth system studies, making it a true community tool. Here we describe this earth modeling system, its various possible configurations, and illustrate its capabilities with a few science highlights.

  18. Brokering as a framework for hydrological model repeatability

    NASA Astrophysics Data System (ADS)

    Fuka, Daniel; Collick, Amy; MacAlister, Charlotte; Braeckel, Aaron; Wright, Dawn; Jodha Khalsa, Siri; Boldrini, Enrico; Easton, Zachary

    2015-04-01

    Data brokering aims to provide those in the sciences with quick and repeatable access to data that represent physical, biological, and chemical characteristics, specifically to accelerate scientific discovery. Environmental models are useful tools to understand the behavior of hydrological systems. Unfortunately, parameterization of these hydrological models requires many different data, from different sources, and from different disciplines (e.g., atmospheric, geoscience, ecology). In basin-scale hydrological modeling, the traditional procedure for model initialization starts with obtaining elevation models, land-use characterizations, soils maps, and weather data. It is often the researcher's past experience with these datasets that determines which datasets will be used in a study, even though newer or more suitable data products may exist. An added complexity is that various science communities have differing data formats, storage protocols, and manipulation methods, which makes use by a non-native user exceedingly difficult and time consuming. We demonstrate data brokering as a means to address several of these challenges. We present two test case scenarios in which researchers attempt to reproduce hydrological model results using 1) general internet-based data gathering techniques, and 2) a scientific data brokering interface. We show that data brokering can increase the efficiency with which data are obtained, models are initialized, and results are analyzed. As an added benefit, it appears brokering can significantly increase the repeatability of a given study.

  19. Bayesian model selection framework for identifying growth patterns in filamentous fungi.

    PubMed

    Lin, Xiao; Terejanu, Gabriel; Shrestha, Sajan; Banerjee, Sourav; Chanda, Anindya

    2016-06-01

    This paper describes a rigorous methodology for the quantification of model errors in fungal growth models. This is essential for choosing the model that best describes the data and for guiding modeling efforts. Mathematical modeling of the growth of filamentous fungi is necessary in fungal biology for gaining a systems-level understanding of hyphal and colony behaviors in different environments. A critical challenge in the development of these mathematical models arises from the indeterminate nature of their colony architecture, which is a result of processing diverse intracellular signals induced in response to a heterogeneous set of physical and nutritional factors. There exists a practical gap in connecting fungal growth models with measurement data. Here, we address this gap by introducing the first unified computational framework based on Bayesian inference that can quantify individual model errors and rank the statistical models based on their descriptive power against data. We show that this Bayesian model comparison is just a natural formalization of Occam's razor. The application of this framework is discussed by comparing three models in the context of synthetic data generated from a known true fungal growth model. This framework of model comparison achieves a trade-off between data fitness and model complexity, and the quantified model error not only helps in calibrating and comparing the models, but also in making better predictions and guiding model refinements. PMID:27000772
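
    A much-simplified stand-in for this kind of comparison (using BIC as a rough evidence approximation on synthetic data, rather than the paper's full Bayesian treatment, and with invented candidate models) looks like this:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(7)
t = np.linspace(1.0, 48.0, 40)                           # hours
y = 5.0 * t / (10.0 + t) + rng.normal(0.0, 0.1, t.size)  # synthetic colony-radius data

# Three candidate growth models (all invented for this toy comparison).
candidates = {
    "linear":     (lambda t, a, b: a * t + b,        [0.1, 0.0]),
    "saturating": (lambda t, a, b: a * t / (b + t),  [4.0, 5.0]),
    "power law":  (lambda t, a, b: a * t ** b,       [1.0, 0.5]),
}

n = t.size
for name, (f, p0) in candidates.items():
    popt, _ = curve_fit(f, t, y, p0=p0, maxfev=10_000)
    rss = np.sum((y - f(t, *popt)) ** 2)
    k = len(popt) + 1                                     # fitted parameters + noise variance
    bic = n * np.log(rss / n) + k * np.log(n)             # lower BIC = better fit/complexity trade-off
    print(f"{name:>10}: BIC = {bic:8.2f}")
```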

  20. A Model Framework for Science and Other Course Materials Construction.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model is presented to provide guidance for Coast Guard writers, curriculum developers, course coordinators, and instructors who intend to update, or draft course materials. Detailed instructions are provided for developing instructor's guides and student's guides. (CS)

  1. A model integration framework for linking SWAT and MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hydrological response and transport phenomena are driven by atmospheric, surface and subsurface processes. These complex processes occur at different spatiotemporal scales requiring comprehensive modeling to assess the impact of anthropogenic activity on hydrology and fate and transport of chemical ...

  2. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  3. Effective Thermal Conductivity Modeling of Sandstones: SVM Framework Analysis

    NASA Astrophysics Data System (ADS)

    Rostami, Alireza; Masoudi, Mohammad; Ghaderi-Ardakani, Alireza; Arabloo, Milad; Amani, Mahmood

    2016-06-01

    Among the most significant physical characteristics of porous media, the effective thermal conductivity (ETC) is used for estimating the efficiency of thermal enhanced oil recovery processes, hydrocarbon reservoir thermal design, and numerical simulation. This paper reports the implementation of an innovative least-squares support vector machine (LS-SVM) algorithm for the development of an enhanced model capable of predicting the ETCs of dry sandstones. By means of several statistical parameters, the validity of the presented model was evaluated. The prediction of the developed model for determining the ETCs of dry sandstones was in excellent agreement with the reported data, with a coefficient of determination (R²) of 0.983 and an average absolute relative deviation of 0.35%. Results from the present research show that the proposed LS-SVM model is robust, reliable, and efficient in calculating the ETCs of sandstones.
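
    LS-SVM regression amounts to solving a single linear KKT system; the sketch below implements that directly on synthetic data. It is not the authors' code, the synthetic porosity-to-ETC relation is invented, and the kernel width and regularisation values are arbitrary.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """LS-SVM regression: solve the (n+1) x (n+1) linear KKT system [[0, 1^T], [1, K + I/gamma]]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xnew: rbf_kernel(Xnew, X, sigma) @ alpha + b

# Synthetic stand-in data (NOT the paper's data set): porosity fraction -> ETC (W/m/K).
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 0.3, size=(60, 1))
y = 6.0 * (1.0 - X[:, 0]) ** 2 + rng.normal(0.0, 0.1, 60)

predict = lssvm_fit(X, y, gamma=50.0, sigma=0.1)
print("predicted ETC at porosity 0.15:", float(predict(np.array([[0.15]]))[0]))
```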

  4. CONCEPTUAL MODEL DEVELOPMENT AND INFORMATION MANAGEMENT FRAMEWORK FOR DIAGNOSTICS RESEARCH

    EPA Science Inventory

    Conceptual model development will focus on the effects of habitat alteration, nutrients,suspended and bedded sediments, and toxic chemicals on appropriate endpoints (individuals, populations, communities, ecosystems) across spatial scales (habitats, water body, watershed, region)...

  5. The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework

    PubMed Central

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-01-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  6. The Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP): project framework.

    PubMed

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-03-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  7. Model-driven CDA Clinical Document Development Framework.

    PubMed

    Li, Jingdong; Lincoln, Michael J

    2007-01-01

    The Health Level 7 (HL7) Clinical Document Architecture, Release 2 (CDA R2) standardizes the structure and semantics of clinical documents in order to permit interchange. We have applied this standard to generate a platform-independent CDA model and to write a toolset that permits model specialization and generation of XML implementation artifacts, and that provides an interface for clinical data managers. The resulting work was tested using US Department of Veterans Affairs Operative Note templates. PMID:18694129

  8. Multiscale Model of Colorectal Cancer Using the Cellular Potts Framework

    PubMed Central

    Osborne, James M

    2015-01-01

    Colorectal cancer (CRC) is one of the major causes of death in the developed world and forms a canonical example of tumorigenesis. CRC arises from a string of mutations of individual cells in the colorectal crypt, making it particularly suited for multiscale multicellular modeling, where mutations of individual cells can be clearly represented and their effects readily tracked. In this paper, we present a multicellular model of the onset of colorectal cancer, utilizing the cellular Potts model (CPM). We use the model to investigate how, through the modification of their mechanical properties, mutant cells colonize the crypt. Moreover, we study the influence of mutations on the shape of cells in the crypt, suggesting possible cell- and tissue-level indicators for identifying early-stage cancerous crypts. Crucially, we discuss the effect that the motility parameters of the model (key factors in the behavior of the CPM) have on the distribution of cells within a homeostatic crypt, resulting in an optimal parameter regime that accurately reflects biological assumptions. In summary, the key results of this paper are 1) how to couple the CPM with processes occurring on other spatial scales, using the example of the crypt to motivate suitable motility parameters; 2) modeling mutant cells with the CPM; and 3) investigating how mutations influence the shape of cells in the crypt. PMID:26461973
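
    For orientation, a bare-bones CPM step is a Metropolis-accepted copy of a neighbouring cell id, with an energy made of cell-cell adhesion and an area constraint. The toy below uses invented parameters and a random (non-contiguous) initialisation rather than the paper's crypt geometry and coupling.

```python
import numpy as np

rng = np.random.default_rng(0)
L, q = 40, 6                                     # lattice size and number of cells
lattice = rng.integers(1, q + 1, size=(L, L))    # toy initialisation: random cell ids
J, lam, T = 2.0, 1.0, 10.0                       # adhesion penalty, area stiffness, temperature
A_target = L * L / q                             # target area per cell

def adhesion_at(i, j, cid):
    """Adhesion energy contributed by site (i, j) if it carried cell id cid."""
    return sum(J for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
               if lattice[(i + di) % L, (j + dj) % L] != cid)

areas = np.array([np.sum(lattice == c) for c in range(q + 1)], dtype=float)  # index 0 unused

for step in range(20_000):
    i, j = rng.integers(0, L, size=2)
    di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
    src = lattice[(i + di) % L, (j + dj) % L]    # neighbour attempts to copy its id into (i, j)
    tgt = lattice[i, j]
    if src == tgt:
        continue
    # energy change: adhesion at the flipped site plus the two cells' area-constraint terms
    dE = adhesion_at(i, j, src) - adhesion_at(i, j, tgt)
    dE += lam * ((areas[src] + 1 - A_target) ** 2 - (areas[src] - A_target) ** 2)
    dE += lam * ((areas[tgt] - 1 - A_target) ** 2 - (areas[tgt] - A_target) ** 2)
    if dE <= 0 or rng.random() < np.exp(-dE / T):   # Metropolis acceptance
        lattice[i, j] = src
        areas[src] += 1
        areas[tgt] -= 1

print("cell areas after relaxation (target %.1f):" % A_target, areas[1:])
```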

  9. Model Adaptation for Prognostics in a Particle Filtering Framework

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works in large part because they are not subject to the "curse of dimensionality", i.e. the exponential growth of computational complexity with state dimension. In practice, however, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting the remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the trade-off between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
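
    As a generic illustration of parameter augmentation in a particle filter (a toy exponential-decay model, not the battery model used in the paper), the unknown rate constant is carried in the particle state and adapted jointly with the state:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "truth": an exponentially decaying quantity with unknown decay rate k.
k_true, x0, meas_std, proc_std, T = 0.05, 1.0, 0.02, 0.005, 60
x_true = x0 * np.exp(-k_true * np.arange(1, T + 1))
y = x_true + rng.normal(0.0, meas_std, T)

# Bootstrap particle filter with the model parameter k appended to the state vector,
# so state tracking and model adaptation happen jointly (parameter jitter = artificial dynamics).
N = 2000
particles = np.column_stack([rng.normal(1.0, 0.05, N),      # state x
                             rng.uniform(0.0, 0.2, N)])     # unknown parameter k

for t in range(T):
    # propagate: the state follows the model, the parameter performs a small random walk
    particles[:, 0] = particles[:, 0] * np.exp(-particles[:, 1]) + rng.normal(0.0, proc_std, N)
    particles[:, 1] = np.abs(particles[:, 1] + rng.normal(0.0, 0.002, N))
    # weight by the measurement likelihood and resample (sequential importance resampling)
    w = np.exp(-0.5 * ((y[t] - particles[:, 0]) / meas_std) ** 2)
    particles = particles[rng.choice(N, size=N, p=w / w.sum())]

print(f"estimated decay rate: {particles[:, 1].mean():.4f}  (true value {k_true})")
```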

  10. A full annual cycle modeling framework for American black ducks

    USGS Publications Warehouse

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.; Brook, Rodney W.; Huang, Min; Jones, Malcom; McAuley, Daniel G.; Zimmerman, Guthrie

    2016-01-01

    American black ducks (Anas rubripes) are a harvested, international migratory waterfowl species in eastern North America. Despite an extended period of restrictive harvest regulations, the black duck population is still below the population goal identified in the North American Waterfowl Management Plan (NAWMP). It has been hypothesized that density-dependent factors restrict population growth in the black duck population and that habitat management (increases, improvements, etc.) may be a key component of growing black duck populations and reaching the prescribed NAWMP population goal. Using banding data from 1951 to 2011 and breeding population survey data from 1990 to 2014, we developed a full annual cycle population model for the American black duck. This model uses the seven management units as set by the Black Duck Joint Venture, allows movement into and out of each unit during each season, and models survival and fecundity for each region separately. We compare model population trajectories with observed population data and abundance estimates from the breeding season counts to show the accuracy of this full annual cycle model. With this model, we then show how to simulate the effects of habitat management on the continental black duck population.