Science.gov

Sample records for quark-parton model framework

  1. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    DOE PAGES

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; et al

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W² > 4 GeV² and range in four-momentum transfer squared 2 < Q² < 4 (GeV/c)², and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, P_t² < 0.2 (GeV/c)². The invariant mass that goes undetected, M_x or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and P_t² dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged-pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π⁺ and π⁻) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse-momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.
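
    The kinematic cuts quoted in this record follow from the standard deep-inelastic scattering relations between beam energy, scattered-electron energy, and scattering angle. As a quick illustration (textbook formulas, not code from the paper):

```python
import math

M = 0.9383  # proton mass in GeV/c^2

def dis_kinematics(E, E_prime, theta):
    """Standard DIS kinematics for incident electron energy E (GeV),
    scattered energy E' (GeV), and scattering angle theta (rad)."""
    Q2 = 4.0 * E * E_prime * math.sin(theta / 2.0) ** 2  # four-momentum transfer squared
    nu = E - E_prime                                     # energy transfer
    x = Q2 / (2.0 * M * nu)                              # Bjorken scaling variable
    W2 = M * M + 2.0 * M * nu - Q2                       # invariant mass squared
    return Q2, x, W2
```

    For example, E = 5.5 GeV, E' = 1.8 GeV, θ = 0.55 rad lands inside all three cuts quoted above (2 < Q² < 4, 0.2 < x < 0.6, W² > 4 GeV²).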

  2. Semi-inclusive charged-pion electroproduction off protons and deuterons: Cross sections, ratios, and access to the quark-parton model at low energies

    SciTech Connect

    Asaturyan, R.; Ent, R.; Mkrtchyan, H.; Navasardyan, T.; Tadevosyan, V.; Adams, G. S.; Ahmidouch, A.; Angelescu, T.; Arrington, J.; Asaturyan, A.; Baker, O. K.; Benmouna, N.; Bertoncini, C.; Blok, H. P.; Boeglin, W. U.; Bosted, P. E.; Breuer, H.; Christy, M. E.; Connell, S. H.; Cui, Y.; Dalton, M. M.; Danagoulian, S.; Day, D.; Dunne, J. A.; Dutta, D.; El Khayari, N.; Fenker, H. C.; Frolov, V. V.; Gan, L.; Gaskell, D.; Hafidi, K.; Hinton, W.; Holt, R. J.; Horn, T.; Huber, G. M.; Hungerford, E.; Jiang, X.; Jones, M.; Joo, K.; Kalantarians, N.; Kelly, J. J.; Keppel, C. E.; Kubarovsky, V.; Li, Y.; Liang, Y.; Mack, D.; Malace, S. P.; Markowitz, P.; McGrath, E.; McKee, P.; Meekins, D. G.; Mkrtchyan, A.; Moziak, B.; Niculescu, G.; Niculescu, I.; Opper, A. K.; Ostapenko, T.; Reimer, P. E.; Reinhold, J.; Roche, J.; Rock, S. E.; Schulte, E.; Segbefia, E.; Smith, C.; Smith, G. R.; Stoler, P.; Tang, L.; Ungaro, M.; Uzzle, A.; Vidakovic, S.; Villano, A.; Vulcan, W. F.; Wang, M.; Warren, G.; Wesselmann, F. R.; Wojtsekhowski, B.; Wood, S. A.; Xu, C.; Yuan, L.; Zheng, X.

    2012-01-01

    A large set of cross sections for semi-inclusive electroproduction of charged pions (π±) from both proton and deuteron targets was measured. The data are in the deep-inelastic scattering region with invariant mass squared W² > 4 GeV² and range in four-momentum transfer squared 2 < Q² < 4 (GeV/c)², and cover a range in the Bjorken scaling variable 0.2 < x < 0.6. The fractional energy of the pions spans a range 0.3 < z < 1, with small transverse momenta with respect to the virtual-photon direction, P_t² < 0.2 (GeV/c)². The invariant mass that goes undetected, M_x or W', is in the nucleon resonance region, W' < 2 GeV. The new data conclusively show the onset of quark-hadron duality in this process, and the relation of this phenomenon to the high-energy factorization ansatz of electron-quark scattering and subsequent quark → pion production mechanisms. The x, z and P_t² dependences of several ratios (the ratios of favored-unfavored fragmentation functions, charged-pion ratios, deuteron-hydrogen and aluminum-deuteron ratios for π⁺ and π⁻) have been studied. The ratios are found to be in good agreement with expectations based upon a high-energy quark-parton model description. We find the azimuthal dependences to be small, as compared to exclusive pion electroproduction, and consistent with theoretical expectations based on tree-level factorization in terms of transverse-momentum-dependent parton distribution and fragmentation functions. In the context of a simple model, the initial transverse momenta of d quarks are found to be slightly smaller than for u quarks, while the transverse-momentum width of the favored fragmentation function is about the same as for the unfavored one, and both fragmentation widths are larger than the quark widths.

  3. Pion and kaon valence-quark parton distribution functions

    SciTech Connect

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-15

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio u_K(x)/u_π(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  4. Pion and kaon valence-quark parton distribution functions.

    SciTech Connect

    Nguyen, T.; Bashir, A.; Roberts, C. D.; Tandy, P. C.

    2011-06-16

    A rainbow-ladder truncation of QCD's Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion's u-quark distribution and to Drell-Yan data for the ratio u_K(x)/u_π(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  5. Pion and kaon valence-quark parton distribution functions

    NASA Astrophysics Data System (ADS)

    Nguyen, Trang; Bashir, Adnan; Roberts, Craig D.; Tandy, Peter C.

    2011-06-01

    A rainbow-ladder truncation of QCD’s Dyson-Schwinger equations, constrained by existing applications to hadron physics, is employed to compute the valence-quark parton distribution functions of the pion and kaon. Comparison is made to π-N Drell-Yan data for the pion’s u-quark distribution and to Drell-Yan data for the ratio uK(x)/uπ(x): the environmental influence of this quantity is a parameter-free prediction, which agrees well with existing data. Our analysis unifies the computation of distribution functions with that of numerous other properties of pseudoscalar mesons.

  6. Geologic Framework Model (GFM2000)

    SciTech Connect

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000, with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography, and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and the relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content.
The grid spacing used in the

  7. Dicyanometallates as Model Extended Frameworks

    PubMed Central

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  8. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  9. Environmental modeling framework invasiveness: analysis and implications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiv...

  10. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed

  11. Sequentially Executed Model Evaluation Framework

    2014-02-14

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modelling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed

  12. Sequentially Executed Model Evaluation Framework

    SciTech Connect

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.

  13. Sequentially Executed Model Evaluation Framework

    2015-10-20

    Provides a message passing framework between generic input, model and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework, and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in development of models which operate on sequential information, such as time-series, where evaluation is based on prior results combined with new data for this iteration. Has applications in quality monitoring, and was developed as part of the CANARY-EDS software, where real-time water quality data is being analyzed for anomalies.
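
    The driver/controller design described in these records can be sketched in a few lines. All class and function names below are illustrative stand-ins, not the actual SeMe API: generic input, model, and output drivers exchange values while a batch controller steps them through a discrete time domain, and the model evaluates each step from prior results plus new data.

```python
class InputDriver:
    """Generic input driver: yields one value per time step."""
    def __init__(self, series):
        self.series = iter(series)
    def read(self):
        return next(self.series, None)

class Model:
    """Sequential model: each evaluation combines prior state with new data
    (here, simple exponential smoothing as a stand-in)."""
    def __init__(self):
        self.state = 0.0
    def step(self, value):
        self.state = 0.9 * self.state + 0.1 * value
        return self.state

class OutputDriver:
    """Generic output driver: records each model result."""
    def __init__(self):
        self.log = []
    def write(self, result):
        self.log.append(result)

def batch_controller(inp, model, out):
    """Steps the model and I/O through the time domain until input ends."""
    while (value := inp.read()) is not None:
        out.write(model.step(value))
    return out.log
```

    A real-time controller would differ only in blocking on live data (e.g., a sensor stream) rather than iterating over a finite series.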

  14. Pion valence-quark parton distribution function

    NASA Astrophysics Data System (ADS)

    Chang, Lei; Thomas, Anthony W.

    2015-10-01

    Within the Dyson-Schwinger equation formulation of QCD, a rainbow-ladder truncation is used to calculate the pion valence-quark distribution function (PDF). The gap equation is renormalized at a typical hadronic scale, of order 0.5 GeV, which is also set as the default initial scale for the pion PDF. We implement a corrected leading-order expression for the PDF which ensures that the valence quarks carry all of the pion's light-front momentum at the initial scale. The scaling behavior of the pion PDF at a typical partonic scale of order 5.2 GeV is found to be (1 − x)^ν, with ν ≃ 1.6, as x approaches one.

  15. Geologic Framework Model Analysis Model Report

    SciTech Connect

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  16. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

    The current standard Unified Modeling Language (UML) could not model framework flexibility and extendability adequately due to a lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  17. The Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    de la Torre, Jimmy

    2011-01-01

    The G-DINA ("generalized deterministic inputs, noisy and gate") model is a generalization of the DINA model with more relaxed assumptions. In its saturated form, the G-DINA model is equivalent to other general models for cognitive diagnosis based on alternative link functions. When appropriate constraints are applied, several commonly used…

  18. A Traceability Framework to facilitate model evaluation

    NASA Astrophysics Data System (ADS)

    Luo, Yiqi; Xia, Jianyang; Hararuk, Sasha; Wang, Ying Ping

    2013-04-01

    Land models have been developed to account for more and more processes, making their complex structures difficult to understand and evaluate. Here we introduce a framework to decompose a complex land model into traceable components based on their mutually independent properties of modeled biogeochemical processes. The framework traces modeled ecosystem carbon storage capacity (X_ss) to (1) a product of net primary productivity (NPP) and ecosystem residence time (τ_E). The latter, τ_E, can be further traced to (2) baseline carbon residence times (τ′_E), which are usually preset in a model according to vegetation characteristics and soil types, (3) environmental scalars (ξ) including temperature and water scalars, and (4) environmental forcings. We have applied the framework to the Australian Community Atmosphere Biosphere Land Exchange (CABLE) model to help understand differences in modeled carbon processes among biomes and as influenced by nitrogen processes. Our framework could be used to facilitate data-model comparisons and model intercomparisons via tracking a few traceable components for all terrestrial carbon cycle models.
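
    The decomposition this record describes amounts to X_ss = NPP × τ_E, with the realized residence time obtained from the baseline residence time and the environmental scalar. The sketch below assumes the common convention that the scalar multiplies turnover rates, so τ_E = τ′_E / ξ; that convention, and the numbers, are illustrative rather than taken from the paper:

```python
def carbon_storage_capacity(npp, tau_E_baseline, xi):
    """Traceable decomposition: X_ss = NPP * tau_E, where tau_E is the
    baseline residence time divided by the environmental scalar xi
    (0 < xi <= 1; smaller xi means slower turnover, longer residence)."""
    tau_E = tau_E_baseline / xi
    return npp * tau_E
```

    For instance, NPP = 0.5 kgC m⁻² yr⁻¹ with τ′_E = 20 yr and ξ = 0.5 gives a realized residence time of 40 yr and a storage capacity of 20 kgC m⁻².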

  19. Knowledge Encapsulation Framework for Collaborative Social Modeling

    SciTech Connect

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based task. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  20. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. 
The modeling information can also be exported to semantic web languages such

  1. Rethinking modeling framework design: object modeling system 3.0

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  2. Modelling Diffusion of a Personalized Learning Framework

    ERIC Educational Resources Information Center

    Karmeshu; Raman, Raghu; Nedungadi, Prema

    2012-01-01

    A new modelling approach for diffusion of personalized learning as an educational process innovation in social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…

  3. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is networks. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that are immediately and sensibly applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, the key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches followed so far lack.

  4. DANA: distributed numerical and adaptive modelling framework.

    PubMed

    Rougier, Nicolas P; Fix, Jérémy

    2012-01-01

    DANA is a python framework ( http://dana.loria.fr ) whose computational paradigm is grounded on the notion of a unit, essentially a set of time-dependent values varying under the influence of other units via adaptive weighted connections. The evolution of a unit's values is defined by a set of differential equations expressed in standard mathematical notation, which greatly eases their definition. The units are organized into groups that form a model. Each unit can be connected to any other unit (including itself) using a weighted connection. The DANA framework offers a set of core objects needed to design and run such models. The modeler only has to define the equations of a unit as well as the equations governing the training of the connections. The simulation is completely transparent to the modeler and is handled by DANA. This allows DANA to be used for a wide range of numerical and distributed models as long as they fit the proposed framework (e.g. cellular automata, reaction-diffusion systems, decentralized neural networks, recurrent neural networks, kernel-based image processing, etc.).
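
    The computational paradigm this record describes, time-dependent unit values evolving under differential equations driven by weighted connections, can be sketched minimally as follows. The function and equation below are an illustration of the idea, not DANA's actual API:

```python
def simulate(V0, W, tau=1.0, dt=0.1, steps=100):
    """Euler-integrate dV_i/dt = (-V_i + sum_j W[i][j] * V_j) / tau
    for a group of units with values V and connection weights W."""
    V = list(V0)
    n = len(V)
    for _ in range(steps):
        # weighted input each unit receives from the others (and itself)
        drive = [sum(W[i][j] * V[j] for j in range(n)) for i in range(n)]
        V = [V[i] + dt * (-V[i] + drive[i]) / tau for i in range(n)]
    return V
```

    With zero weights every unit's value decays to rest; with a unit self-connection the leak and the drive cancel and the value persists, which is the kind of dynamics such frameworks are built to express.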

  5. An evaluation framework for participatory modelling

    NASA Astrophysics Data System (ADS)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). 
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  6. A framework for benchmarking land models

    SciTech Connect

    Luo, Yiqi; Randerson, James T.; Hoffman, Forrest; Norby, Richard J

    2012-01-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their skill in simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluating land model performance and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics for measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues to weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate.
The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
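
    One simple way to realize the "scoring system" component described in this record is to map a normalized model-benchmark mismatch onto a 0-1 skill score. The metric below is a hypothetical illustration, not the one proposed in the paper:

```python
import math

def benchmark_score(model, observed, threshold):
    """Illustrative skill score: root-mean-square error between model output
    and a benchmark, scaled by an a priori acceptable-error threshold and
    mapped to (0, 1]; 1 means a perfect match, decaying with mismatch."""
    n = len(observed)
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed)) / n)
    return math.exp(-rmse / threshold)
```

    Scores of this form can be computed per process and per scale, then combined (e.g., by weighted averaging) into an overall model score, which is the role the abstract assigns to the metrics component.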

  8. A Framework for Considering Comprehensibility in Modeling.

    PubMed

    Gleicher, Michael

    2016-06-01

    Comprehensibility in modeling is the ability of stakeholders to understand relevant aspects of the modeling process. In this article, we provide a framework to help guide exploration of the space of comprehensibility challenges. We consider facets organized around key questions: Who is comprehending? Why are they trying to comprehend? Where in the process are they trying to comprehend? How can we help them comprehend? How do we measure their comprehension? For each facet, we consider the broad range of options. We discuss why taking a broad view of comprehensibility in modeling is useful in identifying challenges and opportunities for solutions. PMID:27441712

  9. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy and the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
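
    The central computation, maximizing entropy subject to an expected-value constraint, with the price emerging as the Lagrange multiplier, can be sketched for a single constraint. The discrete goods levels and target mean below are hypothetical:

```python
import math

def maxent_dist(goods, mean_target, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over discrete goods levels subject to a
    fixed expected value; the Lagrange multiplier plays the role of a price."""
    def mean_at(lam):
        w = [math.exp(-lam * g) for g in goods]
        z = sum(w)
        return sum(g * wi for g, wi in zip(goods, w)) / z

    # Bisection: mean_at is monotone decreasing in lam.
    lo, hi = lam_lo, lam_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * g) for g in goods]
    z = sum(w)
    return lam, [wi / z for wi in w]

# Scarcity (target mean below the uniform mean) yields a positive "price".
lam, p = maxent_dist([0.0, 1.0, 2.0, 3.0], mean_target=1.0)
```

    The resulting distribution is the familiar Gibbs/exponential form; a uniform distribution (maximum uniformity of goods) corresponds to a vanishing multiplier, in line with the paper's reading of entropy as a measure of uniformity.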

  10. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
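
    The rehostable-component concept can be illustrated with a minimal interface sketch. Python is used here for brevity (LaSRS++ itself is C++), and the class and method names below are invented for illustration, not taken from LaSRS++:

```python
from abc import ABC, abstractmethod

class RehostableModel(ABC):
    """Illustrative interface for a rehostable modeling component: the host
    application drives the component without knowing the model internals."""

    @abstractmethod
    def initialize(self, initial_state: dict) -> None: ...

    @abstractmethod
    def step(self, dt: float) -> None: ...

    @abstractmethod
    def outputs(self) -> dict: ...

class PointMassVehicle(RehostableModel):
    """Toy vehicle plus a uniform-gravity 'environment', bundled together so
    every host application sees identical behavior."""
    G = 9.81  # environment model: constant gravity (m/s^2)

    def initialize(self, initial_state):
        self.h = initial_state.get("altitude", 0.0)
        self.v = initial_state.get("vertical_speed", 0.0)

    def step(self, dt):
        self.v -= self.G * dt        # environment acts on the vehicle
        self.h += self.v * dt        # vehicle state propagates

    def outputs(self):
        return {"altitude": self.h, "vertical_speed": self.v}

# Any host (pilot-in-the-loop sim, batch analysis tool) drives it the same way:
sim = PointMassVehicle()
sim.initialize({"altitude": 100.0})
for _ in range(10):
    sim.step(0.1)
```

    Because the gravity model travels inside the component rather than being re-engineered per application, every host reproduces the same trajectory, which is exactly the deployment property the paper argues for.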

  11. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  12. A Smallholder Socio-hydrological Modelling Framework

    NASA Astrophysics Data System (ADS)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders, who own merely a third of total farmland and belong to the poorest quartile yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of the 6 main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility, and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example, adjusting expenditures on food and fertilizers, or selling livestock) that smallholders use when they face adverse socio-hydrological conditions, such as low annual rainfall, high intra-annual variability in rainfall, or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed the suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve farmers' debt are enough, and the value of investing in local storage that can buffer intra-annual variability in rainfall and of strengthening safety nets, either by creating opportunities for alternative sources of income or by crop diversification.
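
    The coupled-dynamics-plus-rules structure can be sketched as a toy discrete-time model. The state variables echo the paper's list, but the equations, thresholds, and parameter values below are invented for illustration, not the study's calibrated model:

```python
def simulate_smallholder(years, rainfall, crop_price):
    """Toy annual update of a coupled smallholder system with one
    rule-based adaptation (selling livestock when capital runs low)."""
    state = {"storage": 1.0, "capital": 1.0, "livestock": 5.0, "fertility": 1.0}
    history = []
    for t in range(years):
        water = min(state["storage"] + rainfall[t], 2.0)   # local storage buffers rainfall
        harvest = water * state["fertility"]               # water-limited crop production
        state["capital"] += harvest * crop_price[t] - 0.8  # income minus fixed expenditure
        # Rule-based adaptation under adverse conditions:
        if state["capital"] < 0.2 and state["livestock"] > 0:
            state["livestock"] -= 1.0
            state["capital"] += 0.5
        state["storage"] = max(water - 0.5 * harvest, 0.0)         # carry-over water
        state["fertility"] = max(0.5, state["fertility"] - 0.02)   # slow soil depletion
        history.append(dict(state))
    return history

# A dry year (0.4) and a price slump (0.4) stress the household:
hist = simulate_smallholder(5, rainfall=[1.0, 0.4, 1.2, 0.3, 1.0],
                            crop_price=[0.6, 0.6, 0.4, 0.6, 0.6])
```

    Running such a sketch under rainfall and price scenarios is the kind of experiment the framework enables, e.g. comparing trajectories with and without added local storage.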

  13. Field-theoretical description of deep inelastic scattering

    SciTech Connect

    Geyer, B.; Robaschik, D.; Wieczorek, E.

    1980-01-01

    The most important theoretical notions concerning deep inelastic scattering are reviewed. Topics discussed are the model-independent approach, which is based on the general principles of quantum field theory; the application of quantum chromodynamics to deep inelastic scattering; approaches based on the quark-parton model, the light cone algebra, and conformal invariance; and also investigations in the framework of perturbation theory.
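
    For reference, in the quark-parton model discussed here the nucleon structure function is an incoherent, charge-weighted sum over quark distributions, with the Callan-Gross relation following for spin-1/2 quarks:

```latex
F_2(x) = \sum_q e_q^2 \, x \left[ q(x) + \bar{q}(x) \right],
\qquad F_2(x) = 2x\,F_1(x),
```

    where $q(x)$ and $\bar{q}(x)$ are the quark and antiquark distribution functions, $e_q$ is the quark charge, and $x$ is the Bjorken scaling variable.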

  14. Electroweak radiative effects in the single $W$-production at Tevatron and LHC

    SciTech Connect

    I. Akushevich; A. Ilyichev; N. Shumeiko; V. Zykunov

    2003-08-01

    An alternative calculation of the lowest-order electroweak radiative corrections to single W-boson production in hadron-hadron collisions, carried out in the framework of the quark-parton model without absorbing the collinear quark singularity into the parton distributions, is presented. A numerical analysis under Tevatron and LHC kinematic conditions is performed.
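
    The leading-order quark-parton-model expression that such calculations start from is, schematically, the convolution of parton distribution functions with the partonic cross section:

```latex
\sigma(h_1 h_2 \to W + X) =
  \sum_{q,\bar{q}'} \int_0^1 \mathrm{d}x_1 \int_0^1 \mathrm{d}x_2 \,
  f_{q/h_1}(x_1)\, f_{\bar{q}'/h_2}(x_2)\,
  \hat{\sigma}(q\bar{q}' \to W),
```

    where the $f$'s are the parton distributions; the approach described above keeps the collinear quark singularity in the hard cross section rather than absorbing it into the $f$'s.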

  15. PARCC Model Content Frameworks: Mathematics--Grades 3-11

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    As part of its proposal to the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) committed to developing model content frameworks for mathematics to serve as a bridge between the Common Core State Standards and the PARCC assessments. The PARCC Model Content Frameworks were developed through a…

  16. Critical Thinking: Frameworks and Models for Teaching

    ERIC Educational Resources Information Center

    Fahim, Mansoor; Eslamdoost, Samaneh

    2014-01-01

    Developing critical thinking since the educational revolution gave rise to flourishing movements toward embedding critical thinking (CT henceforth) stimulating classroom activities in educational settings. Nevertheless the process faced with complications such as teachability potentiality, lack of practical frameworks concerning actualization of…

  17. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
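
    Of the listed components, orbital propagation is the easiest to sketch. The two-body step below is a deliberately minimal stand-in for the framework's propagators, which would add perturbations (drag, J2, solar radiation pressure) omitted here:

```python
MU = 398600.4418  # Earth gravitational parameter, km^3/s^2

def two_body_step(r, v, dt):
    """One semi-implicit Euler step of two-body orbital motion (state in km, km/s)."""
    d = (r[0] ** 2 + r[1] ** 2 + r[2] ** 2) ** 0.5
    a = [-MU * ri / d ** 3 for ri in r]                # central gravity
    v = [vi + ai * dt for vi, ai in zip(v, a)]         # kick
    r = [ri + vi * dt for ri, vi in zip(r, v)]         # drift
    return r, v

# Circular LEO at 7000 km radius: v = sqrt(MU / r) ~ 7.546 km/s
r, v = [7000.0, 0.0, 0.0], [0.0, (MU / 7000.0) ** 0.5, 0.0]
for _ in range(600):
    r, v = two_body_step(r, v, dt=1.0)
```

    A massively parallel SSA simulation would run many thousands of such propagations (one per tracked object or debris fragment) concurrently.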

  18. A traceability framework for diagnostics of global land models

    NASA Astrophysics Data System (ADS)

    Luo, Yiqi; Xia, Jianyang; Liang, Junyi; Jiang, Lifen; Shi, Zheng; KC, Manoj; Hararuk, Oleksandra; Rafique, Rashid; Wang, Ying-Ping

    2015-04-01

    The biggest impediment to model diagnostics and improvement at present is model intractability. The more processes incorporated, the more difficult it becomes to understand or evaluate model behavior. As a result, uncertainty in predictions among global land models cannot be easily diagnosed and attributed to its sources. We have recently developed an approach to analytically decompose a complex land model into traceable components based on mutually independent properties of the modeled core biogeochemical processes. As all global land carbon models share those common properties, this traceability framework is applicable to all of them to improve their tractability. Indeed, we have applied the traceability framework to improve model diagnostics in several respects. First, we applied the framework to the Australian Community Atmosphere Biosphere Land Exchange (CABLE) model and the Community Land Model version 3.5 (CLM3.5) to identify the sources of differences between the two models. The major causes of the different predictions were found to be parameter settings related to carbon input and baseline residence times. Second, we used the framework to diagnose the impacts of adding nitrogen processes to CABLE on its carbon simulation. Adding nitrogen processes not only reduces net primary production but also shortens residence times in the CABLE model. Third, the framework helps isolate components of CLM3.5 for data assimilation. Data assimilation with global land models has been computationally extremely difficult. By isolating traceable components, we improved the parameterization of CLM3.5 related to soil organic matter decomposition, microbial kinetics and carbon use efficiency, and litter decomposition. Further, we are currently developing the traceability framework to analyze transient simulations of the models involved in the Coupled Model Intercomparison Project Phase 5 (CMIP5) to improve our understanding of the parameter space of global carbon models.
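
    The decomposition at the heart of such traceability analyses can be written down directly: steady-state land carbon storage equals carbon input (NPP) times the baseline residence time, so an inter-model difference factors exactly into those two contributions. The numbers below are hypothetical:

```python
def steady_state_carbon(npp, residence_time):
    """Traceable decomposition: steady-state carbon storage = input x residence time."""
    return npp * residence_time

# Two hypothetical models differing in both components (Pg C / yr, yr):
x_a = steady_state_carbon(npp=60.0, residence_time=25.0)   # model A
x_b = steady_state_carbon(npp=55.0, residence_time=30.0)   # model B

# Symmetric first-order attribution of the storage difference:
d_input = (55.0 - 60.0) * 0.5 * (25.0 + 30.0)   # contribution of the NPP difference
d_tau   = (30.0 - 25.0) * 0.5 * (60.0 + 55.0)   # contribution of the residence-time difference
```

    Because storage is bilinear in the two factors, the symmetric split attributes the difference exactly; this is the sense in which uncertainty can be traced to carbon input versus residence time.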

  19. A framework for modeling human evolution.

    PubMed

    Gintis, Herbert

    2016-01-01

    Culture-led gene-culture coevolution is a framework within which substantive explanations of human evolution must be located. It is not itself an explanation. Explanations depend on concrete historical evolutionary factors such as the control of fire, collective child-rearing, lethal weapon technology, altruistic cooperation and punishment, and the mastery of complex collaboration protocols leading to an effective division of social labor. PMID:27561218

  20. A framework for modeling uncertainty in regional climate change

    EPA Science Inventory

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  1. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  2. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    NASA Astrophysics Data System (ADS)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as additional processes in the hydrology, vegetation, wetland and barrier island morphology subroutines, increased spatial resolution, and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration to a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g., precipitation, eustatic sea-level rise, subsidence, tropical storms) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.
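
    HSIs of the kind mentioned above are typically computed by combining component suitability indices, each scaled to [0, 1]; a common formulation is a geometric mean, sketched here as an illustration rather than the ICM's exact equations:

```python
def habitat_suitability(si_values):
    """Habitat suitability index as the geometric mean of component
    suitability indices, each expected to lie in [0, 1]."""
    prod = 1.0
    for si in si_values:
        if not 0.0 <= si <= 1.0:
            raise ValueError("suitability indices must lie in [0, 1]")
        prod *= si
    return prod ** (1.0 / len(si_values))

# Hypothetical components for one grid cell: salinity, depth, vegetation cover
hsi = habitat_suitability([0.8, 0.5, 0.9])
```

    A geometric mean makes any single unsuitable condition (SI near zero) drive the overall index toward zero, which is why it is a popular choice for habitat models.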

  3. Theoretical Frameworks for Multiscale Modeling and Simulation

    PubMed Central

    Zhou, Huan-Xiang

    2014-01-01

    Biomolecular systems have been modeled at a variety of scales, ranging from explicit treatment of electrons and nuclei to continuum description of bulk deformation or velocity. Many challenges of interfacing between scales have been overcome. Multiple models at different scales have been used to study the same system or calculate the same property (e.g., channel conductance). Accurate modeling of biochemical processes under in vivo conditions and the bridging of molecular and subcellular scales will likely soon become reality. PMID:24492203

  4. A Computational Framework for Realistic Retina Modeling.

    PubMed

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas. PMID:27354192
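
    The building-block idea can be illustrated by composing a few toy microcircuits. The three stages below (leaky integration, divisive normalization, rectification) are simplified stand-ins for the platform's actual microcircuits, with invented parameter values:

```python
def low_pass(signal, alpha=0.3):
    """Temporal low-pass microcircuit (leaky integrator), one building block."""
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def gain_control(signal, sigma=1.0):
    """Divisive-normalization microcircuit: adaptation to mean intensity."""
    mean = sum(abs(x) for x in signal) / len(signal)
    return [x / (sigma + mean) for x in signal]

def rectify(signal):
    """Half-wave rectification, as at an ON-pathway synapse."""
    return [max(0.0, x) for x in signal]

# Compose the reusable microcircuits into a toy ON-pathway model:
stimulus = [0.0, 1.0, 1.0, 1.0, 0.0, -1.0, 0.0]
response = rectify(gain_control(low_pass(stimulus)))
```

    Different retina functions would then be modeled by rewiring the same blocks in different orders, which is the reuse hypothesis the paper tests.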

  6. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  7. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field, which permits analysis of future concepts from a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  8. Modeling Spatial Relationships within a Fuzzy Framework.

    ERIC Educational Resources Information Center

    Petry, Frederick E.; Cobb, Maria A.

    1998-01-01

    Presents a model for representing and storing binary topological and directional relationships between 2-dimensional objects that is used to provide a basis for fuzzy querying capabilities. A data structure called an abstract spatial graph (ASG) is defined for the binary relationships that maintains all necessary information regarding topology and…

  9. Evolutionary Framework for Lepidoptera Model Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    “Model systems” are specific organisms upon which detailed studies have been conducted to examine a fundamental biological question. If the studies are robust, their results can be extrapolated to an array of organisms that possess features in common with the subject organism. The true power of...

  10. Theoretical Tinnitus Framework: A Neurofunctional Model.

    PubMed

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  14. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM), since each modelling language highlights different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language has become a difficult task because of the large number of modelling languages available and the lack of guidelines for evaluating and comparing languages to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. The framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach for selecting the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but rather demonstrates how two existing approaches can be combined to solve the problem of selecting a modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
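
    As an illustrative sketch of how criterion-based evaluation and multicriteria aggregation can be combined, the snippet below ranks candidate languages with a simple weighted sum. The criteria, weights, and scores are invented for illustration; the paper's actual SEQUAL criteria and MCDA method may differ.

```python
# Minimal weighted-sum MCDA sketch for ranking process modelling languages.
# Criteria names, weights, and 0-10 scores below are illustrative assumptions,
# not values taken from the paper.

criteria_weights = {"expressiveness": 0.4, "ease_of_use": 0.35, "tool_support": 0.25}

scores = {
    "BPMN":      {"expressiveness": 9, "ease_of_use": 6, "tool_support": 9},
    "EPC":       {"expressiveness": 6, "ease_of_use": 8, "tool_support": 6},
    "Petri net": {"expressiveness": 8, "ease_of_use": 4, "tool_support": 5},
}

def weighted_score(lang):
    """Aggregate one language's criterion scores into a single value."""
    return sum(criteria_weights[c] * scores[lang][c] for c in criteria_weights)

# Languages ranked from best to worst aggregate score
ranking = sorted(scores, key=weighted_score, reverse=True)
```

    Real MCDA methods (e.g., outranking approaches) are more elaborate than a weighted sum, but the structure of the decision problem is the same: alternatives, criteria, weights, and an aggregation rule.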

  15. Mathematical models for biodegradation of chlorinated solvents. 1: Model framework

    SciTech Connect

    Zhang, X.; Banerji, S.; Bajpai, R.

    1996-12-31

    Complete mineralization of chlorinated solvents by microbial action has been demonstrated under aerobic as well as anaerobic conditions. In most cases, it is believed that the biodegradation is initiated by broad-specificity enzymes involved in the metabolism of a primary substrate. Under aerobic conditions, some of the primary carbon and energy substrates are methane, propane, toluene, phenol, and ammonia; under anaerobic conditions, glucose, sucrose, acetate, propionate, isopropanol, methanol, and even natural organics act as the carbon source. Published biochemical studies suggest that the limiting step is often the initial part of the biodegradation pathway within the microbial system. For aerobic systems, the limiting step is thought to be the reaction catalyzed by mono- and dioxygenases, which are induced by most primary substrates, although some constitutive strains have been reported. Other critical features of the biodegradative pathway include: (1) activity losses of critical enzyme(s) through the action of metabolic byproducts, (2) energetic needs of contaminant biodegradation which must be met by catabolism of the primary substrates, (3) changes in metabolic patterns in mixed cultures found in nature depending on the availability of electron acceptors, and (4) the associated accumulation and disappearance of metabolic intermediates. Often, the contaminant pool itself consists of several chlorinated solvents with separate and interactive biochemical needs. The existing models address some of the issues mentioned above, but their ability to successfully predict the biological fate of chlorinated solvents in nature remains severely limited. Limiting step(s), inactivation of critical enzymes, recovery action, energetics, and a framework for multiple degradative pathways will be presented as a comprehensive model. 91 refs.
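
    The rate-limiting enzymatic step described above is commonly modeled with Monod-type kinetics. The sketch below integrates such a rate law for a single contaminant; the rate constant, half-saturation constant, and biomass value are illustrative assumptions, not parameters from the paper.

```python
# Sketch of a single rate-limiting degradation step with Monod-type kinetics:
# dC/dt = -k * X * C / (Ks + C), where C is contaminant concentration and
# X is biomass. All parameter values are illustrative assumptions.

def simulate(C0=1.0, X=0.1, k=2.0, Ks=0.5, dt=0.01, t_end=5.0):
    """Forward-Euler integration of the Monod rate law; returns final C."""
    C, t = C0, 0.0
    while t < t_end:
        C += dt * (-k * X * C / (Ks + C))
        t += dt
    return C

C_final = simulate()   # contaminant remaining after t_end time units
```

    A comprehensive model of the kind the abstract proposes would couple several such rate laws (one per solvent and pathway) with enzyme-inactivation and energetics terms.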

  16. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and an idea of the scope of the first-order framework applicability.
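
    The "first-order" qualifier refers to a consistency condition: the cheap model, after correction, must match the expensive model's value and gradient at the current iterate. A minimal sketch of an additive first-order correction follows; the two model functions are illustrative stand-ins, not the aerodynamic models from the paper.

```python
# First-order additive correction: shift and tilt a low-fidelity model so it
# agrees with the high-fidelity model in value AND slope at a point x0 -- the
# consistency condition first-order AMMO frameworks rely on.
# The model functions here are illustrative stand-ins.

def hi(x):       return x**4        # "high-fidelity" model (assumed)
def hi_grad(x):  return 4 * x**3
def lo(x):       return x**2        # cheap low-fidelity model (assumed)
def lo_grad(x):  return 2 * x

def corrected_lo(x, x0):
    """lo(x) corrected to agree with hi at x0 to first order."""
    return lo(x) + (hi(x0) - lo(x0)) + (hi_grad(x0) - lo_grad(x0)) * (x - x0)

x0 = 1.0
# At the trust-region centre the surrogate matches the high-fidelity value:
match_value = corrected_lo(x0, x0) == hi(x0)
```

    Inside a trust-region loop, most optimization steps are then taken on `corrected_lo`, with `hi` evaluated only to re-center and re-correct, which is where the reported savings come from.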

  17. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provides an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  18. The QCAD Framework for Quantum Device Modeling

    SciTech Connect

    Gao, Xujiao; Nielsen, Erik; Muller, Richard P; Young, Ralph Watson; Salinger, Andrew G; Carroll, Malcolm S

    2012-06-01

    We present the Quantum Computer Aided Design (QCAD) simulator that targets modeling quantum devices, particularly Si double quantum dots (DQDs) developed for quantum computing. The simulator core includes Poisson, Schrodinger, and Configuration Interaction solvers which can be run individually or combined self-consistently. The simulator is built upon Sandia-developed Trilinos and Albany components, and is interfaced with the Dakota optimization tool. It is being developed for seamless integration, high flexibility and throughput, and is intended to be open source. The QCAD tool has been used to simulate a large number of fabricated silicon DQDs and has provided fast feedback for design comparison and optimization.
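
    A core building block of such a device simulator is a Poisson solve over the device domain. The sketch below solves a 1-D Poisson problem with a tridiagonal (Thomas) solver; it is a generic illustration of the numerical kernel, not QCAD's actual implementation, which uses Trilinos components.

```python
# A minimal 1-D finite-difference Poisson solve, the kind of kernel a device
# simulator couples self-consistently with a Schrodinger solver. Uniform unit
# charge density and zero Dirichlet boundaries are assumed for brevity.

def poisson_1d(rho, h):
    """Solve -phi'' = rho on interior points with phi = 0 at both ends,
    using the Thomas algorithm on the tridiagonal (-1, 2, -1) system."""
    n = len(rho)
    a, b, c = [-1.0] * n, [2.0] * n, [-1.0] * n   # sub-, main-, super-diagonal
    d = [r * h * h for r in rho]
    for i in range(1, n):                          # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    phi = [0.0] * n                                # back substitution
    phi[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - c[i] * phi[i + 1]) / b[i]
    return phi

# 99 interior points on the unit interval; exact solution is x(1-x)/2
phi = poisson_1d([1.0] * 99, h=0.01)
```

    In a self-consistent loop, the resulting potential would feed a Schrödinger solve whose charge density is then fed back into the Poisson solve until convergence.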

  19. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  20. Characteristics and Conceptual Framework of the Easy-Play Model

    ERIC Educational Resources Information Center

    Lu, Chunlei; Steele, Kyle

    2014-01-01

    The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…

  1. A Model Framework for Course Materials Construction. Third Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model framework for course materials construction is presented as an aid to Coast Guard course writers and coordinators, curriculum developers, and instructors who must modify a course or draft a new one. The model assumes that the instructor or other designated person has: (1) completed a task analysis which identifies the competencies, skills…

  2. A software engineering perspective on environmental modeling framework design: The object modeling system

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  3. 3-D HYDRODYNAMIC MODELING IN A GEOSPATIAL FRAMEWORK

    SciTech Connect

    Bollinger, J; Alfred Garrett, A; Larry Koffman, L; David Hayes, D

    2006-08-24

    3-D hydrodynamic models are used by the Savannah River National Laboratory (SRNL) to simulate the transport of thermal and radionuclide discharges in coastal estuary systems. Development of such models requires accurate bathymetry, coastline, and boundary condition data in conjunction with the ability to rapidly discretize model domains and interpolate the required geospatial data onto the domain. To facilitate rapid and accurate hydrodynamic model development, SRNL has developed a pre- and post-processor application in a geospatial framework to automate the creation of models using existing data. This automated capability allows development of very detailed models to maximize exploitation of available surface water radionuclide sample data and thermal imagery.
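
    The interpolation of scattered geospatial data onto a model domain can be illustrated with a simple inverse-distance-weighting scheme. The sample bathymetry points below are invented; SRNL's actual pre-processor is not described at this level of detail in the abstract.

```python
# Sketch of interpolating scattered bathymetry samples onto a grid node with
# inverse-distance weighting. Sample points and depths are invented.

def idw(x, y, samples, power=2):
    """Inverse-distance-weighted depth at (x, y) from (xs, ys, depth) samples."""
    num = den = 0.0
    for xs, ys, depth in samples:
        d2 = (x - xs) ** 2 + (y - ys) ** 2
        if d2 == 0:
            return depth              # grid node coincides with a sample
        w = 1.0 / d2 ** (power / 2)   # weight falls off with distance^power
        num += w * depth
        den += w
    return num / den

bathymetry = [(0.0, 0.0, 5.0), (1.0, 0.0, 7.0), (0.0, 1.0, 6.0)]
depth_mid = idw(0.5, 0.5, bathymetry)
```

    Production tools typically use more sophisticated schemes (kriging, natural-neighbor), but the workflow is the same: discretize the domain, then interpolate each required field onto it.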

  4. A unified framework for Schelling's model of segregation

    NASA Astrophysics Data System (ADS)

    Rogers, Tim; McKane, Alan J.

    2011-07-01

    Schelling's model of segregation is one of the first and most influential models in the field of social simulation. There are many variations of the model which have been proposed and simulated over the last forty years, though the present state of the literature on the subject is somewhat fragmented and lacking comprehensive analytical treatments. In this paper a unified mathematical framework for Schelling's model and its many variants is developed. This methodology is useful in two regards: firstly, it provides a tool with which to understand the differences observed between models; secondly, phenomena which appear in several model variations may be understood in more depth through analytic studies of simpler versions.
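
    For readers unfamiliar with the model, a minimal one-dimensional variant is sketched below: an agent is unhappy if neither neighbour shares its type, and unhappy agents swap to random positions. The ring geometry, tolerance threshold, and move rule are one particular choice among the many variants the paper unifies.

```python
# A toy one-dimensional Schelling segregation sketch on a ring. An agent with
# zero like-typed neighbours is unhappy and swaps with a random agent.
# Geometry, threshold, and move rule are illustrative choices.
import random

random.seed(0)

def step(city):
    """One sweep: each unhappy agent swaps with a random position."""
    n = len(city)
    for i in range(n):
        left, right = city[(i - 1) % n], city[(i + 1) % n]
        same = (left == city[i]) + (right == city[i])
        if same < 1:                        # unhappy: no like neighbour
            j = random.randrange(n)
            city[i], city[j] = city[j], city[i]
    return city

city = [i % 2 for i in range(20)]           # fully mixed starting state
for _ in range(50):
    step(city)
```

    Even this tiny variant exhibits the model's signature behaviour: mild individual preferences reorganize a mixed configuration into like-typed clusters, and analytic treatment of such dynamics is exactly what the unified framework targets.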

  5. A Liver-Centric Multiscale Modeling Framework for Xenobiotics.

    PubMed

    Sluka, James P; Fu, Xiao; Swat, Maciej; Belmonte, Julio M; Cosmanescu, Alin; Clendenon, Sherry G; Wambaugh, John F; Glazier, James A

    2016-01-01

    We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales; Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and allows us to easily exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics. PMID:27636091
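
    The whole-body PBPK scale can be illustrated with a two-compartment absorption/elimination model. The rate constants below are illustrative, not the paper's calibrated acetaminophen values, and the real framework expresses such sub-models in SBML rather than raw code.

```python
# Two-compartment sketch (gut -> plasma -> elimination) of the kind of
# whole-body PBPK sub-model the framework uses. Rate constants are
# illustrative assumptions, not the paper's calibrated values.

def pbpk(dose=1000.0, ka=1.0, ke=0.3, dt=0.001, t_end=10.0):
    """Euler integration: gut absorbs with rate ka, plasma eliminates with ke.
    Returns remaining gut amount, final plasma amount, and peak plasma amount."""
    gut, plasma, t, peak = dose, 0.0, 0.0, 0.0
    while t < t_end:
        absorbed = ka * gut * dt
        eliminated = ke * plasma * dt
        gut -= absorbed
        plasma += absorbed - eliminated
        peak = max(peak, plasma)
        t += dt
    return gut, plasma, peak

gut, plasma, peak = pbpk()
```

    In the multiscale framework, a compartment like "liver" would itself be replaced by the tissue-level blood-flow and sub-cellular metabolism sub-models, exchanging boundary fluxes with this whole-body level.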

  8. A computational framework for a database of terrestrial biosphere models

    NASA Astrophysics Data System (ADS)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

    Most terrestrial biosphere models consist of a set of coupled first-order ordinary differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as the number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models, using the process of soil organic matter decomposition as an example. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models are fed into a database consisting of simple text files with a common structure. They are then read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, and internal and external fluxes. SymPy, a Python library for symbolic mathematics, is used to calculate the Jacobian matrix at given steady states, where available, and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots where appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
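
    The kind of structural analysis being automated can be sketched on a tiny two-pool linear model x' = Ax + u. The rates and transfer coefficient below are invented; a linear pool model's Jacobian is simply its compartmental matrix, whose eigenvalues characterize turnover.

```python
# Two-pool linear carbon model x' = A x + u: the Jacobian is A itself, and
# its eigenvalues give the turnover time scales. Rates are illustrative.

k1, k2 = 0.8, 0.1        # pool turnover rates (1/yr, assumed)
a21 = 0.3                # fraction of pool-1 turnover transferred to pool 2
u = [1.0, 0.0]           # external carbon input into pool 1

# Compartmental (Jacobian) matrix of the linear system
A = [[-k1,       0.0],
     [a21 * k1, -k2]]

# Steady state solves A x = -u; A is lower triangular, so solve directly
x1 = u[0] / k1
x2 = a21 * k1 * x1 / k2
steady_state = [x1, x2]

# Eigenvalues of a triangular matrix are its diagonal entries
eigenvalues = [A[0][0], A[1][1]]
```

    The described framework performs the same steps symbolically with SymPy for arbitrary (possibly nonlinear) pool models read from the database, where the Jacobian must be computed by differentiation rather than read off.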

  9. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    USGS Publications Warehouse

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.
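
    FUSE's combinatorial idea, building model structures by choosing one option per process component, can be sketched as below. The component names and options are illustrative stand-ins; the paper's actual decomposition of the 4 parent models yields 79 structures.

```python
# Sketch of FUSE-style combinatorial model construction: one option is chosen
# per process component, and every combination is a distinct model structure.
# Component names and options are illustrative, not the paper's actual set.
from itertools import product

components = {
    "upper_soil_layer": ["single_state", "tension_storage"],
    "lower_soil_layer": ["fixed_size", "unlimited"],
    "percolation":      ["field_capacity", "saturation_excess"],
    "surface_runoff":   ["arno", "topmodel", "prms"],
}

# Each structure is a mapping from component to the chosen option
structures = [dict(zip(components, choice))
              for choice in product(*components.values())]
n_structures = len(structures)   # 2 * 2 * 2 * 3 = 24 here
```

    Running each generated structure against the same forcing data is what lets the method attribute performance differences to specific structural choices rather than to parameters.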

  10. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 4.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for grade four language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print…

  11. Language Arts Curriculum Framework: Sample Curriculum Model, Grade 3.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock.

    Based on the 1998 Arkansas State Language Arts Framework, this sample curriculum model for grade three language arts is divided into sections focusing on writing; listening, speaking, and viewing; and reading. Each section lists standards; benchmarks; assessments; and strategies/activities. The reading section itself is divided into print…

  12. The BMW Model: A New Framework for Teaching Monetary Economics

    ERIC Educational Resources Information Center

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  13. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    ERIC Educational Resources Information Center

    Redish, Edward F.

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the…

  14. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  15. Theoretical Models and Operational Frameworks in Public Health Ethics

    PubMed Central

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  17. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    NASA Astrophysics Data System (ADS)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models assume that a geo-referenced variable depends primarily on its neighborhood (the Markov property), with the spatial dependence structure described by a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count, or continuous) can be accounted for through appropriate link functions. A fast alternative to the MCMC framework, the Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
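
    The Markov property and the sparse precision matrix can be made concrete on the simplest GMRF, a first-order chain. The sketch below builds the precision matrix of a random-walk GMRF; the conditional precision value is an assumed placeholder.

```python
# Sketch of the Markov property in a GMRF: for a first-order chain graph the
# precision matrix Q is tridiagonal, so each variable depends only on its
# immediate neighbours. kappa (conditional precision) is an assumed value.

def chain_precision(n, kappa=1.0):
    """Precision matrix of a first-order random-walk GMRF on n nodes."""
    Q = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        # each increment (x[i+1] - x[i]) contributes kappa to the precision
        Q[i][i] += kappa
        Q[i + 1][i + 1] += kappa
        Q[i][i + 1] -= kappa
        Q[i + 1][i] -= kappa
    return Q

Q = chain_precision(5)
# Zero entries between non-neighbours encode conditional independence
sparsity = [[int(Q[i][j] != 0) for j in range(5)] for i in range(5)]
```

    It is this sparsity that makes GMRF computations fast relative to dense Gaussian-field covariances, and that INLA exploits for inference in place of MCMC.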

  18. An enhanced BSIM modeling framework for self-heating aware circuit design

    NASA Astrophysics Data System (ADS)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.
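
    The thermal-equivalent RC idea reduces, in its simplest form, to a one-pole network: dissipated power flows through a thermal resistance Rth into a thermal capacitance Cth. The values below are illustrative (chosen so 1 W of dissipation reproduces a rise of roughly 40 °C, as in the abstract), not foundry-extracted data.

```python
# One-pole thermal-equivalent RC sketch: temperature rise over ambient of a
# device dissipating power P. Rth and Cth values are illustrative assumptions.
import math

def temp_rise(P, Rth, Cth, t):
    """Temperature rise (K) over ambient at time t for a single thermal RC."""
    tau = Rth * Cth                       # thermal time constant
    return P * Rth * (1.0 - math.exp(-t / tau))

P = 1.0       # ~1 W dissipated (assumption; +30 dBm is 1 W of OUTPUT power)
Rth = 40.0    # K/W, picked so the steady-state rise is ~40 C
Cth = 1e-3    # J/K (assumed)

steady = temp_rise(P, Rth, Cth, t=1.0)    # t >> tau = 40 ms, so near steady state
```

    In the proposed framework, such R/C elements form the network attached to the thermal nodes that the Verilog-A wrapper exposes, so the circuit and thermal dynamics are solved together.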

  19. Possibilities: A framework for modeling students' deductive reasoning in physics

    NASA Astrophysics Data System (ADS)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  20. A modeling framework for system restoration from cascading failures.

    PubMed

    Liu, Chaoran; Li, Daqing; Zio, Enrico; Kang, Rui

    2014-01-01

    System restoration from cascading failures is an integral part of the overall defense against catastrophic breakdown in networked critical infrastructures. From the outbreak of cascading failures to the system complete breakdown, actions can be taken to prevent failure propagation through the entire network. While most analysis efforts have been carried out before or after cascading failures, restoration during cascading failures has been rarely studied. In this paper, we present a modeling framework to investigate the effects of in-process restoration, which depends strongly on the timing and strength of the restoration actions. Furthermore, in the model we also consider additional disturbances to the system due to restoration actions themselves. We demonstrate that the effect of restoration is also influenced by the combination of system loading level and restoration disturbance. Our modeling framework will help to provide insights on practical restoration from cascading failures and guide improvements of reliability and resilience of actual network systems.
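
    The interplay of failure propagation and in-process restoration can be sketched with a toy load-capacity cascade. The network, redistribution rule, and restoration rule below are deliberate simplifications invented for illustration; the paper's model is richer (it also includes restoration-induced disturbances).

```python
# Toy cascade with in-process restoration: each failed node's unit load is
# shared by survivors; overload triggers another failure; then a fixed number
# of failed nodes is repaired each round. All rules/parameters are invented.

def cascade(n=20, capacity=1.05, restore_per_round=1, rounds=10):
    """Return the number of failed nodes after `rounds` propagation rounds."""
    failed = {0}                             # initial failure
    for _ in range(rounds):
        alive = n - len(failed)
        if alive == 0:
            break
        load = 1.0 + len(failed) / alive     # global load redistribution
        if load > capacity:
            # one more node fails (deterministic stand-in for overload)
            failed.add(max(set(range(n)) - failed))
        # in-process restoration repairs some failed nodes each round
        for node in sorted(failed)[:restore_per_round]:
            failed.discard(node)
    return len(failed)

survivors_with_repair = 20 - cascade(restore_per_round=1)
survivors_no_repair = 20 - cascade(restore_per_round=0)
```

    Even in this caricature, acting during the cascade (rather than only before or after) changes the outcome, which is the regime the paper analyzes with respect to restoration timing and strength.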

  1. Modeling QCD for Hadron Physics

    SciTech Connect

    Tandy, P. C.

    2011-10-24

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons, in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective, in which the quark condensate is contained within hadrons rather than the vacuum, is mentioned. The valence quark parton distributions in the pion and kaon, as measured in the Drell-Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  2. Modeling QCD for Hadron Physics

    NASA Astrophysics Data System (ADS)

    Tandy, P. C.

    2011-10-01

    We review the approach to modeling soft hadron physics observables based on the Dyson-Schwinger equations of QCD. The focus is on light quark mesons, in particular the pseudoscalar and vector ground states, their decays and electromagnetic couplings. We detail the wide variety of observables that can be correlated by a ladder-rainbow kernel with one infrared parameter fixed to the chiral quark condensate. A recently proposed novel perspective, in which the quark condensate is contained within hadrons rather than the vacuum, is mentioned. The valence quark parton distributions in the pion and kaon, as measured in the Drell-Yan process, are investigated with the same ladder-rainbow truncation of the Dyson-Schwinger and Bethe-Salpeter equations.

  3. Open source data assimilation framework for hydrological modeling

    NASA Astrophysics Data System (ADS)

    Ridler, Marc; Hummel, Stef; van Velzen, Nils; Katrine Falk, Anne; Madsen, Henrik

    2013-04-01

    An open-source data assimilation framework is proposed for hydrological modeling. Data assimilation (DA) in hydrodynamic and hydrological forecasting systems has great potential to improve predictions. The basic principle is to incorporate measurement information into a model with the aim of improving model results through error minimization. Great strides have been made to assimilate traditional in-situ measurements such as discharge, soil moisture, hydraulic head and snowpack into hydrologic models. More recently, remotely sensed retrievals of soil moisture, snow water equivalent or snow cover area, surface water elevation, terrestrial water storage and land surface temperature have been successfully assimilated into hydrological models. Assimilation algorithms have become increasingly sophisticated to manage measurement and model bias, non-linear systems, data sparsity (in time and space) and undetermined system uncertainty. It is therefore useful to use a pre-existing DA toolbox such as OpenDA. OpenDA is an open interface standard for (and free implementation of) a set of tools to quickly implement DA and calibration for arbitrary numerical models. The basic design philosophy of OpenDA is to break down DA into a set of building blocks programmed in object-oriented languages. To implement DA, a model must interact with OpenDA to create model instances, propagate the model, get/set variables (or parameters) and free the model once DA is completed. An open-source interface for hydrological models exists that is capable of all these tasks: OpenMI. OpenMI is an open standard interface already adopted by key hydrological model providers. It defines a universal approach for interacting with hydrological models during simulation to exchange data at runtime, thus facilitating interactions between models and data sources. The interface is flexible enough that models can interact even if coded in different languages, represent
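To make the "building blocks" idea concrete, here is a minimal scalar Ensemble Kalman Filter analysis step, one of the algorithms such toolboxes provide (a self-contained sketch that does not use the OpenDA or OpenMI APIs; the soil-moisture numbers are hypothetical):

```python
# Minimal sketch of one Ensemble Kalman Filter analysis step for a scalar
# state (e.g. soil moisture). Illustrative only; not the OpenDA API.
import random

random.seed(0)

def enkf_update(ensemble, obs, obs_var):
    """Nudge each ensemble member toward a perturbed observation by the
    Kalman gain (perturbed-observation EnKF for a scalar state)."""
    mean = sum(ensemble) / len(ensemble)
    var = sum((x - mean) ** 2 for x in ensemble) / (len(ensemble) - 1)
    gain = var / (var + obs_var)                   # Kalman gain K = P/(P+R)
    return [x + gain * (obs + random.gauss(0, obs_var ** 0.5) - x)
            for x in ensemble]

prior = [random.gauss(0.30, 0.05) for _ in range(100)]    # modelled moisture
posterior = enkf_update(prior, obs=0.40, obs_var=0.0004)  # in-situ measurement
print(round(sum(prior) / 100, 3), round(sum(posterior) / 100, 3))
```

After the update the ensemble mean sits much closer to the observation, while the spread reflects the blended model and measurement uncertainty.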

  4. A VGI data integration framework based on linked data model

    NASA Astrophysics Data System (ADS)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses geographic data integration and sharing for multiple online VGI data sets. We propose a semantic-enabled framework for a cooperative application environment spanning online VGI sources, aimed at a target class of geospatial problems. Based on linked data technologies, a core component of the semantic web, we construct relationship links among geographic features distributed across diverse VGI platforms using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports cooperative geospatial applications across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a uniform data representation model across the different online social geographic data sources. We propose a mixed strategy that combines spatial-distance similarity and feature-name attribute similarity as the measure for comparing and matching geographic features across VGI data sets. Our work also focuses on applying Markov logic networks to interlink records for the same entity across different VGI-based linked data sets. An automatic method for generating the co-reference object identification model from geographic linked data is discussed in detail. The result is a large geographic linked data network spanning loosely coupled VGI web sites. Experiments built on the framework and an evaluation of the method show that the framework is sound and practicable.
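The "mixed strategy" for feature matching can be sketched as a weighted blend of spatial proximity and name similarity (the weights, distance scale, and example features below are illustrative assumptions, not values from the paper):

```python
# Hedged sketch of a mixed matching strategy: combine spatial-distance
# similarity with name-string similarity to score candidate feature
# matches across two VGI data sets. All parameters are illustrative.
import math
from difflib import SequenceMatcher

def match_score(feat_a, feat_b, w_space=0.5, scale_m=100.0):
    """Weighted blend of spatial proximity and name similarity, in [0, 1]."""
    (xa, ya, name_a), (xb, yb, name_b) = feat_a, feat_b
    dist = math.hypot(xa - xb, ya - yb)            # planar distance, metres
    s_space = math.exp(-dist / scale_m)            # 1 when co-located
    s_name = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return w_space * s_space + (1 - w_space) * s_name

# Hypothetical features from two different VGI platforms.
osm = (120.0, 85.0, "Central Park Cafe")
wikimapia = (128.0, 90.0, "central park café")
other = (950.0, 40.0, "Riverside Diner")
print(match_score(osm, wikimapia) > match_score(osm, other))  # → True
```

A threshold on this score would decide which cross-platform feature pairs become candidate `owl:sameAs`-style links for the later Markov-logic interlinking stage.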

  5. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology's advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented, along with an overview of EXAMINE's functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE's capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  6. An Intercomparison of 2-D Models Within a Common Framework

    NASA Technical Reports Server (NTRS)

    Weisenstein, Debra K.; Ko, Malcolm K. W.; Scott, Courtney J.; Jackman, Charles H.; Fleming, Eric L.; Considine, David B.; Kinnison, Douglas E.; Connell, Peter S.; Rotman, Douglas A.; Bhartia, P. K. (Technical Monitor)

    2002-01-01

    A model intercomparison among the Atmospheric and Environmental Research (AER) 2-D model, the Goddard Space Flight Center (GSFC) 2-D model, and the Lawrence Livermore National Laboratory (LLNL) 2-D model allows us to separate differences due to model transport from those due to the models' chemical formulation. This is accomplished by constructing two hybrid models incorporating the transport parameters of the GSFC and LLNL models within the AER model framework. By comparing the results from the native models (AER and, e.g., GSFC) with those from the hybrid model (e.g., AER chemistry with GSFC transport), differences due to chemistry and transport can be identified. For the analysis, we examined an inert tracer whose emission pattern is based on emission from a High Speed Civil Transport (HSCT) fleet; distributions of trace species in the 2015 atmosphere; and the response of stratospheric ozone to an HSCT fleet. Differences in NO(y) in the upper stratosphere are found between models with identical transport, implying different model representations of atmospheric chemical processes. The response of O3 concentration to HSCT aircraft emissions differs in the models, both from transport-dominated differences in the HSCT-induced perturbations of H2O and NO(y) and from differences in the model representations of O3 chemical processes. The model formulations of cold polar processes are found to be the most significant factor in creating large differences in the calculated ozone perturbations.

  7. The Michigan Space Weather Modeling Framework (SWMF) Graphical User Interface

    NASA Astrophysics Data System (ADS)

    de Zeeuw, D.; Gombosi, T.; Toth, G.; Ridley, A.

    2007-05-01

    The Michigan Space Weather Modeling Framework (SWMF) is a powerful tool available for the community that has been used to model from the Sun to Earth and beyond. As a research tool, however, it still requires user experience with parallel compute clusters and visualization tools. Thus, we have developed a graphical user interface (GUI) that assists with configuring, compiling, and running the SWMF, as well as visualizing the model output. This is accomplished through a portable web interface. Live examples will be demonstrated and visualization of several archived events will be shown.

  8. An Integrated Modeling Framework for Probable Maximum Precipitation and Flood

    NASA Astrophysics Data System (ADS)

    Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.

    2015-12-01

    With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the effects of climate change on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, the Distributed Hydrology-Soil-Vegetation Model, implemented at 90 m resolution and calibrated against U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF under projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects of PMP and PMF.
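Moisture maximization itself is simple arithmetic: scale the observed storm precipitation by the ratio of the climatological maximum precipitable water to the storm's actual precipitable water (a textbook sketch with hypothetical numbers, not output of the WRF workflow described above):

```python
# Illustrative arithmetic only: moisture maximization scales an observed
# storm's precipitation by the ratio of the climatological-maximum
# precipitable water to the storm's actual precipitable water.
def moisture_maximize(storm_precip_mm, storm_pw_mm, max_pw_mm):
    ratio = max_pw_mm / storm_pw_mm
    return storm_precip_mm * ratio

# Hypothetical storm: 250 mm of rain with 40 mm precipitable water,
# against a climatological maximum of 60 mm.
pmp_estimate = moisture_maximize(250.0, 40.0, 60.0)
print(pmp_estimate)  # → 375.0
```

The WRF-based approach in the abstract effectively performs this scaling physically, by increasing moisture at the initial and lateral boundaries and re-running the storm.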

  9. A Structural Model Decomposition Framework for Systems Health Management

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
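The core idea of structural decomposition, extracting the minimal submodel that supports a given variable, can be sketched as reachability over a variable dependency graph (an illustrative reduction, not the paper's algorithm; the toy three-tank-style equations are invented):

```python
# Sketch of structural model decomposition: given which variables each
# equation computes and which it uses, extract the minimal submodel
# needed to estimate one output. Illustrative, not the paper's algorithm.
def extract_submodel(equations, target):
    """equations: {computed_var: set_of_input_vars}. Returns the set of
    variables (and hence equations) the target depends on."""
    needed, stack = set(), [target]
    while stack:
        v = stack.pop()
        if v in needed:
            continue
        needed.add(v)
        for dep in equations.get(v, ()):           # inputs feeding v
            stack.append(dep)
    return needed

# Toy three-tank-style model: h3 depends on h2, which depends on h1;
# "aux" is unrelated to h3 and is correctly excluded.
eqs = {"h1": {"u"}, "h2": {"h1"}, "h3": {"h2"}, "aux": {"h1"}}
print(sorted(extract_submodel(eqs, "h3")))  # → ['h1', 'h2', 'h3', 'u']
```

Running estimation or EOL prediction on such submodels, instead of the global monolithic model, is what yields the scalability gains the abstract describes.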

  10. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and did not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  11. A framework for modeling contaminant impacts on reservoir water quality

    NASA Astrophysics Data System (ADS)

    Jeznach, Lillian C.; Jones, Christina; Matthews, Thomas; Tobiason, John E.; Ahlfeld, David P.

    2016-06-01

    This study presents a framework for using hydrodynamic and water quality models to understand the fate and transport of potential contaminants in a reservoir and to develop appropriate emergency response and remedial actions. In the event of an emergency situation, prior detailed modeling efforts and scenario evaluations allow for an understanding of contaminant plume behavior, including maximum concentrations that could occur at the drinking water intake and contaminant travel time to the intake. A case study assessment of the Wachusett Reservoir, a major drinking water supply for metropolitan Boston, MA, provides an example of an application of the framework and of how hydrodynamic and water quality models can be used to quantitatively and scientifically guide management in response to a variety of contaminant scenarios. The model CE-QUAL-W2 was used to investigate the water quality impacts of several hypothetical contaminant scenarios, including hypothetical fecal coliform input from a sewage overflow as well as an accidental railway spill of ammonium nitrate. Scenarios investigated the impacts of decay rates, season, and inter-reservoir transfers on contaminant arrival times and concentrations at the drinking water intake. The modeling study highlights the importance of a rapid operational response by managers to contain a contaminant spill in order to minimize the mass of contaminant that enters the water column, based on modeled reservoir hydrodynamics. The development and use of hydrodynamic and water quality models for surface drinking water sources subject to the potential for contaminant entry can provide valuable guidance for making decisions about emergency response and remediation actions.
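The scenario arithmetic behind such contaminant studies can be illustrated with first-order decay during transport (a back-of-envelope sketch with hypothetical numbers; a real assessment would use the calibrated CE-QUAL-W2 model described above):

```python
# Back-of-envelope sketch of the kind of scenario question such a model
# answers: first-order decay of a contaminant during transport to the
# intake. The rate and travel time are hypothetical, not from the study.
import math

def conc_at_intake(c0, decay_per_day, travel_days):
    """Concentration remaining after first-order decay: c0 * exp(-k*t)."""
    return c0 * math.exp(-decay_per_day * travel_days)

# Spill diluted to 100 units/L, decay rate 0.5/day, 10-day travel time.
print(round(conc_at_intake(100.0, 0.5, 10.0), 3))  # → 0.674
```

This also shows why the abstract stresses rapid containment: cutting the mass that enters the water column reduces `c0` directly, while decay and travel time only attenuate what is already in transit.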

  12. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  13. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

    Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we describe the elements and the integration plan. Each element of this framework is modular, so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of FRTM are available to simulate the brightness temperature of snow, two models are available to model the physical evolution of the snowpack and underlying soil, and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models, physical or statistical, can be accommodated too. All modules will operate within the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.

  14. A constitutive model for magnetostriction based on thermodynamic framework

    NASA Astrophysics Data System (ADS)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.

  15. Velo: A Knowledge Management Framework for Modeling and Simulation

    SciTech Connect

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework, designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  16. The ontology model of FrontCRM framework

    NASA Astrophysics Data System (ADS)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis lies on applying a customer-centric philosophy and culture throughout the organization. CRM must begin at the level of business strategy, the only level at which thorough organizational change is possible. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM is developed as a framework to guide the identification of business processes related to CRM, based on the concept of a strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area, and to identify CRM software features related to those practices.

  17. A General Framework for Multiphysics Modeling Based on Numerical Averaging

    NASA Astrophysics Data System (ADS)

    Lunati, I.; Tomin, P.

    2014-12-01

    In recent years, multiphysics (hybrid) modeling has attracted increasing attention as a tool to bridge the gap between pore-scale processes and a continuum description at the meter scale (laboratory scale). This approach is particularly appealing for complex nonlinear processes, such as multiphase flow, reactive transport, density-driven instabilities, and geomechanical coupling. We present a general framework that can be applied to all these classes of problems. The method is based on ideas from the Multiscale Finite-Volume method (MsFV), which was originally developed for Darcy-scale applications. Recently, we have reformulated MsFV starting with a local-global splitting, which allows us to retain the original degree of coupling for the local problems and to use spatiotemporal adaptive strategies. The new framework is based on the simple idea that different characteristic temporal scales are inherited from different spatial scales, so the global and the local problems are solved with different temporal resolutions. The global (coarse-scale) problem is constructed based on a numerical volume-averaging paradigm, and a continuum (Darcy-scale) description is obtained by introducing additional simplifications (e.g., by assuming that pressure is the only independent variable at the coarse scale, we recover an extended Darcy's law). We demonstrate that it is possible to adaptively and dynamically couple the Darcy-scale and the pore-scale descriptions of multiphase flow in a single conceptual and computational framework. Pore-scale problems are solved only in the active front region where the fluid distribution changes with time. In the rest of the domain, only a coarse description is employed. This framework can be applied to other important problems such as reactive transport and crack propagation. As it is based on a numerical upscaling paradigm, our method can be used to explore the limits of validity of macroscopic models and to illuminate the meaning of

  18. PyCatch: catchment modelling in the PCRaster framework

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Lana-Renault, Noemí; Schmitz, Oliver

    2015-04-01

    PCRaster is an open source software framework for the construction and execution of stochastic, spatio-temporal, forward models. It provides a large number of spatial operations on raster maps, with an emphasis on operations capable of transporting material (water, sediment) over a drainage network. These operations are written in C++ and provided to the model builder as Python functions. Models are constructed by combining these functions in a Python script. To ease the implementation of models that use time steps and Monte Carlo iterations, the software comes with a Python framework providing control flow for temporal modelling and Monte Carlo simulation, including options for Bayesian data assimilation (Ensemble Kalman Filter, Particle Filter). A sophisticated visualization tool is provided, capable of visualizing, animating, and exploring stochastic, spatio-temporal input or model output data. PCRaster is used for the construction of, for instance, hydrological models (hillslope to global scale), land use change models, and geomorphological models. It is still being improved upon, for instance by adding under-the-hood functionality for executing models on multiple CPU cores, and by adding components for agent-based and network simulation. The software runs on MS Windows and Linux and is available at http://www.pcraster.eu. We provide an extensive set of online course materials (partly available free of charge). Using the PCRaster software framework, we recently developed the PyCatch model components for hydrological modelling and land degradation modelling at the catchment scale. The PyCatch components run at time steps of seconds to weeks and grid cell sizes of approximately 1-100 m, which can be selected depending on the case study for which PyCatch is used. Hydrological components currently implemented include classes for simulation of incoming solar radiation, evapotranspiration (Penman-Monteith), surface storage, infiltration (Green and Ampt
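The control flow such a framework automates can be sketched in plain Python (deliberately not the PCRaster API; the toy storage model and all its parameters are invented for illustration):

```python
# Generic sketch of temporal-plus-Monte-Carlo control flow, the pattern a
# framework like this provides. Plain Python, NOT the PCRaster API; the
# toy storage model and its parameters are invented.
import random

random.seed(42)

def run_model(n_steps, storage0, recession):
    """Toy forward model: linear storage recession with noisy rainfall."""
    storage = storage0
    for _ in range(n_steps):
        rain = max(0.0, random.gauss(2.0, 1.0))   # stochastic forcing
        storage = storage * (1 - recession) + rain
    return storage

# Monte Carlo loop: repeat the forward run, then summarize the ensemble.
ensemble = [run_model(n_steps=50, storage0=10.0, recession=0.2)
            for _ in range(200)]
mean = sum(ensemble) / len(ensemble)
print(5.0 < mean < 15.0)  # → True (steady state near rain_mean/recession)
```

In PCRaster the state would be a raster map and the update a spatial operation over the drainage network, but the nesting of a temporal loop inside a Monte Carlo loop is the same.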

  19. Modeling air pollution in the Tracking and Analysis Framework (TAF)

    SciTech Connect

    Shannon, J.D.

    1998-12-31

    The Tracking and Analysis Framework (TAF) is a set of interactive computer models for integrated assessment of the Acid Rain Provisions (Title IV) of the 1990 Clean Air Act Amendments. TAF is designed to execute in minutes on a personal computer, thereby making it feasible for a researcher or policy analyst to examine quickly the effects of alternate modeling assumptions or policy scenarios. Because the development of TAF involves researchers in many different disciplines, TAF has been given a modular structure. In most cases, the modules contain reduced-form models that are based on more complete models exercised off-line. The structure of TAF as of December 1996 is shown. Both the Atmospheric Pathways Module produce estimates for regional air pollution variables.

  20. Modelling multimedia teleservices with OSI upper layers framework: Short paper

    NASA Astrophysics Data System (ADS)

    Widya, I.; Vanrijssen, E.; Michiels, E.

    The paper presents the use of the concepts and modelling principles of the Open Systems Interconnection (OSI) upper layers structure in the modelling of multimedia teleservices, with emphasis on the revised Application Layer Structure (OSI/ALS). OSI/ALS is an object-based reference model intended to coordinate the development of application-oriented services and protocols in a consistent and modular way, enabling the rapid deployment and integrated use of these services. The paper further emphasizes the nesting structure defined in OSI/ALS, which allows the design of scalable and user-tailorable/controllable teleservices. OSI/ALS-consistent teleservices are, moreover, implementable on communication platforms of different capabilities. An analysis of distributed multimedia architectures found in the literature confirms the ability of the OSI/ALS framework to model the interworking functionalities of teleservices.

  1. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.
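Residual-based diagnosis, the final step the abstract mentions, reduces to comparing measurements against submodel predictions (a generic sketch, not the paper's reconfiguration algorithm; the voltages and threshold are hypothetical):

```python
# Minimal sketch of residual-based fault detection (generic, not the
# paper's hybrid-systems algorithm): compare measurements against a
# submodel's predictions and flag residuals exceeding a threshold.
def detect_faults(measured, predicted, threshold=0.1):
    """Return one boolean fault flag per sensor channel."""
    residuals = [m - p for m, p in zip(measured, predicted)]
    return [abs(r) > threshold for r in residuals]

# Hypothetical circuit voltages: the third sensor disagrees with the model.
flags = detect_faults([5.0, 3.3, 2.1], [5.02, 3.28, 1.60])
print(flags)  # → [False, False, True]
```

In the hybrid setting of the paper, the `predicted` values would come from the mode-specific submodels, which is why efficient submodel reconfiguration at mode changes matters.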

  2. Flexible Modeling of Epidemics with an Empirical Bayes Framework

    PubMed Central

    Brooks, Logan C.; Farrow, David C.; Hyun, Sangwon; Tibshirani, Ryan J.; Rosenfeld, Roni

    2015-01-01

    Seasonal influenza epidemics cause consistent, considerable, widespread loss annually in terms of economic burden, morbidity, and mortality. With access to accurate and reliable forecasts of a current or upcoming influenza epidemic’s behavior, policy makers can design and implement more effective countermeasures. This past year, the Centers for Disease Control and Prevention hosted the “Predict the Influenza Season Challenge”, with the task of predicting key epidemiological measures for the 2013–2014 U.S. influenza season with the help of digital surveillance data. We developed a framework for in-season forecasts of epidemics using a semiparametric Empirical Bayes framework, and applied it to predict the weekly percentage of outpatient doctor visits for influenza-like illness, and the season onset, duration, peak time, and peak height, with and without using Google Flu Trends data. Previous work on epidemic modeling has focused on developing mechanistic models of disease behavior and applying time series tools to explain historical data. However, tailoring these models to certain types of surveillance data can be challenging, and overly complex models with many parameters can compromise forecasting ability. Our approach instead produces possibilities for the epidemic curve of the season of interest using modified versions of data from previous seasons, allowing for reasonable variations in the timing, pace, and intensity of the seasonal epidemics, as well as noise in observations. Since the framework does not make strict domain-specific assumptions, it can easily be applied to other diseases with seasonal epidemics. This method produces a complete posterior distribution over epidemic curves, rather than, for example, solely point predictions of forecasting targets. We report prospective influenza-like-illness forecasts made for the 2013–2014 U.S. influenza season, and compare the framework’s cross-validated prediction error on historical data to
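The core idea, perturbing historical seasons to form a prior over this season's curve, can be sketched as follows (an illustrative reduction; the perturbation ranges and toy curves are assumptions, not the paper's settings):

```python
# Sketch of an empirical-Bayes prior over epidemic curves: sample
# plausible curves by randomly rescaling and time-shifting curves from
# past seasons. Perturbation ranges and toy data are illustrative only.
import random

random.seed(1)

def curve_prior(past_seasons, n=1000):
    """Sample n plausible epidemic curves from perturbed historical ones."""
    samples = []
    for _ in range(n):
        base = random.choice(past_seasons)
        pace = random.uniform(0.8, 1.2)            # intensity rescaling
        shift = random.randint(-2, 2)              # timing shift (weeks)
        curve = [pace * base[min(max(i - shift, 0), len(base) - 1)]
                 for i in range(len(base))]
        samples.append(curve)
    return samples

# Two toy past seasons of weekly illness intensity.
past = [[1, 2, 5, 9, 6, 3, 1], [1, 3, 7, 8, 4, 2, 1]]
prior = curve_prior(past)
peaks = [max(c) for c in prior]
print(min(peaks) >= 0.8 * 8, max(peaks) <= 1.2 * 9)  # → True True
```

Conditioning this prior on the weeks observed so far (e.g., reweighting samples by their fit to in-season data) would yield the posterior over full-season curves, from which onset, peak time, and peak height distributions can be read off.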

  3. A visual interface for the SUPERFLEX hydrological modelling framework

    NASA Astrophysics Data System (ADS)

    Gao, H.; Fenicia, F.; Kavetski, D.; Savenije, H. H. G.

    2012-04-01

    The SUPERFLEX framework is a modular modelling system for conceptual hydrological modelling at the catchment scale. This work reports the development of a visual interface for the SUPERFLEX model, which aims to enhance communication between hydrologic experimentalists and modelers, in particular to further bridge the gap between soft field data and the modeler's knowledge. In collaboration with field experimentalists, modelers can visually and intuitively hypothesize different model architectures and combinations of reservoirs, select from a library of constitutive functions to describe the relationship between a reservoir's storage and discharge, specify the shape of lag functions and, finally, set parameter values. The software helps hydrologists take advantage of any existing insights into the study site, translate them into a conceptual hydrological model and implement that model within a computationally robust algorithm. This tool also helps challenge and contrast competing paradigms such as the "uniqueness of place" vs. "one model fits all". Using this interface, hydrologists can test different hypotheses and model representations, and build a progressively deeper understanding of the watershed of interest.
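    As a rough picture of the building blocks such an interface exposes, reservoirs with pluggable storage-discharge constitutive functions, consider the following sketch. It is hypothetical and not part of SUPERFLEX itself; all names and parameter values are invented:

```python
def linear_outflow(storage, k=0.2):
    return k * storage            # Q = k*S, the simplest constitutive choice

def power_outflow(storage, k=0.05, beta=1.5):
    return k * storage ** beta    # a nonlinear alternative from the "library"

class Reservoir:
    def __init__(self, outflow_fn, storage=0.0):
        self.outflow_fn = outflow_fn
        self.storage = storage

    def step(self, inflow, dt=1.0):
        # explicit-Euler water balance: dS/dt = inflow - Q(S)
        q = self.outflow_fn(self.storage)
        self.storage += (inflow - q) * dt
        return q

def run_series(reservoirs, rainfall):
    # Route a rainfall series through reservoirs arranged in series.
    flows = []
    for p in rainfall:
        for r in reservoirs:
            p = r.step(p)
        flows.append(p)
    return flows
```

Swapping `linear_outflow` for `power_outflow`, or rearranging the reservoirs, corresponds to hypothesizing a different model architecture in the interface.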

  4. Archetype Model-Driven Development Framework for EHR Web System

    PubMed Central

    Kimura, Eizen; Ishihara, Ken

    2013-01-01

    Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce construction costs for EHR systems. Methods The openEHR project has developed a clinical model-driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have been developed as references. Although scripting languages have become more popular in recent years because of their higher development efficiency, they had not been used in the openEHR implementations. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. Results We implemented almost all of the specifications, an Archetype Definition Language parser, and an RoR scaffold generator from archetypes. Although some problems have emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework, which can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991

  5. Service-Oriented Approach to Coupling Earth System Models and Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Saint, K. D.; Ercan, M. B.; Briley, L. J.; Murphy, S.; You, H.; DeLuca, C.; Rood, R. B.

    2012-12-01

    Modeling water systems often requires coupling models across traditional Earth science disciplinary boundaries. While there has been significant effort within various Earth science disciplines (e.g., atmospheric science, hydrology, and Earth surface dynamics) to create models and, more recently, modeling frameworks, there has been less work on methods for coupling across discipline-specific models and modeling frameworks. We present work investigating one possible method for coupling across discipline-specific Earth system models and modeling frameworks: service-oriented architectures. In a service-oriented architecture, models act as distinct units or components within a system and are designed to pass well-defined messages to consumers of the service. While the approach offers the potential to couple heterogeneous computational models by allowing a high degree of autonomy across models of the Earth system, there are significant scientific and technical challenges to be addressed when coupling models designed for different communities and built for different modeling frameworks. We have addressed some of these challenges through a case study in which we coupled a hydrologic model compliant with the OpenMI standard with an atmospheric model compliant with the ESMF standard. In this case study, the two models were coupled through data exchanges of boundary conditions enabled by exposing the atmospheric model as a web service. A discussion of the technical and scientific challenges, some of which we have addressed and others that remain open, will be presented, including differences in computer architectures, data semantics, and spatial scales between the coupled models.
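    The coupling pattern described here, a uniform service interface through which models exchange boundary conditions each time step, might be sketched as below. This is an illustrative stand-in, not the actual OpenMI or ESMF API; all class and method names are invented:

```python
class ModelService:
    """Minimal stand-in for a web-service facade over a model (names invented)."""

    def __init__(self, name, update_fn, state):
        self.name = name
        self.update_fn = update_fn
        self.state = state

    def get_values(self):
        # Analogous in spirit to an OpenMI GetValues call.
        return self.state

    def update(self, boundary):
        # Advance one time step, driven by an external boundary condition.
        self.state = self.update_fn(self.state, boundary)

def couple(atmosphere, hydrology, n_steps):
    # One-way coupling: the atmospheric service supplies its output as the
    # hydrologic model's boundary condition at each exchange.
    series = []
    for _ in range(n_steps):
        atmosphere.update(boundary=None)
        hydrology.update(boundary=atmosphere.get_values())
        series.append(hydrology.get_values())
    return series
```

In the case study, the `atmosphere.get_values()` step would be an HTTP call to the web-service-wrapped atmospheric model rather than an in-process method call.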

  6. LQCD workflow execution framework: Models, provenance and fault-tolerance

    NASA Astrophysics Data System (ADS)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, especially when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as a participant. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete, data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic; this enables a dynamic management design that reduces the manual administrative workload and increases cluster productivity.
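    A reflex engine of the kind described, a state machine that reacts to monitoring events with mitigation actions, could look roughly like this sketch. The states, events, and actions are invented for illustration and are not the paper's actual rule set:

```python
class ReflexEngine:
    """A tiny fault-mitigation state machine (states and events invented)."""

    def __init__(self):
        self.state = "NOMINAL"
        self.actions = []          # mitigation actions initiated so far

    def observe(self, event):
        # Transition rules: escalate on repeated failures, recover on success.
        if self.state == "NOMINAL" and event == "health_check_failed":
            self.state = "DEGRADED"
            self.actions.append("rerun_participant")
        elif self.state == "DEGRADED" and event == "health_check_failed":
            self.state = "FAULTED"
            self.actions.append("migrate_to_spare_node")
        elif event == "health_check_ok":
            self.state = "NOMINAL"
        return self.state
```

In the paper's architecture, many such engines form a hierarchy, with each engine summarizing its state upward so that node-level faults can trigger workflow-level mitigation.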

  7. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  8. A Robust Control Design Framework for Substructure Models

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    1994-01-01

    A framework for designing control systems directly from substructure models and uncertainties is proposed. The technique is based on combining a set of substructure robust control problems by an interface stiffness matrix which appears as a constant gain feedback. Variations of uncertainties in the interface stiffness are treated as a parametric uncertainty. It is shown that multivariable robust control can be applied to generate centralized or decentralized controllers that guarantee performance with respect to uncertainties in the interface stiffness, reduced component modes and external disturbances. The technique is particularly suited for large, complex, and weakly coupled flexible structures.

  9. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, which makes them unsuitable for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving, and SMP systems are increasingly common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulations (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
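    The Monte Carlo lattice update at the heart of the CPM can be sketched sequentially as below (the paper parallelizes this step with OpenMP). This hypothetical sketch uses only a contact-energy term; a full CPM Hamiltonian would also include volume and other constraints:

```python
import math
import random

def contact_energy(lattice, J=1.0):
    # Sum adhesion penalty J over unlike nearest-neighbour pairs on a periodic
    # square lattice; each pair is counted once via its right and down neighbours.
    n = len(lattice)
    e = 0.0
    for i in range(n):
        for j in range(n):
            for di, dj in ((0, 1), (1, 0)):
                ni, nj = (i + di) % n, (j + dj) % n
                if lattice[i][j] != lattice[ni][nj]:
                    e += J
    return e

def metropolis_step(lattice, temperature=1.0, rng=None):
    # One attempted spin copy: a site tries to adopt a random neighbour's cell ID,
    # accepted with the Metropolis probability min(1, exp(-dE/T)).
    rng = rng or random.Random()
    n = len(lattice)
    i, j = rng.randrange(n), rng.randrange(n)
    ni = (i + rng.choice((-1, 0, 1))) % n
    nj = (j + rng.choice((-1, 0, 1))) % n
    if lattice[ni][nj] == lattice[i][j]:
        return False                      # copying an identical ID changes nothing
    e_before = contact_energy(lattice)
    old = lattice[i][j]
    lattice[i][j] = lattice[ni][nj]
    d_e = contact_energy(lattice) - e_before
    if d_e <= 0 or rng.random() < math.exp(-d_e / temperature):
        return True                       # accept the copy
    lattice[i][j] = old                   # reject: restore the old ID
    return False
```

A production code would compute the energy change locally rather than re-evaluating the whole lattice; it is that per-site locality that makes the OpenMP parallelization of this step effective.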

  10. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  11. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    PubMed

    Alford, Rebecca F; Koehler Leman, Julia; Weitzner, Brian D; Duran, Amanda M; Tilley, Drew C; Elazar, Assaf; Gray, Jeffrey J

    2015-09-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  12. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    SciTech Connect

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks, such as transportation, utilities, telecommunication, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  13. A unifying modeling framework for highly multivariate disease mapping.

    PubMed

    Botella-Rocamora, P; Martinez-Beneito, M A; Banerjee, S

    2015-04-30

    Multivariate disease mapping refers to the joint mapping of multiple diseases from regionally aggregated data and continues to be the subject of considerable attention for biostatisticians and spatial epidemiologists. The key issue is to map multiple diseases accounting for any correlations among themselves. Recently, Martinez-Beneito (2013) provided a unifying framework for multivariate disease mapping. While attractive in that it colligates a variety of existing statistical models for mapping multiple diseases, this and other existing approaches are computationally burdensome and preclude the multivariate analysis of moderate to large numbers of diseases. Here, we propose an alternative reformulation that accrues substantial computational benefits enabling the joint mapping of tens of diseases. Furthermore, the approach subsumes almost all existing classes of multivariate disease mapping models and offers substantial insight into the properties of statistical disease mapping models. PMID:25645551
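    One way to picture the reformulation's core device, building correlated disease surfaces by linearly mixing independent underlying spatial patterns through a matrix, is the following hypothetical sketch (not the authors' implementation; names and dimensions are invented):

```python
def mix_patterns(patterns, M):
    """Mix J independent spatial patterns into D correlated disease surfaces.

    patterns: J vectors, one value per region (independent spatial effects).
    M:        J x D mixing matrix that induces between-disease correlation.
    """
    n = len(patterns[0])
    J, D = len(M), len(M[0])
    return [[sum(patterns[j][i] * M[j][d] for j in range(J)) for i in range(n)]
            for d in range(D)]
```

Estimating the comparatively small mixing matrix, instead of a full joint covariance across all diseases and regions, is what makes the joint mapping of tens of diseases computationally feasible.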

  14. An empirical generative framework for computational modeling of language acquisition.

    PubMed

    Waterfall, Heidi R; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-06-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of generative grammars from raw CHILDES data and give an account of the generative performance of the acquired grammars. Next, we summarize findings from recent longitudinal and experimental work that suggests how certain statistically prominent structural properties of child-directed speech may facilitate language acquisition. We then present a series of new analyses of CHILDES data indicating that the desired properties are indeed present in realistic child-directed speech corpora. Finally, we suggest how our computational results, behavioral findings, and corpus-based insights can be integrated into a next-generation model aimed at meeting the four requirements of our modeling framework.

  15. A unifying modeling framework for highly multivariate disease mapping.

    PubMed

    Botella-Rocamora, P; Martinez-Beneito, M A; Banerjee, S

    2015-04-30

    Multivariate disease mapping refers to the joint mapping of multiple diseases from regionally aggregated data and continues to be the subject of considerable attention for biostatisticians and spatial epidemiologists. The key issue is to map multiple diseases accounting for any correlations among themselves. Recently, Martinez-Beneito (2013) provided a unifying framework for multivariate disease mapping. While attractive in that it colligates a variety of existing statistical models for mapping multiple diseases, this and other existing approaches are computationally burdensome and preclude the multivariate analysis of moderate to large numbers of diseases. Here, we propose an alternative reformulation that accrues substantial computational benefits enabling the joint mapping of tens of diseases. Furthermore, the approach subsumes almost all existing classes of multivariate disease mapping models and offers substantial insight into the properties of statistical disease mapping models.

  16. Using the Mead model as a framework for nursing care.

    PubMed

    Edwards, S L

    1992-12-01

    A model of nursing has no valid purpose unless it helps nurses make their nursing better (Fawcett, 1989). In the study presented here, the Mead model formed the basis for the nursing care of Jason, a young patient who sustained a head injury, a puncture wound and lacerations to his face. Examination of the Mead model of nursing is followed by an account of why this model was used in preference to others as a framework for Jason's care. Three components of his nursing care--wound care, communication and involvement of relatives--are discussed in relation to both the model and current knowledge. It was concluded that, as a structured way of planning and giving care, the Mead model lacks adequate guidelines: a less experienced nurse using it may overlook certain aspects of care, whereas an experienced nurse may draw on his/her knowledge to give a high standard of care informed by research. However, models need to be tested so that they may be rejected or modified as guidelines for care, in this case in the United Kingdom within a welfare-orientated society.

  17. Sol-Terra - AN Operational Space Weather Forecasting Model Framework

    NASA Astrophysics Data System (ADS)

    Bisi, M. M.; Lawrence, G.; Pidgeon, A.; Reid, S.; Hapgood, M. A.; Bogdanova, Y.; Byrne, J.; Marsh, M. S.; Jackson, D.; Gibbs, M.

    2015-12-01

    The SOL-TERRA project is a collaboration between RHEA Tech, the Met Office, and RAL Space funded by the UK Space Agency. The goal of the SOL-TERRA project is to produce a Roadmap for a future coupled Sun-to-Earth operational space weather forecasting system covering domains from the Sun down to the magnetosphere-ionosphere-thermosphere and neutral atmosphere. The first stage of SOL-TERRA is underway and involves reviewing current models that could potentially contribute to such a system. Within a given domain, the various space weather models will be assessed for how they could contribute to such a coupled system. This will be done both by reviewing peer-reviewed papers and via direct input from the model developers to provide further insight. Once the models have been reviewed, the optimal set of models for use in support of forecast-based SWE modelling will be selected, and a Roadmap for the implementation of an operational forecast-based SWE modelling framework will be prepared. The Roadmap will address the current modelling capability, knowledge gaps and further work required, as well as the implementation and maintenance of the overall architecture and environment that the models will operate within. The SOL-TERRA project will engage with external stakeholders to independently ensure that the project remains on track to meet its original objectives. A group of key external stakeholders has been invited to provide their domain-specific expertise in reviewing the SOL-TERRA project at critical stages of Roadmap preparation, namely at the Mid-Term Review and prior to submission of the Final Report. This stakeholder input will ensure that the SOL-TERRA Roadmap is enhanced directly through the input of modellers and end-users. The overall goal of the SOL-TERRA project is to develop a Roadmap for an operational forecast-based SWE modelling framework which can be implemented within a larger subsequent activity.
The SOL-TERRA project is supported within

  18. Receptor modeling application framework for particle source apportionment.

    PubMed

    Watson, John G; Zhu, Tan; Chow, Judith C; Engelbrecht, Johann; Fujita, Eric M; Wilson, William E

    2002-12-01

    Receptor models infer contributions from particulate matter (PM) source types using multivariate measurements of particle chemical and physical properties. Receptor models complement source models that estimate concentrations from emissions inventories and transport meteorology. Enrichment factor, chemical mass balance, multiple linear regression, eigenvector, edge detection, neural network, aerosol evolution, and aerosol equilibrium models have all been used to solve particulate air quality problems, and more than 500 citations of their theory and application document these uses. While elements, ions, and carbons were often used to apportion TSP, PM10, and PM2.5 among many source types, many of these components have been so reduced in source emissions that more complex measurements of carbon fractions, specific organic compounds, single-particle characteristics, and isotopic abundances now need to be measured in source and receptor samples. Compliance monitoring networks are not usually designed to obtain data for the observables, locations, and time periods that allow receptor models to be applied. Measurements from existing networks can be used to form conceptual models that allow the needed monitoring network to be optimized. The framework for using receptor models to solve air quality problems consists of: (1) formulating a conceptual model; (2) identifying potential sources; (3) characterizing source emissions; (4) obtaining and analyzing ambient PM samples for major components and source markers; (5) confirming source types with multivariate receptor models; (6) quantifying source contributions with the chemical mass balance; (7) estimating profile changes and the limiting precursor gases for secondary aerosols; and (8) reconciling receptor modeling results with source models, emissions inventories, and receptor data analyses.
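    Step (6), the chemical mass balance, models the ambient species concentrations c as c = F s, where the columns of F are source profiles and s holds the source contributions. The following is a hypothetical sketch using a simple nonnegative least-squares iteration; production CMB software uses effective-variance weighted solutions instead:

```python
def cmb_solve(profiles, ambient, iterations=500, lr=0.05):
    """Solve c ≈ F s for nonnegative source contributions s by gradient descent.

    profiles: rows are chemical species, columns are candidate sources (F).
    ambient:  measured ambient concentration of each species (c).
    """
    n_sources = len(profiles[0])
    s = [0.0] * n_sources
    for _ in range(iterations):
        # residual r = F s - c, one entry per species
        residual = [sum(row[k] * s[k] for k in range(n_sources)) - c
                    for row, c in zip(profiles, ambient)]
        for k in range(n_sources):
            # gradient of ||F s - c||^2 with respect to s[k], clamped at zero
            grad = sum(2.0 * r * row[k] for r, row in zip(residual, profiles))
            s[k] = max(0.0, s[k] - lr * grad)
    return s
```

With well-separated source profiles the contributions are recovered directly; collinear profiles, as the abstract notes for modern emissions, are exactly what forces the move to richer markers such as organic compounds and isotopic abundances.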

  19. Modeling the spectral solar irradiance in the SOTERIA Project Framework

    NASA Astrophysics Data System (ADS)

    Vieira, Luis Eduardo; Dudok de Wit, Thierry; Kretzschmar, Matthieu; Cessateur, Gaël

    The evolution of the radiative energy input is a key element in understanding the variability of the Earth's neutral and ionized atmospheric components. However, reliable observations are limited to the last decades, when observations from above the Earth's atmosphere became possible. These observations have provided insights into the variability of the spectral solar irradiance on time scales from days to years, but there are still large uncertainties in the evolution on time scales from decades to centuries. Here we discuss the physics-based modeling of the ultraviolet solar irradiance under development in the Solar-Terrestrial Investigations and Archives (SOTERIA) project framework. In addition, we compare the modeled solar emission with the variability observed by the LYRA instrument onboard the Proba2 spacecraft.

  20. A Novel Modeling Framework for Heterogeneous Catalyst Design

    NASA Astrophysics Data System (ADS)

    Katare, Santhoji; Bhan, Aditya; Caruthers, James; Delgass, Nicholas; Lauterbach, Jochen; Venkatasubramanian, Venkat

    2002-03-01

    A systems-oriented, integrated knowledge architecture that enables the use of data from High Throughput Experiments (HTE) for catalyst design is being developed. Higher-level critical reasoning is required to extract information efficiently from the increasingly available HTE data and to develop predictive models that can be used for design purposes. Towards this objective, we have developed a framework that aids the catalyst designer in negotiating the data and model complexities. Traditional kinetic and statistical tools have been systematically implemented, and novel artificial intelligence tools have been developed and integrated to speed up the process of modeling catalytic reactions. Multiple nonlinear models that describe CO oxidation on supported metals have been screened using optimization ideas based on qualitative and quantitative features. Physical constraints of the system have been used to select the optimum model parameters from the multiple solutions to the parameter estimation problem. Preliminary results on the selection of catalyst descriptors that match a target performance and the use of HTE data for refining fundamentals-based models will be discussed.

  1. A Categorical Framework for Model Classification in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation were among the first and most successful examples of harnessing computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology give models a rather mixed reputation in other areas. The most successful models in geosciences are applications of dynamic systems theory to non-living systems or phenomena. Thus, we start from the hypothesis that the success of model applications relates to the influence of life on the phenomenon under study, and we focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as the heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas and seem highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory will be employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check for consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  2. Quasi-3D Multi-scale Modeling Framework Development

    NASA Astrophysics Data System (ADS)

    Arakawa, A.; Jung, J.

    2008-12-01

    When models are truncated in or near an energetically active range of the spectrum, model physics must be changed as the resolution changes. The model physics of GCMs and that of CRMs are, however, quite different from each other, and at present there is no unified formulation of model physics that automatically provides a transition between them. The Quasi-3D (Q3D) Multi-scale Modeling Framework (MMF) is an attempt to bridge this gap. Like the recently proposed Heterogeneous Multiscale Method (HMM) (E and Engquist 2003), MMF combines a macroscopic model, a GCM, with a microscopic model, a CRM. Unlike traditional multiscale methods such as multi-grid and adaptive mesh refinement techniques, HMM and MMF are for solving multi-physics problems. They share the common objective "to design combined macroscopic-microscopic computational methods that are much more efficient than solving the full microscopic model and at the same time give the information we need" (E et al. 2008). The question is then how to meet this objective in practice, which can be highly problem dependent. In HMM, the efficiency is typically gained by localization of the microscale problem. Following the pioneering work by Grabowski and Smolarkiewicz (1999) and Grabowski (2001), MMF takes advantage of the fact that 2D CRMs are reasonably successful in simulating deep clouds. In this approach, the efficiency is gained by sacrificing the three-dimensionality of cloud-scale motion. It also "localizes" the algorithm by embedding a CRM in each GCM grid box using cyclic boundary conditions. The Q3D MMF is an attempt to reduce the expense of these constraints by partially including cloud-scale 3D effects and extending the CRM beyond individual GCM grid boxes. As currently formulated, the Q3D MMF is a 4D estimation/prediction framework that combines a GCM with a 3D anelastic cloud-resolving vector vorticity equation model (VVM) applied to a network of horizontal grids. The network

  3. Proposed framework for thermomechanical life modeling of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Lerch, Bradley A.; Saltsman, James F.

    1993-01-01

    The framework of a mechanics of materials model is proposed for thermomechanical fatigue (TMF) life prediction of unidirectional, continuous-fiber metal matrix composites (MMC's). Axially loaded MMC test samples are analyzed as structural components whose fatigue lives are governed by local stress-strain conditions resulting from combined interactions of the matrix, interfacial layer, and fiber constituents. The metallic matrix is identified as the vehicle for tracking fatigue crack initiation and propagation. The proposed framework has three major elements. First, TMF flow and failure characteristics of in situ matrix material are approximated from tests of unreinforced matrix material, and matrix TMF life prediction equations are numerically calibrated. The macrocrack initiation fatigue life of the matrix material is divided into microcrack initiation and microcrack propagation phases. Second, the influencing factors created by the presence of fibers and interfaces are analyzed, characterized, and documented in equation form. Some of the influences act on the microcrack initiation portion of the matrix fatigue life, others on the microcrack propagation life, while some affect both. Influencing factors include coefficient of thermal expansion mismatch strains, residual (mean) stresses, multiaxial stress states, off-axis fibers, internal stress concentrations, multiple initiation sites, nonuniform fiber spacing, fiber debonding, interfacial layers and cracking, fractured fibers, fiber deflections of crack fronts, fiber bridging of matrix cracks, and internal oxidation along internal interfaces. Equations exist for some, but not all, of the currently identified influencing factors. The third element is the inclusion of overriding influences such as maximum tensile strain limits of brittle fibers that could cause local fractures and ensuing catastrophic failure of surrounding matrix material. 
Some experimental data exist for assessing the plausibility of the proposed
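The life-partitioning idea in the first element can be sketched numerically: a matrix fatigue life calibrated on unreinforced material, then degraded by multiplicative influence factors for the fiber and interface effects. The Coffin-Manson-type fit, its constants, and the factor values below are hypothetical placeholders, not calibrated values from the report.

```python
def matrix_life(strain_range, C=0.5, c=-0.6):
    # Coffin-Manson-type fit (hypothetical constants), as would be
    # calibrated from tests of unreinforced matrix material:
    # strain_range = C * N_f**c  =>  N_f = (strain_range / C)**(1/c)
    return (strain_range / C) ** (1.0 / c)

def composite_life(strain_range, influence_factors):
    # Influence factors (< 1 reduce life) standing in for residual
    # stresses, stress concentrations, fiber effects, etc.
    N = matrix_life(strain_range)
    for f in influence_factors:
        N *= f
    return N

N_matrix = matrix_life(0.01)
N_mmc = composite_life(0.01, influence_factors=[0.7, 0.9])
```

In this sketch the in-situ life is simply the unreinforced life scaled down by each identified influence; the report's framework would apply some factors to the microcrack-initiation phase and others to the propagation phase.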

  4. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical UTAUT Model


    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  5. Modelling grain growth in the framework of Rational Extended Thermodynamics

    NASA Astrophysics Data System (ADS)

    Kertsch, Lukas; Helm, Dirk

    2016-05-01

    Grain growth is a significant phenomenon for the thermomechanical processing of metals. Since the mobility of the grain boundaries is thermally activated and energy stored in the grain boundaries is released during their motion, a mutual interaction with the process conditions occurs. To model such phenomena, a thermodynamic framework for the representation of thermomechanical coupling phenomena in metals including a microstructure description is required. For this purpose, Rational Extended Thermodynamics appears to be a useful tool. We apply an entropy principle to derive a thermodynamically consistent model for grain coarsening due to the growth and shrinkage of individual grains. Despite the rather different approaches applied, we obtain a grain growth model which is similar to existing ones and can be regarded as a thermodynamic extension of that by Hillert (1965) to more general systems. To demonstrate the applicability of the model, we compare our simulation results to grain growth experiments in pure copper by different authors, which we are able to reproduce very accurately. Finally, we study the implications of the energy release due to grain growth on the energy balance. The present unified approach combining a microstructure description and continuum mechanics is ready to be further used to develop more elaborate material models for complex thermo-chemo-mechanical coupling phenomena.
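The Hillert-type growth law the authors recover can be illustrated with a minimal mean-field simulation: each grain grows or shrinks according to dR/dt = Mγ(1/R_cr − 1/R), with the critical radius taken as the current mean, and grains that shrink away are removed. The mobility, time step, and initial size distribution below are arbitrary illustrative choices.

```python
import random

def hillert_step(radii, M_gamma=1.0, dt=1e-3):
    # Hillert (1965): dR/dt = M*gamma*(1/R_cr - 1/R); grains larger than
    # the critical radius (here: the mean) grow, smaller ones shrink.
    r_cr = sum(radii) / len(radii)
    new = [r + dt * M_gamma * (1.0 / r_cr - 1.0 / r) for r in radii]
    return [r for r in new if r > 0]  # shrunk-away grains vanish

random.seed(1)
radii = [random.uniform(0.5, 1.5) for _ in range(200)]
mean0 = sum(radii) / len(radii)
for _ in range(2000):
    radii = hillert_step(radii)
mean1 = sum(radii) / len(radii)
```

The qualitative outcome matches the coarsening behaviour described above: the grain count drops while the mean grain size increases.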

  6. Sensor models and a framework for sensor management

    NASA Astrophysics Data System (ADS)

    Gaskell, Alex P.; Probert, Penelope J.

    1993-08-01

    We describe the use of Bayesian belief networks and decision theoretic principles for sensor management in multi-sensor systems. This framework provides a way of representing sensory data and choosing actions under uncertainty. The work considers how to distribute functionality between sensors and the controller. Use is made of logical sensors based on complementary physical sensors to provide information at the task level of abstraction represented within the network. We are applying these methods in the area of low level planning in mobile robotics. A key feature of the work is the development of quantified models to represent diverse sensors, in particular the sonar array and infra-red triangulation sensors we use on our AGV. We need to develop a model which can handle these very different sensors but provides a common interface to the sensor management process. We do this by quantifying the uncertainty through probabilistic models of the sensors, taking into account their physical characteristics and interaction with the expected environment. Modelling the sensor characteristics to an appropriate level of detail has the advantage of giving more accurate and robust mapping between the physical and logical sensor, as well as a better understanding of environmental dependency and its limitations. We describe a model of a sonar array, which explicitly takes into account features such as beam-width and ranging errors, and its integration into the sensor management process.
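The common-interface idea, in which each logical sensor reports an estimate together with a quantified uncertainty, can be sketched with inverse-variance fusion of a sonar and an infra-red range reading. The numbers are illustrative, not measured characteristics of the authors' sensors.

```python
def fuse(measurements):
    # measurements: list of (range_estimate, variance) pairs, one per
    # logical sensor. Inverse-variance (maximum-likelihood) fusion for
    # independent Gaussian errors; the common interface is simply
    # (estimate, variance), whatever the underlying physics.
    weights = [1.0 / var for _, var in measurements]
    est = sum(z * w for (z, _), w in zip(measurements, weights)) / sum(weights)
    var = 1.0 / sum(weights)
    return est, var

sonar = (2.10, 0.04)      # wide beam, ranging error: larger variance
infrared = (2.00, 0.01)   # triangulation sensor: tighter variance
est, var = fuse([sonar, infrared])
```

The fused estimate is pulled toward the lower-variance infra-red reading, and the fused variance is smaller than either input, which is the payoff of modelling each sensor's uncertainty explicitly.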

  7. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have examined how errors propagate from model structure to nonpoint source (NPS) predictions. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the fertilization and P-leaching algorithms, contributed the largest output uncertainties. In comparison, the NPS-P predictions are less sensitive to the initialization of inorganic P in the soil layer and to the transformation algorithm between P pools. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to the NPS-P prediction uncertainty caused by the model inputs and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for controlling model structural uncertainty, and the approach can be extrapolated to other model-based studies.

  8. Assessment of solution uncertainties in single-column modeling frameworks

    SciTech Connect

    Hack, J.J.; Pedretti, J.A.

    2000-01-15

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.
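The need for an ensemble methodology arises because parameterized physics is nonlinear, often with on/off triggers, so tiny perturbations in the prescribed state can flip a single-column solution between regimes. The toy threshold "convection" scheme below is invented purely to illustrate that sensitivity; it is not the CCM3 physics.

```python
import random

def step(T, dt=0.1):
    # Toy nonlinear parameterized tendency with a switch-like trigger,
    # mimicking the on/off behaviour of convective schemes.
    convective = 1.0 if T > 300.0 else 0.0
    return T + dt * (-0.1 * (T - 300.0) + 2.0 * convective - 1.0)

def run(T0, nsteps=200):
    T = T0
    for _ in range(nsteps):
        T = step(T)
    return T

# Ensemble of runs with tiny perturbations to the initial state: members
# split between two regimes, so a single run is not representative.
random.seed(0)
members = [run(300.0 + random.gauss(0.0, 0.1)) for _ in range(20)]
spread = max(members) - min(members)
mean = sum(members) / len(members)
```

A single deterministic integration lands in one regime or the other; only the ensemble spread reveals the solution uncertainty, which is the point the abstract makes.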

  9. A Data Driven Framework for Integrating Regional Climate Models

    NASA Astrophysics Data System (ADS)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate sensitive issues of concern to decision-makers and policy planners at a regional level. Deciding how to allocate scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analysis of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high


  10. A framework for industrial systems modeling and simulation

    SciTech Connect

    Macfarlane, J.; Nachnani, S.; Tsai, L.H.; Kaae, P.; Freund, K.; Hoza, M.; Stahlman, E.

    1995-04-01

    To successfully compete in a global market, manufacturing production systems are being forced to reduce time to market and to provide improved responsiveness to changes in market conditions. The organizations that comprise the business links in the production system must constantly make tradeoffs between time and cost in order to achieve a competitive but quick response to consumer demand. Due to the inherent uncertainty of consumer demand, these tradeoffs are, by definition, made with incomplete information and can incur significant financial and competitive risk to the organization. Partnerships between organizations are a mechanism for increasing the information in the decision making process by combining information from the two partners. Partnerships are inherently difficult to implement due to trust issues. A mechanism for investigating and validating the mutual benefit to partnering would be useful in designing and implementing partnerships. This paper describes the development of a software framework for industrial systems modeling and simulation. The framework provides a mechanism for investigating changes to industrial systems in a manner which minimizes the effort and computational power needed to develop focused simulations. The architecture and its component parts are described.

  11. A Process Algebraic Framework for Modeling Resource Demand and Supply

    NASA Astrophysics Data System (ADS)

    Philippou, Anna; Lee, Insup; Sokolsky, Oleg; Choi, Jin-Young

    As real-time embedded systems become more complex, resource partitioning is increasingly used to guarantee real-time performance. Recently, several compositional frameworks of resource partitioning have been proposed using real-time scheduling theory with various notions of real-time tasks running under restricted resource supply environments. However, these real-time scheduling-based approaches are limited in their expressiveness in that, although capable of describing resource-demand tasks, they are unable to model resource supply. This paper describes a process algebraic framework for reasoning about resource demand and supply, inspired by the timed process algebra ACSR. In ACSR, real-time tasks are specified by stating their consumption needs for resources. To also accommodate resource-supply processes, we define PADS, where, given a resource CPU, the complemented resource \overline{CPU} denotes the availability of CPU to the corresponding demand process. Using PADS, we define a supply-demand relation where a pair (S, T) belongs to the relation if the demand process T can be scheduled under supply S. We develop a theory of compositional schedulability analysis as well as a technique for synthesizing an optimal supply process for a set of tasks. We illustrate our technique via a number of examples.
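The supply-demand relation can be pictured with a slot-based sketch: a supply process offers the CPU in some time slots, and a periodic demand task is schedulable under that supply if every period receives enough offered slots. This discrete check is a drastic simplification of the process-algebraic formulation, meant only to convey the (S, T) relation.

```python
def schedulable(supply, period, wcet):
    # supply: list of 0/1 slots (1 = CPU offered by the supply process),
    # covering a whole number of periods. A periodic demand task needing
    # `wcet` execution slots every `period` slots is schedulable under
    # this supply if each period window offers at least `wcet` slots.
    assert len(supply) % period == 0
    for start in range(0, len(supply), period):
        if sum(supply[start:start + period]) < wcet:
            return False
    return True

# A partition offering 2 of every 4 slots:
supply = [1, 0, 1, 0] * 3
ok = schedulable(supply, period=4, wcet=2)    # demand of 2 per 4: fits
bad = schedulable(supply, period=4, wcet=3)   # demand of 3 per 4: rejected
```

In PADS terms, `supply` plays the role of S and the (period, wcet) pair abstracts the demand process T; the pair belongs to the supply-demand relation exactly when the check passes.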

  12. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  13. Pathogen population dynamics in agricultural landscapes: the Ddal modelling framework.

    PubMed

    Papaïx, Julien; Adamczyk-Chauvat, Katarzyna; Bouvier, Annie; Kiêu, Kiên; Touzeau, Suzanne; Lannou, Christian; Monod, Hervé

    2014-10-01

    Modelling processes that occur at the landscape scale is gaining increasing attention, from theoretical ecologists to agricultural managers. Most of the approaches found in the literature lack applicability for managers or, on the contrary, lack a sound theoretical basis. Based on the metapopulation concept, we propose here a modelling approach for landscape epidemiology that takes advantage of theoretical results developed in the metapopulation context while considering realistic landscape structures. A landscape simulator makes it possible to represent both the field pattern and the spatial distribution of crops. The pathogen population dynamics are then described through a matrix population model that is both stage- and space-structured. In addition to a classical invasion analysis, we present a stochastic simulation experiment and provide a complete framework for performing a sensitivity analysis integrating the landscape as an input factor. We illustrate our approach with an example evaluating whether the agricultural landscape composition and structure may prevent and mitigate the development of an epidemic. Although designed for a fungal foliar disease, our modelling approach is easily adaptable to other organisms.
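The classical invasion analysis mentioned above reduces to the dominant eigenvalue of the stage- and space-structured projection matrix: the pathogen invades when λ > 1. Below is a minimal two-field, two-stage sketch with invented survival, fecundity, and dispersal rates, not the paper's parameterization.

```python
import numpy as np

def projection_matrix(dispersal, survival=0.6, fecundity=2.5):
    # Stage structure (latent -> sporulating) crossed with space structure
    # (two fields); spores either stay in their field or disperse.
    # State order: (latent f1, sporulating f1, latent f2, sporulating f2).
    s, f, stay = survival, fecundity, 1.0 - dispersal
    return np.array([
        [0.0, f * stay,      0.0, f * dispersal],
        [s,   0.0,           0.0, 0.0          ],
        [0.0, f * dispersal, 0.0, f * stay     ],
        [0.0, 0.0,           s,   0.0          ],
    ])

# Dominant eigenvalue modulus: invasion criterion is lam > 1.
lam = float(max(abs(np.linalg.eigvals(projection_matrix(0.1)))))
invades = lam > 1.0
```

Changing the landscape inputs (here, the dispersal rate and the per-field rates) and recomputing λ is exactly the kind of sensitivity analysis the framework automates over realistic simulated landscapes.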

  14. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.
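The reported scaling can be read through Amdahl's law: because the vast majority of MMF time is spent in the independently computed GCE copies, the parallel fraction is close to one, and even a serial remainder of order 1e-4 separates an observed ~80x speedup from the ideal ~111x core ratio. The serial fraction below is a back-of-the-envelope assumption chosen to illustrate that arithmetic, not a measured value.

```python
def speedup(cores_base, cores_new, serial_fraction):
    # Amdahl-style estimate: runtime ~ serial + (1 - serial)/cores
    t = lambda n: serial_fraction + (1.0 - serial_fraction) / n
    return t(cores_base) / t(cores_new)

ideal = 3335 / 30                    # ~111x if there were no serial part
observed = speedup(30, 3335, 1.2e-4) # a ~0.01% serial part already brings
                                     # the ratio down to roughly 80x
```

This is why confining the 2D decomposition to the GCE copies, where the time is actually spent, captures most of the available speedup without rewriting the serial fvGCM portions.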

  15. Factors of collaborative working: a framework for a collaboration model.

    PubMed

    Patel, Harshada; Pettitt, Michael; Wilson, John R

    2012-01-01

    The ability of organisations to support collaborative working environments is of increasing importance as they move towards more distributed ways of working. Despite the attention collaboration has received from a number of disparate fields, there is a lack of a unified understanding of the component factors of collaboration. As part of our work on a European Integrated Project, CoSpaces, collaboration and collaborative working and the factors which define it were examined through the literature and new empirical work with a number of partner user companies in the aerospace, automotive and construction sectors. This was to support development of a descriptive human factors model of collaboration - the CoSpaces Collaborative Working Model (CCWM). We identified seven main categories of factors involved in collaboration: Context, Support, Tasks, Interaction Processes, Teams, Individuals, and Overarching Factors, and summarised these in a framework which forms a basis for the model. We discuss supporting evidence for the factors which emerged from our fieldwork with user partners, and use of the model in activities such as collaboration readiness profiling. PMID:21616476

  17. A hierarchical modeling framework for multiple observer transect surveys.

    PubMed

    Conn, Paul B; Laake, Jeffrey L; Johnson, Devin S

    2012-01-01

    Ecologists often use multiple observer transect surveys to census animal populations. In addition to animal counts, these surveys produce sequences of detections and non-detections for each observer. When combined with additional data (i.e. covariates such as distance from the transect line), these sequences provide the additional information to estimate absolute abundance when detectability on the transect line is less than one. Although existing analysis approaches for such data have proven extremely useful, they have some limitations. For instance, it is difficult to extrapolate from observed areas to unobserved areas unless a rigorous sampling design is adhered to; it is also difficult to share information across spatial and temporal domains or to accommodate habitat-abundance relationships. In this paper, we introduce a hierarchical modeling framework for multiple observer line transects that removes these limitations. In particular, abundance intensities can be modeled as a function of habitat covariates, making it easier to extrapolate to unsampled areas. Our approach relies on a complete data representation of the state space, where unobserved animals and their covariates are modeled using a reversible jump Markov chain Monte Carlo algorithm. Observer detections are modeled via a bivariate normal distribution on the probit scale, with dependence induced by a distance-dependent correlation parameter. We illustrate performance of our approach with simulated data and on a known population of golf tees. In both cases, we show that our hierarchical modeling approach yields accurate inference about abundance and related parameters. In addition, we obtain accurate inference about population-level covariates (e.g. group size). We recommend that ecologists consider using hierarchical models when analyzing multiple-observer transect data, especially when it is difficult to rigorously follow pre-specified sampling designs. We provide a new R package, hierarchical
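The bivariate-probit detection model can be illustrated by Monte Carlo: each observer detects an animal when a standard-normal latent variable falls below the probit of its detection probability, and the (e.g. distance-dependent) correlation parameter couples the two observers. All probabilities and the correlation value below are illustrative, not estimates from the paper.

```python
import numpy as np
from statistics import NormalDist

def multi_observer_probs(p1, p2, rho, n=200_000, seed=0):
    # Latent-variable formulation on the probit scale: observer j detects
    # when its latent u_j < probit(p_j); dependence between observers
    # enters through the correlation rho.
    z1, z2 = NormalDist().inv_cdf(p1), NormalDist().inv_cdf(p2)
    rng = np.random.default_rng(seed)
    u = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
    d1, d2 = u[:, 0] < z1, u[:, 1] < z2
    return (d1 & d2).mean(), (d1 | d2).mean()  # P(both), P(at least one)

both_ind, either_ind = multi_observer_probs(0.7, 0.7, 0.0)
both_dep, either_dep = multi_observer_probs(0.7, 0.7, 0.8)
```

Positive dependence raises the duplicate-detection rate and lowers the combined detection rate, which is precisely why ignoring the correlation would bias abundance estimates.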

  18. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is exposed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2014), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, as well as providing a set of utilities smoothing the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. Using the revised EMELI, we present an example integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014
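The self-describing component idea can be sketched as a minimal class exposing a few BMI-style functions. The function names follow the CSDMS BMI convention (initialize/update/finalize plus metadata getters); the toy heat model, its variable name, and its numbers are invented for the sketch.

```python
class HeatModelBMI:
    """Toy diffusion component exposing a (partial) Basic Model Interface."""

    def initialize(self, config=None):
        self.time = 0.0
        self.dt = 1.0
        self.temperature = [10.0, 20.0, 10.0]

    def update(self):
        # one explicit diffusion step on a 3-cell rod (ends held fixed)
        t = self.temperature
        self.temperature = [t[0], t[1] + 0.25 * (t[0] - 2 * t[1] + t[2]), t[2]]
        self.time += self.dt

    def finalize(self):
        self.temperature = None

    # Metadata functions: this self-description is what lets a framework
    # like EMELI discover, initialize, and couple the component remotely.
    def get_component_name(self):
        return "toy_heat_model"

    def get_output_var_names(self):
        return ("land_surface__temperature",)

    def get_value(self, name):
        assert name == "land_surface__temperature"
        return list(self.temperature)

m = HeatModelBMI()
m.initialize()
m.update()
temps = m.get_value("land_surface__temperature")
```

Wrapping such a class behind a web endpoint, so that each method call travels over HTTP, is essentially what "BMI-enabled models as web services" amounts to in the abstract above.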

  19. A Framework for an Open Source Geospatial Certification Model

    NASA Astrophysics Data System (ADS)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to see enormous growth in the forthcoming years and an extended need for a well-educated workforce. Hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have gained increasing significance in the geospatial and IT arenas as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations, e.g., the GIS Certification Institute, GeoAcademy, ASPRS, and software vendors, e.g., Esri, Oracle, and RedHat. They focus on different fields of expertise and have different levels and ways of examination, which are offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse bodies of knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body Of Knowledge, the USGIF Essential Body Of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-American-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the framework of certification. In addition to the theoretical analysis of existing resources, the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and evaluated with 105

  20. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Methods for internal modelling under the Risk-Based Capital framework very often make use of data in the form of a run-off triangle. The present research will instead extract, from a group of n customers, the historical data for the sum insured si of the i-th customer together with the amount paid yij and the amount aij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (yij+1, aij+1) as dependent on the present year value (yij, aij) and the sum insured si via a conditional distribution which is derived from a multivariate power-normal mixture distribution. For a given group of customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on this distribution is found to cover the observed aggregate claim liabilities well.
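The aggregate-liability distribution and its prediction interval can be approximated by Monte Carlo once a conditional year-to-year model is fixed. The sketch below is a simplified stand-in: the next year's paid amount depends on the current amount and the sum insured, with Gaussian noise replacing the power-normal mixture, and all coefficients are invented.

```python
import random

def simulate_liability(sums_insured, horizon=6, n_sims=5000, seed=42):
    # Hypothetical conditional model: y_{j+1} = 0.5*y_j + 0.02*s + noise,
    # truncated at zero; total liability sums the paid amounts over all
    # customers and development years.
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        total = 0.0
        for s in sums_insured:
            y = 0.1 * s
            for _ in range(horizon):
                y = max(0.0, 0.5 * y + 0.02 * s + rng.gauss(0.0, 0.01 * s))
                total += y
        totals.append(total)
    return sorted(totals)

totals = simulate_liability([100.0, 200.0, 150.0])
lo = totals[len(totals) // 40]    # ~2.5th percentile
hi = totals[-(len(totals) // 40)] # ~97.5th percentile
mean = sum(totals) / len(totals)
```

The empirical (lo, hi) band plays the role of the paper's prediction interval: good coverage means the observed aggregate liability falls inside it at close to the nominal rate.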

  1. A framework of modeling detector systems for computed tomography simulations

    NASA Astrophysics Data System (ADS)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at the lowest possible patient dose. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measured results from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
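The cascaded linear-systems reasoning about which noise source dominates reduces to a two-stage sketch: Poisson quantum noise grows as the square root of the quanta while additive electronic noise is fixed, so the quantum term dominates the variance budget at higher exposures. The gain and electronic-noise values below are illustrative, not the paper's detector parameters.

```python
import math

def relative_noise(exposure_quanta, gain=1.0, electronic_sigma=50.0):
    # Two-stage cascade: Poisson x-ray quanta amplified by a deterministic
    # gain, then additive electronic noise; independent variances add.
    signal = gain * exposure_quanta
    quantum_sigma = gain * math.sqrt(exposure_quanta)
    sigma = math.sqrt(quantum_sigma ** 2 + electronic_sigma ** 2)
    return sigma / signal

low = relative_noise(1_000)     # electronic noise still matters here
high = relative_noise(100_000)  # quantum-limited regime
quantum_share = 100_000 / (100_000 + 50.0 ** 2)  # fraction of variance
```

At the higher exposure the quantum term carries nearly all of the variance, matching the abstract's conclusion that x-ray quantum noise dominates the additive electronic noise.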

  2. Investigating GPDs in the framework of the double distribution model

    NASA Astrophysics Data System (ADS)

    Nazari, F.; Mirjalili, A.

    2016-06-01

    In this paper, we construct the generalized parton distribution (GPD) in terms of the kinematical variables x, ξ, t, using the double distribution model. By employing these functions, we can extract quantities which make it possible to gain a three-dimensional insight into the nucleon structure at the parton level. The main objective of GPDs is to combine and generalize the concepts of ordinary parton distributions and form factors. They also provide an exclusive framework to describe the nucleons in terms of quarks and gluons. Here, we first calculate, in the double distribution model, the GPD based on the usual parton distributions arising from the GRV and CTEQ phenomenological models. Obtaining the quark and gluon angular momenta from the GPD, we are able to calculate scattering observables which are related to spin asymmetries of the produced quarkonium. These quantities are represented by AN and ALS. We also calculate the Pauli and Dirac form factors in deeply virtual Compton scattering. Finally, in order to compare our results with the existing experimental data, we use the difference of the polarized cross-sections for an initial longitudinal leptonic beam and unpolarized target particles (ΔσLU). In all cases, our results are in good agreement with the available experimental data.

  3. Evolution of Climate Science Modelling Language within international standards frameworks

    NASA Astrophysics Data System (ADS)

    Lowe, Dominic; Woolf, Andrew

    2010-05-01

    The Climate Science Modelling Language (CSML) was originally developed as part of the NERC Data Grid (NDG) project in the UK. It was one of the first Geography Markup Language (GML) application schemas describing complex feature types for the metocean domain. CSML feature types can be used to describe typical climate products such as model runs or atmospheric profiles. CSML has been successfully used within NDG to provide harmonised access to a number of different data sources. For example, meteorological observations held in heterogeneous databases by the British Atmospheric Data Centre (BADC) and Centre for Ecology and Hydrology (CEH) were served uniformly as CSML features via Web Feature Service. CSML has now been substantially revised to harmonise it with the latest developments in OGC and ISO conceptual modelling for geographic information. In particular, CSML is now aligned with the near-final ISO 19156 Observations & Measurements (O&M) standard. CSML combines the O&M concept of 'sampling features' together with an observation result based on the coverage model (ISO 19123). This general pattern is specialised for particular data types of interest, classified on the basis of sampling geometry and topology. In parallel work, the OGC Met Ocean Domain Working Group has established a conceptual modelling activity. This is a cross-organisational effort aimed at reaching consensus on a common core data model that could be re-used in a number of met-related application areas: operational meteorology, aviation meteorology, climate studies, and the research community. It is significant to note that this group has also identified sampling geometry and topology as a key classification axis for data types. Using the Model Driven Architecture (MDA) approach as adopted by INSPIRE we demonstrate how the CSML application schema is derived from a formal UML conceptual model based on the ISO TC211 framework. 
By employing MDA tools which map consistently between UML and GML we

  4. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    SciTech Connect

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO{sub 2} capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO{sub 2} separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO{sub 2} capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO{sub 2} capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO{sub 2} capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO{sub 2} capture systems have been integrated into the IECM-cs, along with models to estimate CO{sub 2} transport and storage costs. The CO{sub 2} control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO{sub 2} control. 
The integrated model is applied to

  5. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  6. Models of Recognition, Repetition Priming, and Fluency: Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…

  7. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost-effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study, it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands-off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
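    The licensing rules compared in this study can be made concrete with a small sketch. This is an illustrative reconstruction, not the authors' model: the function name, thresholds, and units below are all hypothetical.

```python
def allowed_abstraction(flow, hof=2.0, block_start=3.0, block_rate=0.25, cap=1.5):
    """Illustrative 'hands-off flow plus rising block' abstraction rule.

    No water may be taken when the river is at or below the hands-off
    flow (hof); above block_start, a fraction (block_rate) of the excess
    flow may be abstracted, up to a fixed cap. All values hypothetical.
    """
    if flow <= hof:
        return 0.0   # protect environmental flows entirely
    if flow <= block_start:
        return 0.0   # buffer between hands-off flow and the rising block
    return min(block_rate * (flow - block_start), cap)

print(allowed_abstraction(1.5))   # 0.0 -- below the hands-off flow
print(allowed_abstraction(5.0))   # 0.5 -- inside the rising block
print(allowed_abstraction(20.0))  # 1.5 -- capped
```

    A conventional seasonal limit could be expressed as a function of the same shape, which is what makes the two licensing configurations directly comparable inside a rainfall-runoff simulation.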

  8. A modeling framework for potential induced degradation in PV modules

    NASA Astrophysics Data System (ADS)

    Bermel, Peter; Asadpour, Reza; Zhou, Chao; Alam, Muhammad A.

    2015-09-01

    Major sources of performance degradation and failure in glass-encapsulated PV modules include moisture-induced gridline corrosion, potential-induced degradation (PID) of the cell, and stress-induced busbar delamination. Recent studies have shown that PV modules operating in damp heat at -600 V are vulnerable to severe degradation, potentially losing up to 90% of the original power output within 200 hours. To improve module reliability and restore power production in the presence of PID and other failure mechanisms, a fundamental rethinking of accelerated testing is needed. This in turn will require an improved understanding of how technology choices made early in development affect failures later. In this work, we present an integrated approach of modeling, characterization, and validation to address these problems. A hierarchical modeling framework will allow us to clarify the mechanisms of corrosion, PID, and delamination. We will employ a physics-based compact model of the cell, topology of the electrode interconnection, geometry of the packaging stack, and environmental operating conditions to predict the current, voltage, temperature, and stress distributions in PV modules correlated with the acceleration of specific degradation modes. A self-consistent solution will capture the essential complexity of the technology-specific acceleration of PID and other degradation mechanisms as a function of illumination, ambient temperature, and relative humidity. Initial results from our model include specific lifetime predictions suitable for direct comparison with indoor and outdoor experiments, which are qualitatively validated by prior work. This approach could play a significant role in developing novel accelerated lifetime tests.

  9. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved simulation speed compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We focused mainly on readily available commercial particle sizes of 2 nm and 20 nm, but the framework can model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected have a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05, giving a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that describes porosity reduction based on cylinder diameter for 2 and 20 nm particles in pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected, as MIP has been documented to be an inaccurate measure of pore distribution and porosity in concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar than at the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. 
This may be due to particles
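    Summary statistics of the kind quoted above (mean, standard error, confidence interval) follow the usual normal-approximation recipe, sketched here with invented sample values rather than the dissertation's actual data:

```python
import statistics

# Hypothetical porosity-reduction samples; the dissertation's own data set
# (median 0.5750, mean 0.5504, SE 0.0054) is far larger than this sketch.
samples = [0.34, 0.48, 0.55, 0.575, 0.60, 0.65]

mean = statistics.mean(samples)
se = statistics.stdev(samples) / len(samples) ** 0.5   # standard error
lo, hi = mean - 1.96 * se, mean + 1.96 * se            # 95% CI (normal approx.)
print(f"mean={mean:.4f}, SE={se:.4f}, 95% CI=({lo:.4f}, {hi:.4f})")
```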

  10. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    USGS Publications Warehouse

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution models (SDMs) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of Type I and Type II errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management, and monitoring from local scales to the regional, continental, and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  11. The BlueSky Smoke Modeling Framework: Recent Developments

    NASA Astrophysics Data System (ADS)

    Sullivan, D. C.; Larkin, N.; Raffuse, S. M.; Strand, T.; ONeill, S. M.; Leung, F. T.; Qu, J. J.; Hao, X.

    2012-12-01

    NASA's Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis Real-Time (TMPA-RT) data set is being used to improve dead fuel moisture estimates. - EastFire live fuel moisture estimates, which are derived from NASA's MODIS direct broadcast, are being used to improve live fuel moisture estimates. - NASA's Multi-angle Imaging Spectroradiometer (MISR) stereo heights are being used to improve estimates of plume injection heights. Further, the Fire Location and Modeling of Burning Emissions (FLAMBÉ) model was incorporated into the BlueSky Framework as an alternative means of calculating fire emissions. FLAMBÉ directly estimates emissions on the basis of fire detections and radiance measures from NASA's MODIS and NOAA's GOES satellites. (The authors gratefully acknowledge NASA's Applied Sciences Program [Grant Nos. NN506AB52A and NNX09AV76G], the USDA Forest Service, and the Joint Fire Science Program for their support.)

  12. 3D Geological Framework Models as a Teaching Aid for Geoscience

    NASA Astrophysics Data System (ADS)

    Kessler, H.; Ward, E.; Geological Models Teaching Project Team

    2010-12-01

    3D geological models have great potential as a resource for universities when teaching foundation geological concepts, as they allow students to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three-dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience, including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions, and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of Geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions, which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or to add to an experienced student's body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey, or students and tutors can select certain areas of the model

  13. XML-based 3D model visualization and simulation framework for dynamic models

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Fishwick, Paul A.

    2002-07-01

    Relatively recent advances in computer technology enable us to create three-dimensional (3D) dynamic models and simulate them within a 3D web environment. The use of such models is especially valuable when teaching simulation, and the concepts behind dynamic models, since the models are made more accessible to the students. Students tend to enjoy a construction process in which they are able to employ their own cultural and aesthetic forms. The challenge is to create a language that allows for a grammar for modeling, while simultaneously permitting arbitrary presentation styles. For further flexibility, we need an effective way to represent and simulate dynamic models that can be shared by modelers over the Internet. We present an Extensible Markup Language (XML)-based framework that will guide a modeler in creating personalized 3D models, visualizing their dynamic behaviors, and simulating the created models. A model author will use XML files to represent the geometries and topology of a dynamic model. The Model Fusion Engine, written in Extensible Stylesheet Language Transformations (XSLT), expedites the modeling process by automating the creation of dynamic models from the user-defined XML files. Modelers can also link simulation programs with a created model to analyze the characteristics of the model. The advantages of this system lie in teaching the modeling and simulation of dynamic models, and in visualizing dynamic model behaviors.
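    The division of labor described above (XML declaring geometry and topology, a transformation step assembling models from it) can be sketched with the standard library. The element and attribute names below are invented for illustration and are not the authors' schema:

```python
import xml.etree.ElementTree as ET

# A toy model description in the spirit of the framework: geometry and
# topology captured declaratively in XML (all names hypothetical).
MODEL_XML = """
<model name="bouncing-ball">
  <geometry>
    <shape id="ball" primitive="sphere" radius="0.5"/>
    <shape id="floor" primitive="box" width="10" depth="10" height="0.1"/>
  </geometry>
  <topology>
    <link from="ball" to="floor" relation="collides-with"/>
  </topology>
</model>
"""

root = ET.fromstring(MODEL_XML)
shapes = {s.get("id"): s.get("primitive") for s in root.iter("shape")}
links = [(l.get("from"), l.get("to")) for l in root.iter("link")]
print(shapes)  # {'ball': 'sphere', 'floor': 'box'}
print(links)   # [('ball', 'floor')]
```

    A "fusion engine" in the paper's sense would be an XSLT transform applied to files like this one; here a plain parse stands in for that step.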

  14. Devising a New Model-Driven Framework for Developing GUI for Enterprise Applications

    NASA Astrophysics Data System (ADS)

    Akiki, Pierre

    The main goal of this chapter is to demonstrate the design and development of a GUI framework that is model driven and is not directly tied to one presentation technology or to the presentation subsystem of any specific programming language. This framework will allow us to create graphical user interfaces that are not only dynamically customizable but also multilingual. To demonstrate this new concept, we design a new framework called Customizable Enterprise Data Administrator (CEDAR). Additionally, we build a prototype of this framework and a technology-dependent engine that transforms the output of our framework into a known presentation technology.

  15. Subsurface and Surface Characterization using an Information Framework Model

    NASA Astrophysics Data System (ADS)

    Samuel-Ojo, Olusola

    Groundwater plays a critical dual role as a reservoir of fresh water for human consumption and as a cause of the most severe problems when dealing with construction works below the water table. This is why it is critical to monitor groundwater recharge, distribution, and discharge on a continuous basis. The conventional method of monitoring groundwater employs a network of sparsely distributed monitoring wells and it is laborious, expensive, and intrusive. The problem of sparse data and undersampling reduces the accuracy of sampled survey data giving rise to poor interpretation. This dissertation addresses this problem by investigating groundwater-deformation response in order to augment the conventional method. A blend of three research methods was employed, namely design science research, geological methods, and geophysical methods, to examine whether persistent scatterer interferometry, a remote sensing technique, might augment conventional groundwater monitoring. Observation data (including phase information for displacement deformation from permanent scatterer interferometric synthetic aperture radar and depth to groundwater data) was obtained from the Water District, Santa Clara Valley, California. An information framework model was built and applied, and then evaluated. Data was preprocessed and decomposed into five components or parts: trend, seasonality, low frequency, high frequency and octave bandwidth. Digital elevation models of observed and predicted hydraulic head were produced, illustrating the piezometric or potentiometric surface. The potentiometric surface characterizes the regional aquifer of the valley showing areal variation of rate of percolation, velocity and permeability, and completely defines flow direction, advising characteristics and design levels. The findings show a geologic forcing phenomenon which explains in part the long-term deformation behavior of the valley, characterized by poroelastic, viscoelastic, elastoplastic and

  16. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development

    PubMed Central

    2014-01-01

    Background Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. 
Our analysis revealed that model parameters could be constrained to a standard
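    The "zooming" idea, iteratively shrinking the searched region of parameter space around simulations that best match the measured data, can be sketched in a few lines. The quadratic stand-in model and all numbers are invented; it is deliberately non-identifiable (two parameter values fit the data equally well), echoing the identifiability analysis discussed above:

```python
def model(k):
    # Stand-in for an expensive deterministic simulation with parameter k.
    return k * k - 3 * k + 5

measured = model(2.3)  # pretend this value came from experiment

lo, hi = 0.0, 10.0     # initial feasible region for k
for _ in range(20):
    grid = [lo + (hi - lo) * i / 50 for i in range(51)]
    best = min(grid, key=lambda k: abs(model(k) - measured))
    half = (hi - lo) / 4
    lo, hi = best - half, best + half   # zoom in around the best fit

# k = 0.7 and k = 2.3 reproduce the data equally well; the search
# settles on one of them depending on tie-breaking.
print(round(best, 3))
```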

  17. A modeling framework for the evolution and spread of antibiotic resistance: literature review and model categorization.

    PubMed

    Spicknall, Ian H; Foxman, Betsy; Marrs, Carl F; Eisenberg, Joseph N S

    2013-08-15

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection.

  18. FITS: A Framework for ITS--A Computational Model of Tutoring.

    ERIC Educational Resources Information Center

    Ikeda, Mitsuru; Mizoguchi, Riichiro

    1994-01-01

    Summarizes research activities concerning FITS, a Framework for Intelligent Tutoring Systems, and discusses the major results obtained thus far. Topics include system architecture; domain independent framework; student model module; expertise module; tutoring strategies; and a model of tutor's decision making, including knowledge sources and…

  19. Modeling sports highlights using a time-series clustering framework and model interpretation

    NASA Astrophysics Data System (ADS)

    Radhakrishnan, Regunathan; Otsuka, Isao; Xiong, Ziyou; Divakaran, Ajay

    2004-12-01

    In our past work on sports highlights extraction, we have shown the utility of detecting audience reaction using an audio classification framework. The audio classes in the framework were chosen based on intuition. In this paper, we present a systematic way of identifying the key audio classes for sports highlights extraction using a time-series clustering framework. We treat the low-level audio features as a time series and model the highlight segments as "unusual" events in a background of a "usual" process. The set of audio classes to characterize the sports domain is then identified by analyzing the consistent patterns in each of the clusters output from the time-series clustering framework. The distribution of features from the training data so obtained for each of the key audio classes is parameterized by a Minimum Description Length Gaussian Mixture Model (MDL-GMM). We also interpret the meaning of each of the mixture components of the MDL-GMM for the key audio class (the "highlight" class) that is correlated with highlight moments. Our results show that the "highlight" class is a mixture of audience cheering and the commentator's excited speech. Furthermore, we show that the precision-recall performance for highlights extraction based on this "highlight" class is better than that of our previous approach, which uses only audience cheering as the key highlight class.
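    The core statistical idea, highlight segments as "unusual" events against a "usual" background process, can be sketched without any audio machinery. The feature values below are invented, and a simple z-score threshold stands in for the paper's GMM-based modeling:

```python
import statistics

# Toy low-level 'audio feature' series: a quiet background with one burst,
# standing in for audience reaction (all values invented).
series = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 6.5, 7.0, 6.8, 1.1, 0.9, 1.0]

mu = statistics.mean(series)
sigma = statistics.stdev(series)
# Frames far from the background statistics are flagged as 'unusual'.
unusual = [i for i, x in enumerate(series) if abs(x - mu) > 1.5 * sigma]
print(unusual)  # [6, 7, 8]
```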

  1. Deep inelastic phenomena

    SciTech Connect

    Prescott, C.Y.

    1980-10-01

    Nucleon structure as seen in the context of deep inelastic scattering is discussed. The lectures begin with consideration of the quark-parton model. The model forms the basis of understanding lepton-nucleon inelastic scattering. As improved data in lepton-nucleon scattering at high energies became available, the quark-parton model failed to explain some crucial features of these data. At approximately the same time a candidate theory of strong interactions based on an SU(3) gauge theory of color was being discussed in the literature, and new ideas on the explanation of inelastic scattering data became popular. A new theory of strong interactions, now called quantum chromodynamics, provides a new framework for understanding the data, with a much stronger theoretical foundation, and seems to explain well the features of the data. The lectures conclude with a look at some recent experiments which provide new data at very high energies. These lectures are concerned primarily with charged lepton inelastic scattering and to a lesser extent with neutrino results. Furthermore, due to time and space limitations, topics such as final-state hadron studies and multi-muon production are omitted here. The lectures concentrate on the more central issues: the quark-parton model and concepts of scaling, scale breaking and the ideas of quantum chromodynamics, the Q2 dependence of structure functions, moments, and the important parameter R.

  2. Linking Tectonics and Surface Processes through SNAC-CHILD Coupling: Preliminary Results Towards Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Choi, E.; Kelbert, A.; Peckham, S. D.

    2014-12-01

    We demonstrate that code coupling can be an efficient and flexible method for modeling complicated two-way interactions between tectonic and surface processes with SNAC-CHILD coupling as an example. SNAC is a deep earth process model (a geodynamic/tectonics model), built upon a scientific software framework called StGermain and also compatible with a model coupling framework called Pyre. CHILD is a popular surface process model (a landscape evolution model), interfaced to the CSDMS (Community Surface Dynamics Modeling System) modeling framework. We first present proof-of-concept but non-trivial results from a simplistic coupling scheme. We then report progress towards augmenting SNAC with a Basic Model Interface (BMI), a framework-agnostic standard interface developed by CSDMS that uses the CSDMS Standard Names as controlled vocabulary for model communication across domains. Newly interfaced to BMI, SNAC will be easily coupled with CHILD as well as other BMI-compatible models. In broader context, this work will test BMI as a general and easy-to-implement mechanism for sharing models between modeling frameworks and is a part of the NSF-funded EarthCube Building Blocks project, "Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks."

  3. Evolution of 3-D geologic framework modeling and its application to groundwater flow studies

    USGS Publications Warehouse

    Blome, Charles D.; Smith, David V.

    2012-01-01

    In this Fact Sheet, the authors discuss the evolution of the project's 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and the methodologies used to link geologic and groundwater flow models.

  4. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    ERIC Educational Resources Information Center

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  5. A unified framework for modeling landscape evolution by discrete flows

    NASA Astrophysics Data System (ADS)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts those in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.
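    The role of the friction-like coefficient can be illustrated with a one-dimensional sketch: a flow released at the divide keeps moving while its average drop per unit distance exceeds that coefficient. The stopping rule and all numbers below are hypothetical simplifications of the postulated formulas:

```python
def runout_distance(elev, dx, mu):
    """Distance a flow released at elev[0] travels before stalling.

    The flow continues while total drop / distance travelled >= mu,
    a crude friction-like stopping rule (illustrative only).
    """
    z0 = elev[0]
    for i in range(1, len(elev)):
        if (z0 - elev[i]) / (i * dx) < mu:
            return (i - 1) * dx        # stalls before reaching cell i
    return (len(elev) - 1) * dx        # flow exits the profile

# Concave-up profile: steep near the divide, flattening downslope.
profile = [100, 96, 92, 89, 87, 86, 85.5, 85.2, 85.0]
print(runout_distance(profile, dx=10.0, mu=0.2))  # 70.0
```

    A lower mu (a more mobile flow) lengthens runout, concentrating transport into channels; a higher mu keeps flows short, favoring undissected hillslopes.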

  6. The Foundations of Learning Framework: A Model for School Readiness

    ERIC Educational Resources Information Center

    Sorrels, Barbara

    2012-01-01

    Since the National Education Goals Panel was convened in 1991, school readiness for all children has remained a high priority across our nation. The Foundations of Learning Framework is a tool to understand what it means for a child to be "ready." Preparation for educational success requires two key ingredients--relationships and play. In the…

  7. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    PubMed

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

Model Driven Development is a promising approach to develop high quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page, and of page transitions, by directly operating the prototype. We propose a mapping rule in which design information that is independent of any particular web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  8. System modeling with the DISC framework: evidence from safety-critical domains.

    PubMed

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice.

  9. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework, which pays more attention to the identification of a model's system behavior, control policies, and dispatching routines, and to their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  10. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders…
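The standardized interface this abstract describes can be sketched in a few lines. The sketch below is loosely in the spirit of CSDMS-style interfaces such as the Basic Model Interface, but the method names and the toy bucket model are illustrative assumptions, not the real API:

```python
from abc import ABC, abstractmethod

class ModelInterface(ABC):
    # --- control functions: the framework drives the model ---
    @abstractmethod
    def initialize(self, config): ...
    @abstractmethod
    def update(self): ...          # advance state variables one time step
    @abstractmethod
    def finalize(self): ...

    # --- description functions: the model describes itself ---
    @abstractmethod
    def get_output_var_names(self): ...
    @abstractmethod
    def get_time_step(self): ...

class Bucket(ModelInterface):
    """Toy storage model: drains a fixed fraction of storage per step."""
    def initialize(self, config):
        self.storage = config["storage"]
        self.k = config["k"]
    def update(self):
        self.storage *= (1.0 - self.k)
    def finalize(self):
        pass
    def get_output_var_names(self):
        return ["storage"]
    def get_time_step(self):
        return 1.0

# The caller needs no model-specific knowledge:
m = Bucket()
m.initialize({"storage": 10.0, "k": 0.5})
for _ in range(2):
    m.update()
print(m.storage)  # 2.5
```

Because every process model exposes the same control and description functions, a coupling framework can initialize, step, and interrogate each member of a collection without any model-specific code, which is precisely what enables the plug-and-play composition described above.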

  11. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    SciTech Connect

    Gettelman, Andrew

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  12. The heavy quark parton oxymoron: A mini-review of heavy quark production theory in PQCD

    SciTech Connect

    Tung, W.-K.

    1997-04-20

Conventional perturbative QCD calculations on the production of a heavy quark 'H' consist of two contrasting approaches: the usual QCD parton formalism uses the zero-mass approximation (m{sub H}=0) once above threshold, and treats H just like the other light partons; on the other hand, most recent 'NLO' heavy quark calculations treat m{sub H} as a large parameter and always consider H as a heavy particle, never as a parton, irrespective of the energy scale of the physical process. By their very nature, both these approaches are limited in their regions of applicability. This dichotomy can be resolved in a unified general-mass variable-flavor-number scheme, which retains the m{sub H} dependence at all energies, and which naturally reduces to the two conventional approaches in their respective regions of validity. Recent applications to lepto- and hadro-production of heavy quarks are briefly summarized.

  13. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    SciTech Connect

    David Muth

    2012-07-01

Agricultural residues are the largest near-term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention, and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However, the current integrated model does not quantitatively assess the soil carbon and long-term crop yield impacts of residue removal. Furthermore, the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily nitrogen-gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with…

  14. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    ERIC Educational Resources Information Center

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  15. Design of a component-based integrated environmental modeling framework

    EPA Science Inventory

    Integrated environmental modeling (IEM) includes interdependent science-based components (e.g., models, databases, viewers, assessment protocols) that comprise an appropriate software modeling system. The science-based components are responsible for consuming and producing inform...

  16. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation, or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios from the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  17. Temporo-spatial model construction using the MML and software framework.

    PubMed

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

Development of complex temporo-spatial biological computational models can be a time-consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries, and may require considerable investment of time to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework is a modular XML/HDF5-based specification and set of toolkits that aims to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au. PMID:21947514

  18. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    SciTech Connect

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the ''Saturated Zone Site-Scale Flow Model'' (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km2 and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). 
For the site-scale SZ flow model, the HFM…

  19. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  20. A Framework for Multifaceted Evaluation of Student Models

    ERIC Educational Resources Information Center

    Huang, Yun; González-Brenes, José P.; Kumar, Rohit; Brusilovsky, Peter

    2015-01-01

    Latent variable models, such as the popular Knowledge Tracing method, are often used to enable adaptive tutoring systems to personalize education. However, finding optimal model parameters is usually a difficult non-convex optimization problem when considering latent variable models. Prior work has reported that latent variable models obtained…

  1. A Framework to Develop Symbolic Performance Models of Parallel Applications

    SciTech Connect

    Alam, Sadaf R; Vetter, Jeffrey S

    2006-01-01

Performance and workload modeling has numerous uses at every stage of the high-end computing lifecycle: design, integration, procurement, installation and tuning. Despite the tremendous usefulness of performance models, their construction remains largely a manual, complex, and time-consuming exercise. We propose a new approach to model construction, called modeling assertions (MA), which borrows advantages from both the empirical and analytical modeling techniques. This strategy has many advantages over traditional methods: incremental construction of realistic performance models, straightforward model validation against empirical data, and intuitive error bounding on individual model terms. We demonstrate this new technique on the NAS parallel CG and SP benchmarks by constructing high-fidelity models for the floating-point operation cost, memory requirements, and MPI message volume. These models are driven by a small number of key input parameters, thereby allowing efficient design space exploration of future problem sizes and architectures.
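The modeling-assertion idea can be sketched briefly: declare a closed-form expression for a resource (here, floating-point operations per run of a toy CG-like solver) and validate it against an empirical count within an explicit error bound. The formula and the 5% tolerance below are illustrative assumptions, not the paper's actual NAS CG model:

```python
def flops_model(n, iters):
    # Assumed cost model: one matrix-vector product (2*n^2) plus
    # vector updates (10*n) per iteration of the toy solver
    return iters * (2 * n**2 + 10 * n)

def validate(measured, predicted, tol=0.05):
    """Modeling-assertion check: is the empirical count within tol of the model?"""
    return abs(measured - predicted) / predicted <= tol

predicted = flops_model(n=100, iters=100)   # 2,100,000
print(validate(2.02e6, predicted))          # True: measurement within 5%
```

Because the model is an explicit function of a few input parameters, it can be evaluated for problem sizes and architectures that were never measured, which is the design-space-exploration benefit the abstract highlights.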

  2. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model changes from one to another, all functions of a search technique must be reimplemented because the types of models are different, even if the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
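The model-independence the abstract argues for amounts to writing the search technique once, against a small interface that each model type implements. The sketch below uses hill climbing and a toy bit-string "model"; the interface and names are invented for illustration and are not the framework's actual API:

```python
import random

class SearchableModel:
    """Minimal interface a model type implements once; every search reuses it."""
    def random_candidate(self): raise NotImplementedError
    def neighbors(self, cand): raise NotImplementedError
    def fitness(self, cand): raise NotImplementedError  # higher is better

def hill_climb(model, steps=200, seed=0):
    """Search technique written once, reusable for any SearchableModel."""
    rng = random.Random(seed)
    best = model.random_candidate()
    for _ in range(steps):
        cand = rng.choice(model.neighbors(best))
        if model.fitness(cand) >= model.fitness(best):
            best = cand
    return best

class BitStringModel(SearchableModel):
    """Toy stand-in for a test model: maximize the number of 1-bits."""
    def __init__(self, length=16):
        self.length = length
    def random_candidate(self):
        return [0] * self.length
    def neighbors(self, cand):
        return [cand[:i] + [1 - cand[i]] + cand[i + 1:] for i in range(self.length)]
    def fitness(self, cand):
        return sum(cand)

best = hill_climb(BitStringModel())
print(sum(best))  # climbs toward all ones
```

Switching to a different model type (say, a state machine) then means implementing the three interface methods, not reimplementing `hill_climb`, which is the source of the productivity gain the case studies report.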

  3. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
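The separation ROSE achieves, a library of execution processes applicable to any model, can be illustrated with a minimal sketch. Here a parameter study is written once against a generic inputs-dict-to-output callable; the function names and the drag model are invented for illustration and are not part of the ROSE API:

```python
def parameter_study(model, param_name, values, base_inputs):
    """Execution process written once: run `model` for each value of one
    parameter while holding the remaining inputs fixed."""
    results = []
    for v in values:
        inputs = dict(base_inputs, **{param_name: v})
        results.append((v, model(inputs)))
    return results

# Any model exposed as inputs-dict -> output works unchanged:
def drag_model(inputs):
    # Standard drag equation, used here only as a stand-in model
    return 0.5 * inputs["rho"] * inputs["v"] ** 2 * inputs["cd"] * inputs["area"]

study = parameter_study(drag_model, "v", [10.0, 20.0],
                        {"rho": 1.0, "cd": 0.5, "area": 2.0})
print(study)  # [(10.0, 50.0), (20.0, 200.0)]
```

A Design of Experiments driver, optimizer, or sensitivity study would slot into the same pattern: each is just another function that accepts a model callable, which is what lets a single process library serve every model.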

  4. The common sense model: an organizing framework for knowledge development in nursing.

    PubMed

    Ward, S E

    1993-01-01

    The common sense model emphasizes the importance of persons' perceptions or beliefs about their health and illness, and has potential for guiding knowledge development in nursing. This paper describes the model, reviews the existing relevant literature, and organizes the literature within the framework of the model. Suggestions are made as to how the model can be used for further research in nursing.

  5. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

6. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  7. ASSESSING POPULATION EXPOSURES TO MULTIPLE AIR POLLUTANTS USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...

  8. Neural-network models of learning and memory: leading questions and an emerging framework.

    PubMed

Carpenter, G. A.

    2001-03-01

    Real-time neural-network models provide a conceptual framework for formulating questions about the nature of cognition, an architectural framework for mapping cognitive functions to brain regions, a semantic framework for defining terms, and a computational framework for testing hypotheses. This article considers key questions about how a physical system might simultaneously support one-trial learning and lifetime memories, in the context of neural models that test possible solutions to the problems posed. Model properties point to partial answers, and model limitations lead to new questions. Placing individual system components in the context of a unified real-time network allows analysis to move from the level of neural processes, including learning laws and rules of synaptic transmission, to cognitive processes, including attention and consciousness.

  9. A flexible and efficient multi-model framework in support of water management

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Tran Quoc, Quan; Willems, Patrick

    2016-05-01

    Flexible, fast and accurate water quantity models are essential tools in support of water management. Adjustable levels of model detail and the ability to handle varying spatial and temporal resolutions are requisite model characteristics to ensure that such models can be employed efficiently in various applications. This paper uses a newly developed flexible modelling framework that aims to generate such models. The framework incorporates several approaches to model catchment hydrology, rivers and floodplains, and the urban drainage system by lumping processes on different levels. To illustrate this framework, a case study of integrated hydrological-hydraulic modelling is elaborated for the Grote Nete catchment in Belgium. Three conceptual rainfall-runoff models (NAM, PDM and VHM) were implemented in a generalized model structure, allowing flexibility in the spatial resolution by means of an innovative disaggregation/aggregation procedure. They were linked to conceptual hydraulic models of the rivers in the catchment, which were developed by means of an advanced model structure identification and calibration procedure. The conceptual models manage to emulate the simulation results of a detailed full hydrodynamic model accurately. The models configured using the approaches of this framework are well-suited for many applications in water management due to their very short calculation time, interfacing possibilities and adjustable level of detail.
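The lumped conceptual elements such frameworks compose can be illustrated with the simplest possible example: a single linear reservoir converting rainfall to runoff. Real NAM/PDM/VHM structures are far richer; the recession constant and forcing below are illustrative assumptions only:

```python
def linear_reservoir(rain, k, s0=0.0):
    """Lumped conceptual element: storage S obeys dS/dt = rain - k*S,
    with outflow q = k*S, discretized with a unit time step."""
    s, q = s0, []
    for r in rain:
        s += r - k * s     # update storage with inflow minus last outflow
        q.append(k * s)    # outflow is proportional to storage
    return q

# Impulse of 5 mm of rain, then recession at rate k = 0.5 per step
q = linear_reservoir([5.0, 0.0, 0.0, 0.0], k=0.5)
print([round(v, 3) for v in q])  # [2.5, 1.25, 0.625, 0.312]
```

Chains and parallel combinations of such elements, calibrated against a detailed hydrodynamic model's output, give the fast emulators with adjustable detail that the framework exploits.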

  10. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  12. A MODELLING FRAMEWORK FOR MERCURY CYCLING IN LAKE MICHIGAN

    EPA Science Inventory

    A time dependent mercury model was developed to describe mercury cycling in Lake Michigan. The model addresses dynamic relationships between net mercury loadings and the resulting concentrations of mercury species in the water and sediment. The transformations among three mercury...

  13. A MODELLING FRAMEWORK FOR MERCURY CYCLING IN LAKE MICHIGAN

    EPA Science Inventory

    A time-dependent mercury model was developed to describe mercury cycling in Lake Michigan. The model addresses dynamic relationships between net mercury loadings and the resulting concentrations of mercury species in the water and sediment. The simplified predictive modeling fram...

  14. An Interactive Reference Framework for Modeling a Dynamic Immune System

    PubMed Central

    Spitzer, Matthew H.; Gherardini, Pier Federico; Fragiadakis, Gabriela K.; Bhattacharya, Nupur; Yuan, Robert T.; Hotson, Andrew N.; Finck, Rachel; Carmi, Yaron; Zunder, Eli R.; Fantl, Wendy J.; Bendall, Sean C.; Engleman, Edgar G.; Nolan, Garry P.

    2015-01-01

    Immune cells function in an interacting hierarchy that coordinates activities of various cell types according to genetic and environmental contexts. We developed graphical approaches to construct an extensible immune reference map from mass cytometry data of cells from different organs, incorporating landmark cell populations as flags on the map to compare cells from distinct samples. The maps recapitulated canonical cellular phenotypes and revealed reproducible, tissue-specific deviations. The approach revealed influences of genetic variation and circadian rhythms on immune system structure, enabled direct comparisons of murine and human blood cell phenotypes, and even enabled archival fluorescence-based flow cytometry data to be mapped onto the reference framework. This foundational reference map provides a working definition of systemic immune organization to which new data can be integrated to reveal deviations driven by genetics, environment, or pathology. PMID:26160952

  15. Integrating water quality modeling with ecological risk assessment for nonpoint source pollution control: A conceptual framework

    SciTech Connect

    Chen, Y.D.; McCutcheon, S.C.; Rasmussen, T.C.; Nutter, W.L.; Carsel, R.F.

    1993-01-01

    The historical development of water quality protection goals and strategies in the United States is reviewed. The review leads to the identification and discussion of three components (i.e., management mechanism, environmental investigation approaches, and environmental assessment and criteria) for establishing a management framework for nonpoint source pollution control. Water quality modeling and ecological risk assessment are the two most important and promising approaches to the operation of the proposed management framework. A conceptual framework that shows the general integrative relationships between water quality modeling and ecological risk assessment is presented. (Copyright (c) 1993 IAWQ.)

  16. A scalable delivery framework and a pricing model for streaming media with advertisements

    NASA Astrophysics Data System (ADS)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content. The price is determined based on the total ad-viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and the various scheduling policies through extensive simulation, in terms of numerous metrics including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.
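The subsidy relationship in this pricing model can be sketched in one function: the content price drops with the total ad-viewing time a client accepts. The linear form and every constant below are assumptions made for illustration, not the paper's calibrated model:

```python
def price(base_price, ad_seconds, revenue_per_second, floor=0.0):
    """Ad revenue subsidizes the media price, which never drops below `floor`."""
    return max(floor, base_price - revenue_per_second * ad_seconds)

# A client who accepts 120 s of ads at an assumed $0.02/s subsidy:
print(round(price(4.0, ad_seconds=120, revenue_per_second=0.02), 2))  # 1.6
```

A scheduler could use such a function to trade off the price offered to a client against the ad load it schedules, which is the coupling between ad allocation and scheduling policy that the simulations examine.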

  17. BioASF: a framework for automatically generating executable pathway models specified in BioPAX

    PubMed Central

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K. Anton; Abeln, Sanne; Heringa, Jaap

    2016-01-01

    Motivation: Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. Results: To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. Availability and Implementation: The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl PMID:27307645

  18. Computational Morphodynamics: A modeling framework to understand plant growth

    PubMed Central

    Chickarmane, Vijay; Roeder, Adrienne H.K.; Tarr, Paul T.; Cunha, Alexandre; Tobin, Cory; Meyerowitz, Elliot M.

    2014-01-01

Computational morphodynamics utilizes computer modeling to understand the development of living organisms over space and time. Results from biological experiments are used to construct accurate and predictive models of growth. These models are then used to make novel predictions providing further insight into the processes in question, which can be tested experimentally to either confirm or rule out the validity of the computational models. This review highlights two fundamental issues: (1) models should span and integrate single-cell behavior with tissue development, and (2) the feedback between the mechanics of growth and chemical or molecular signaling must be understood. We review different approaches to model plant growth and discuss a variety of model types that can be implemented, with the aim of demonstrating how this methodology can be used to explore the morphodynamics of plant development. PMID:20192756

  19. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models: parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (e.g. high explosive, rocket propellant,...) is then presented using a directed graph model.
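The first of the three approaches named above, parameter range binning, can be sketched generically: propagate samples of an uncertain input through a deterministic model, then bin the outputs into discrete outcome states whose frequencies serve as branch probabilities in the probabilistic model. The stand-in physics function and thresholds below are hypothetical, not from the paper.

```python
import random

def peak_temperature(drop_height_m):
    """Stand-in deterministic model (hypothetical): impact severity -> peak temperature (K)."""
    return 300.0 + 45.0 * drop_height_m

def bin_outcomes(outputs, thresholds):
    """Bin deterministic outputs into discrete states; frequencies become branch probabilities."""
    counts = [0] * (len(thresholds) + 1)
    for value in outputs:
        counts[sum(value > t for t in thresholds)] += 1
    return [c / len(outputs) for c in counts]

random.seed(0)
heights = [random.uniform(0.0, 2.0) for _ in range(10_000)]   # uncertain drop height (m)
temps = [peak_temperature(h) for h in heights]
# Three branches for an event tree: below 330 K, 330-360 K, above 360 K
probs = bin_outcomes(temps, [330.0, 360.0])
print(probs)
```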

  20. Multiple-species analysis of point count data: A more parsimonious modelling framework

    USGS Publications Warehouse

    Alldredge, M.W.; Pollock, K.H.; Simons, T.R.; Shriner, S.A.

    2007-01-01

1. Although population surveys often provide information on multiple species, these data are rarely analysed within a multiple-species framework despite the potential for more efficient estimation of population parameters. 2. We have developed a multiple-species modelling framework that uses similarities in capture/detection processes among species to model multiple species data more parsimoniously. We present examples of this approach applied to distance, time of detection and multiple observer sampling for avian point count data. 3. Models that included species as a covariate and individual species effects were generally selected as the best models for distance sampling, but group models without species effects performed best for the time of detection and multiple observer methods. Population estimates were more precise for no-species-effect models than for species-effect models, demonstrating the benefits of exploiting species' similarities when modelling multiple species data. Partial species-effect models and additive models were also useful because they modelled similarities among species while allowing for species differences. 4. Synthesis and applications. We recommend the adoption of multiple-species modelling because of its potential for improved population estimates. This framework will be particularly beneficial for modelling count data from rare species because information on the detection process can be 'borrowed' from more common species. The multiple-species modelling framework presented here is applicable to a wide range of sampling techniques and taxa. © 2007 The Authors.
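The trade-off the authors describe, borrowing strength across species versus allowing species effects, can be illustrated with a toy likelihood comparison: when detection rates are similar across species, a pooled one-parameter detection model beats a species-effect model on AIC. The detection counts below are invented for illustration and are not from the study.

```python
import math

# Hypothetical detections out of known trials, per species
data = {"species_A": (78, 100), "species_B": (82, 100), "species_C": (75, 100)}

def binom_loglik(k, n, p):
    """Binomial log-likelihood of k detections in n trials at detection probability p."""
    return (k * math.log(p) + (n - k) * math.log(1 - p)
            + math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1))

# Pooled ("no species effect") model: one shared detection probability
K = sum(k for k, n in data.values())
N = sum(n for k, n in data.values())
ll_pool = sum(binom_loglik(k, n, K / N) for k, n in data.values())
aic_pool = -2 * ll_pool + 2 * 1                 # one parameter

# Species-effect model: a separate detection probability for each species
ll_sp = sum(binom_loglik(k, n, k / n) for k, n in data.values())
aic_sp = -2 * ll_sp + 2 * len(data)             # one parameter per species

print(f"pooled AIC={aic_pool:.2f}  species-effect AIC={aic_sp:.2f}")
```

With these similar detection rates the pooled model attains the lower AIC, which is the parsimony argument of the abstract in miniature.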

  1. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

Recently a smallholder socio-hydrological modelling framework was proposed and deployed to understand the underlying dynamics of the Agrarian Crisis in the Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further expands the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as paddy, jowar and soybean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork 50 smallholders will be interviewed, putting to the test the socio-hydrological assumptions on the hydrology and capital equations, and the corresponding closure relationships, incorporated in the current model. Besides the assumptions, the questionnaires will be used to better understand the hydrological reality of the farm holders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step in guiding subsequent research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.

  2. Integrated Bayesian network framework for modeling complex ecological issues.

    PubMed

    Johnson, Sandra; Mengersen, Kerrie

    2012-07-01

    The management of environmental problems is multifaceted, requiring varied and sometimes conflicting objectives and perspectives to be considered. Bayesian network (BN) modeling facilitates the integration of information from diverse sources and is well suited to tackling the management challenges of complex environmental problems. However, combining several perspectives in one model can lead to large, unwieldy BNs that are difficult to maintain and understand. Conversely, an oversimplified model may lead to an unrealistic representation of the environmental problem. Environmental managers require the current research and available knowledge about an environmental problem of interest to be consolidated in a meaningful way, thereby enabling the assessment of potential impacts and different courses of action. Previous investigations of the environmental problem of interest may have already resulted in the construction of several disparate ecological models. On the other hand, the opportunity may exist to initiate this modeling. In the first instance, the challenge is to integrate existing models and to merge the information and perspectives from these models. In the second instance, the challenge is to include different aspects of the environmental problem incorporating both the scientific and management requirements. Although the paths leading to the combined model may differ for these 2 situations, the common objective is to design an integrated model that captures the available information and research, yet is simple to maintain, expand, and refine. BN modeling is typically an iterative process, and we describe a heuristic method, the iterative Bayesian network development cycle (IBNDC), for the development of integrated BN models that are suitable for both situations outlined above. The IBNDC approach facilitates object-oriented BN (OOBN) modeling, arguably viewed as the next logical step in adaptive management modeling, and that embraces iterative development

  3. Introducing MERGANSER: A Flexible Framework for Ecological Niche Modeling

    NASA Astrophysics Data System (ADS)

    Klawonn, M.; Dow, E. M.

    2015-12-01

    Ecological Niche Modeling (ENM) is a collection of techniques to find a "fundamental niche", the range of environmental conditions suitable for a species' survival in the absence of inter-species interactions, given a set of environmental parameters. Traditional approaches to ENM face a number of obstacles including limited data accessibility, data management problems, computational costs, interface usability, and model validation. The MERGANSER system, which stands for Modeling Ecological Residency Given A Normalized Set of Environmental Records, addresses these issues through powerful data persistence and flexible data access, coupled with a clear presentation of results and fine-tuned control over model parameters. MERGANSER leverages data measuring 72 weather related phenomena, land cover, soil type, population, species occurrence, general species information, and elevation, totaling over 1.5 TB of data. To the best of the authors' knowledge, MERGANSER uses higher-resolution spatial data sets than previously published models. Since MERGANSER stores data in an instance of Apache SOLR, layers generated in support of niche models are accessible to users via simplified Apache Lucene queries. This is made even simpler via an HTTP front end that generates Lucene queries automatically. Specifically, a user need only enter the name of a place and a species to run a model. Using this approach to synthesizing model layers, the MERGANSER system has successfully reproduced previously published niche model results with a simplified user experience. Input layers for the model are generated dynamically using OpenStreetMap and SOLR's spatial search functionality. Models are then run using either user-specified or automatically determined parameters after normalizing them into a common grid. Finally, results are visualized in the web interface, which allows for quick validation. Model results and all surrounding metadata are also accessible to the user for further study.
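The simplified query interface described above, an HTTP front end that generates Lucene queries from a place and species name, can be imagined as string templating over Lucene's query syntax. The field names in this sketch are hypothetical, not MERGANSER's actual SOLR schema.

```python
def niche_query(species, place):
    """Compose a simplified Lucene-style query for a species/place pair.

    Field names ('species', 'place') are hypothetical; a real front end would
    target the schema of the backing SOLR instance. Phrases are quoted so that
    multi-word names match exactly.
    """
    return f'species:"{species}" AND place:"{place}"'

print(niche_query("Gavia immer", "Lake Champlain"))
```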

  4. Integration of the Radiation Belt Environment Model Into the Space Weather Modeling Framework

    NASA Technical Reports Server (NTRS)

    Glocer, A.; Toth, G.; Fok, M.; Gombosi, T.; Liemohn, M.

    2009-01-01

    We have integrated the Fok radiation belt environment (RBE) model into the space weather modeling framework (SWMF). RBE is coupled to the global magnetohydrodynamics component (represented by the Block-Adaptive-Tree Solar-wind Roe-type Upwind Scheme, BATS-R-US, code) and the Ionosphere Electrodynamics component of the SWMF, following initial results using the Weimer empirical model for the ionospheric potential. The radiation belt (RB) model solves the convection-diffusion equation of the plasma in the energy range of 10 keV to a few MeV. In stand-alone mode RBE uses Tsyganenko's empirical models for the magnetic field, and Weimer's empirical model for the ionospheric potential. In the SWMF the BATS-R-US model provides the time dependent magnetic field by efficiently tracing the closed magnetic field-lines and passing the geometrical and field strength information to RBE at a regular cadence. The ionosphere electrodynamics component uses a two-dimensional vertical potential solver to provide new potential maps to the RBE model at regular intervals. We discuss the coupling algorithm and show some preliminary results with the coupled code. We run our newly coupled model for periods of steady solar wind conditions and compare our results to the RB model using an empirical magnetic field and potential model. We also simulate the RB for an active time period and find that there are substantial differences in the RB model results when changing either the magnetic field or the electric field, including the creation of an outer belt enhancement via rapid inward transport on the time scale of tens of minutes.

  5. Multiscale Multiphysics Lithium-Ion Battery Model with Multidomain Modular Framework

    SciTech Connect

    Kim, G. H.

    2013-01-01

Lithium-ion batteries (LIBs), which power the recent wave of ubiquitous personal electronics, are also believed to be a key enabler of vehicle powertrain electrification on the path toward a sustainable transportation future. Over the past several years, the National Renewable Energy Laboratory (NREL) has developed the Multi-Scale Multi-Domain (MSMD) model framework, an expandable platform and a generic, modularized, flexible framework resolving interactions among multiple physics occurring at varied length and time scales in LIBs [1]. NREL has continued to enhance the functionality of the framework and to develop constituent models in the context of the MSMD framework, responding to the U.S. Department of Energy's CAEBAT program objectives. This talk will introduce recent advancements in NREL's LIB modeling research with regard to scale bridging, multi-physics integration, and numerical scheme development.

  6. The Conceptual Framework of Factors Affecting Shared Mental Model

    ERIC Educational Resources Information Center

    Lee, Miyoung; Johnson, Tristan; Lee, Youngmin; O'Connor, Debra; Khalil, Mohammed

    2004-01-01

    Many researchers have paid attention to the potentiality and possibility of the shared mental model because it enables teammates to perform their job better by sharing team knowledge, skills, attitudes, dynamics and environments. Even though theoretical and experimental evidences provide a close relationship between the shared mental model and…

  7. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  8. The Relational-Cultural Model: A Framework for Group Process

    ERIC Educational Resources Information Center

    Comstock, Dana L.; Duffey, Thelma; St. George, Holly

    2002-01-01

    The relational-cultural model of psychotherapy has been evolving for the past 20 years. Within this model, difficult group dynamics are conceptualized as the playing out of the central relational paradox. This paradox recognizes that an individual may yearn for connection but, out of a sense of fear, simultaneously employ strategies that restrict…

  9. A Model Framework for Course Materials Construction (Second Edition).

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    Designed for use by Coast Guard course writers, curriculum developers, course coordinators, and instructors as a decision-support system, this publication presents a model that translates the Intraservices Procedures for Instructional Systems Development curriculum design model into materials usable by classroom teachers and students. Although…

  10. The Social-Ecological Model: A Framework for Violence Prevention

    ERIC Educational Resources Information Center

    Centers for Disease Control and Prevention, 2002

    2002-01-01

    The ultimate goal of the work of violence prevention is to stop violence before it begins. The Centers for Disease Control (CDC) uses a four-level social-ecological model (SEM) to better understand and prevent violence. The four levels are: (1) Individual; (2) Relationship; (3) Community; and (4) Societal. This model considers the complex…

  11. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  12. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    SciTech Connect

    JIANG, YI

    2007-01-16

Cancer remains one of the leading causes of death due to disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulations, cellular-level dynamics and intercellular interactions, and extracellular-level chemical dynamics. The intracellular protein regulations and signaling pathways are described by Boolean networks. The cellular-level growth and division dynamics, cellular adhesion, and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data in both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
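The intracellular layer of such a model, a Boolean network, can be sketched very compactly: each node is on or off, and all nodes update synchronously from the previous state. The toy wiring below (a growth signal driving a cyclin, vetoed by a tumor suppressor) is invented for illustration and is not the paper's actual regulatory network.

```python
# Toy synchronous Boolean network; node names and wiring are hypothetical.
rules = {
    "growth_signal": lambda s: s["growth_signal"],                   # external input, held constant
    "suppressor":    lambda s: s["suppressor"],                      # held constant in this sketch
    "cyclin":        lambda s: s["growth_signal"] and not s["suppressor"],
    "divide":        lambda s: s["cyclin"],
}

def step(state):
    """One synchronous update: every rule reads the previous state."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"growth_signal": True, "suppressor": False, "cyclin": False, "divide": False}
for _ in range(3):
    state = step(state)
print(state)  # with the signal on and the suppressor off, the network settles into division
```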

  13. A framework for multi-criteria assessment of model enhancements

    NASA Astrophysics Data System (ADS)

    Francke, Till; Foerster, Saskia; Brosinsky, Arlena; Delgado, José; Güntner, Andreas; López-Tarazón, José A.; Bronstert, Axel

    2016-04-01

Modellers are often faced with unsatisfactory model performance for a specific setup of a hydrological model. In these cases, the modeller may try to improve the setup by addressing selected causes of the model errors (i.e. data errors, structural errors). This leads to adding certain "model enhancements" (MEs), e.g. climate data based on more monitoring stations, improved calibration data, or modifications to process formulations. However, deciding on which MEs to implement remains a matter of expert knowledge, guided by some sensitivity analysis at best. When multiple MEs have been implemented, a resulting improvement in model performance is not easily attributed, especially when considering different aspects of this improvement (e.g. better performance dynamics vs. reduced bias). In this study we present an approach for comparing the effects of multiple MEs in the face of multiple improvement aspects. A stepwise selection approach and structured plots help in addressing the multidimensionality of the problem. The approach is applied to a case study, which employs the meso-scale hydrosedimentological model WASA-SED for a sub-humid catchment. The results suggest that the effects of the MEs are quite diverse: some MEs (e.g. augmented rainfall data) cause improvements in almost all aspects, while the effect of other MEs is restricted to a few aspects or even deteriorates some. These specific results may not be generalizable. However, we suggest that studies like this one can facilitate identifying the most promising MEs to implement.

  14. Developing an Interdisciplinary Curriculum Framework for Aquatic-Ecosystem Modeling

    ERIC Educational Resources Information Center

    Saito, Laurel; Segale, Heather M.; DeAngelis, Donald L.; Jenkins, Stephen H.

    2007-01-01

    This paper presents results from a July 2005 workshop and course aimed at developing an interdisciplinary course on modeling aquatic ecosystems that will provide the next generation of practitioners with critical skills for which formal training is presently lacking. Five different course models were evaluated: (1) fundamentals/general principles…

  15. A Framework for Modeling Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.

  16. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    NASA Astrophysics Data System (ADS)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-05-01

    The mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  17. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGES

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; Hanks, Byron W.; Foulk, James W.; Battaile, Corbett C.

    2016-04-25

Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  18. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...

  19. A big-microsite framework for soil carbon modeling.

    PubMed

    Davidson, Eric A; Savage, Kathleen E; Finzi, Adrien C

    2014-12-01

Soil carbon cycling processes potentially play a large role in biotic feedbacks to climate change, but little agreement exists at present on what the core of numerical soil C cycling models should look like. In contrast, most canopy models of photosynthesis and leaf gas exchange share a common 'Farquhar-model' core structure. Here, we explore why a similar core model structure for heterotrophic soil respiration remains elusive and how a pathway to that goal might be envisioned. The spatial and temporal variation in soil microsite conditions greatly complicates modeling efforts, but we believe it is possible to develop a tractable number of parameterizable equations that are organized into a coherent, modular, numerical model structure. First, we show parallels in insights gleaned from linking Arrhenius and Michaelis-Menten kinetics for both photosynthesis and soil respiration. Additional equations and layers of complexity are then added to simulate substrate supply. For soils, model modules that simulate carbon stabilization processes will be key to estimating the fraction of soil C that is accessible to enzymes. Potential modules for dynamic photosynthate input, wetting-event inputs, freeze-thaw impacts on substrate diffusion, aggregate turnover, soluble-C sorption, gas transport, methane respiration, and microbial dynamics are described for conceptually and numerically linking our understanding of fast-response processes of soil gas exchange with longer-term dynamics of soil carbon and nitrogen stocks.
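The Arrhenius and Michaelis-Menten linkage mentioned above can be sketched as a single rate expression: an Arrhenius temperature response multiplied by Michaelis-Menten limitation on enzyme-accessible substrate. The function name and parameter values below are illustrative assumptions, not the paper's fitted model.

```python
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def respiration(temp_k, substrate, v_ref=1.0, ea=60_000.0, t_ref=293.15, km=0.5):
    """Heterotrophic respiration rate: Arrhenius temperature sensitivity times
    Michaelis-Menten limitation by enzyme-accessible substrate (illustrative values).

    At temp_k == t_ref the Arrhenius factor is exactly 1, so the rate reduces
    to pure substrate limitation v_ref * S / (km + S).
    """
    arrhenius = math.exp(-(ea / R) * (1.0 / temp_k - 1.0 / t_ref))
    michaelis_menten = substrate / (km + substrate)
    return v_ref * arrhenius * michaelis_menten

# Warming raises the rate at fixed substrate; substrate scarcity lowers it.
print(respiration(293.15, 1.0), respiration(303.15, 1.0), respiration(303.15, 0.1))
```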

  20. The Community Earth System Model: A Framework for Collaborative Research

    SciTech Connect

Hurrell, Jim; Holland, Marika M.; Gent, Peter R.; Ghan, Steven J.; Kay, Jennifer; Kushner, P.; Lamarque, J.-F.; Large, William G.; Lawrence, David M.; Lindsay, Keith; Lipscomb, William; Long, Matthew; Mahowald, N.; Marsh, D.; Neale, Richard; Rasch, Philip J.; Vavrus, Steven J.; Vertenstein, Mariana; Bader, David C.; Collins, William D.; Hack, James; Kiehl, J. T.; Marshall, Shawn

    2013-09-30

    The Community Earth System Model (CESM) is a flexible and extensible community tool used to investigate a diverse set of earth system interactions across multiple time and space scales. This global coupled model is a natural evolution from its predecessor, the Community Climate System Model, following the incorporation of new earth system capabilities. These include the ability to simulate biogeochemical cycles, atmospheric chemistry, ice sheets, and a high-top atmosphere. These and other new model capabilities are enabling investigations into a wide range of pressing scientific questions, providing new predictive capabilities and increasing our collective knowledge about the behavior and interactions of the earth system. Simulations with numerous configurations of the CESM have been provided to the Coupled Model Intercomparison Project Phase 5 (CMIP5) and are being analyzed by the broader community of scientists. Additionally, the model source code and associated documentation are freely available to the scientific community to use for earth system studies, making it a true community tool. Here we describe this earth modeling system, its various possible configurations, and illustrate its capabilities with a few science highlights.

  1. An Integrated Model Framework of Catchment-Scale Ecohydrological Processes

    NASA Astrophysics Data System (ADS)

    Niu, G.; Troch, P. A.; Paniconi, C.; Zeng, X.; Scott, R. L.; Huxman, T. E.; Pelletier, J. D.

    2012-12-01

The interactions between atmospheric, hydrological, and ecological processes at various spatial and temporal scales are not fully represented in most hydrometeorological, ecohydrological, and Earth System Models. We present a fully integrated catchment-scale ecohydrological model, consisting of a 3-dimensional (3D) process-based hydrological model coupled to a land surface model (LSM), and test it over an energy-limited catchment (8.4 km2) of the Sleepers River watershed in Vermont and a water-limited catchment in Arizona (7.92 ha). The hydrological model (CATHY) describes 3D subsurface flow in variably saturated porous media and surface routing on hillslopes and in stream channels, while the LSM, an augmented version of the Noah LSM with multiple parameterization schemes (NoahMP), accounts for energy, water, and carbon flux exchanges between the land surface and the atmosphere. CATHY and NoahMP are coupled through an exchange of fluxes and state variables. The coupled CATHY/NoahMP model, with minor calibration, performs well in simulating the observed snow mass and discharge. In the energy-limited catchment, where runoff is dominant, the coupled model at both 90 m and 30 m resolutions simulated the observed discharge in response to snowmelt better than did the 1D NoahMP. The coupled model also simulates surface energy, water, and CO2 fluxes reasonably well at various temporal scales over the water-limited catchment. The 3D coupled model produced wetter soils in lowland areas along stream rills and channels through re-infiltration of lateral overland flow. This water subsidy provides plants with favorable conditions to sustain more persistent leaf area and CO2 and ET fluxes during drought years and dry-down periods.

  2. Bayesian model selection framework for identifying growth patterns in filamentous fungi.

    PubMed

    Lin, Xiao; Terejanu, Gabriel; Shrestha, Sajan; Banerjee, Sourav; Chanda, Anindya

    2016-06-01

    This paper describes a rigorous methodology for quantification of model errors in fungal growth models. This is essential to choose the model that best describes the data and guide modeling efforts. Mathematical modeling of the growth of filamentous fungi is necessary in fungal biology for gaining a systems-level understanding of hyphal and colony behaviors in different environments. A critical challenge in the development of these mathematical models arises from the indeterminate nature of their colony architecture, which is a result of processing diverse intracellular signals induced in response to a heterogeneous set of physical and nutritional factors. There exists a practical gap in connecting fungal growth models with measurement data. Here, we address this gap by introducing the first unified computational framework based on Bayesian inference that can quantify individual model errors and rank the statistical models based on their descriptive power against data. We show that this Bayesian model comparison is just a natural formalization of Occam's razor. The application of this framework is discussed in comparing three models in the context of synthetic data generated from a known true fungal growth model. This framework of model comparison achieves a trade-off between data fitness and model complexity, and the quantified model error not only helps in calibrating and comparing the models, but also in making better predictions and guiding model refinements. PMID:27000772

  3. A model integration framework for linking SWAT and MODFLOW

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Hydrological response and transport phenomena are driven by atmospheric, surface and subsurface processes. These complex processes occur at different spatiotemporal scales requiring comprehensive modeling to assess the impact of anthropogenic activity on hydrology and fate and transport of chemical ...

  4. Effective Thermal Conductivity Modeling of Sandstones: SVM Framework Analysis

    NASA Astrophysics Data System (ADS)

    Rostami, Alireza; Masoudi, Mohammad; Ghaderi-Ardakani, Alireza; Arabloo, Milad; Amani, Mahmood

    2016-06-01

    Among the most significant physical characteristics of porous media, the effective thermal conductivity (ETC) is used for estimating the efficiency of thermal enhanced oil recovery processes, hydrocarbon reservoir thermal design, and numerical simulation. This paper reports the implementation of an innovative least square support vector machine (LS-SVM) algorithm for the development of an enhanced model capable of predicting the ETCs of dry sandstones. By means of several statistical parameters, the validity of the presented model was evaluated. The predictions of the developed model for the ETCs of dry sandstones were in excellent agreement with the reported data, with a coefficient of determination (R2) of 0.983 and an average absolute relative deviation of 0.35%. Results from the present research show that the proposed LS-SVM model is robust, reliable, and efficient in calculating the ETCs of sandstones.
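    LS-SVM regression reduces training to a single linear system in the dual variables, which is what distinguishes it from standard SVM training by quadratic programming. The sketch below shows the general technique under illustrative assumptions (an RBF kernel, synthetic two-feature data standing in for sandstone descriptors); it is not the authors' trained ETC model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic two-feature data standing in for sandstone descriptors and measured ETCs.
X = rng.uniform(0.0, 1.0, (40, 2))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0.0, 0.05, 40)

def rbf(A, B, sigma=0.5):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=100.0, sigma=0.5):
    """LS-SVM training: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(Xnew, X, b, alpha, sigma=0.5):
    return rbf(Xnew, X, sigma) @ alpha + b

b, alpha = lssvm_fit(X, y)
pred = lssvm_predict(X, X, b, alpha)
r2 = 1.0 - float(np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2))
print(round(r2, 3))
```

    The regularization parameter gamma and kernel width sigma play the role of the hyperparameters that the authors would have tuned against measured sandstone data.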

  5. A Model Framework for Science and Other Course Materials Construction.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    A model is presented to provide guidance for Coast Guard writers, curriculum developers, course coordinators, and instructors who intend to update or draft course materials. Detailed instructions are provided for developing instructor's guides and student's guides. (CS)

  6. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  7. The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework

    PubMed Central

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-01-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of the Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  9. Model Adaptation for Prognostics in a Particle Filtering Framework

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar; Goebel, Kai Frank

    2011-01-01

    One of the key motivating factors for using particle filters for prognostics is the ability to include model parameters as part of the state vector to be estimated. This performs model adaptation in conjunction with state tracking, and thus produces a tuned model that can be used for long-term predictions. This feature of particle filters works largely because they are not subject to the "curse of dimensionality", i.e., the exponential growth of computational complexity with state dimension. However, in practice, this property holds only for "well-designed" particle filters as dimensionality increases. This paper explores the notion of wellness of design in the context of predicting remaining useful life for individual discharge cycles of Li-ion batteries. Prognostic metrics are used to analyze the tradeoff between different model designs and prediction performance. Results demonstrate how sensitivity analysis may be used to arrive at a well-designed prognostic model that can take advantage of the model adaptation properties of a particle filter.
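    The state-augmentation idea can be sketched in a few lines: the unknown model parameter is appended to the state vector and given small artificial dynamics so the filter can keep adapting it while tracking. This toy sampling-importance-resampling (SIR) example uses a hypothetical exponential-decay model and made-up noise levels, not the paper's battery model.

```python
import math
import random

random.seed(2)

# Hypothetical truth: exponential decay x_k = exp(-lam * k) with unknown rate lam.
true_lam, sigma_meas = 0.05, 0.02
obs = [math.exp(-true_lam * k) + random.gauss(0.0, sigma_meas) for k in range(60)]

# Augmented particles (x, lam): the model parameter rides along in the state vector.
N = 2000
particles = [(1.0, random.uniform(0.0, 0.2)) for _ in range(N)]

for z in obs:
    # Weight by measurement likelihood, then resample (SIR).
    w = [math.exp(-0.5 * ((z - x) / sigma_meas) ** 2) for x, _ in particles]
    particles = random.choices(particles, weights=w, k=N)
    # Propagate: each particle evolves with its own lam; small jitter on lam
    # (artificial parameter dynamics) keeps the filter able to adapt.
    particles = [(x * math.exp(-lam), max(0.0, lam + random.gauss(0.0, 1e-3)))
                 for x, lam in particles]

lam_est = sum(lam for _, lam in particles) / N
print(round(lam_est, 3))
```

    After the run, the posterior mean of the augmented parameter should sit close to the true decay rate, which is the "tuned model" the abstract refers to.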

  10. Design theoretic analysis of three system modeling frameworks.

    SciTech Connect

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of System (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of unmanned aerial vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  11. A full annual cycle modeling framework for American black ducks

    USGS Publications Warehouse

    Robinson, Orin J.; McGowan, Conor; Devers, Patrick K.; Brook, Rodney W.; Huang, Min; Jones, Malcom; McAuley, Daniel G.; Zimmerman, Guthrie

    2016-01-01

    American black ducks (Anas rubripes) are a harvested, international migratory waterfowl species in eastern North America. Despite an extended period of restrictive harvest regulations, the black duck population is still below the population goal identified in the North American Waterfowl Management Plan (NAWMP). It has been hypothesized that density-dependent factors restrict population growth in the black duck population and that habitat management (increases, improvements, etc.) may be a key component of growing black duck populations and reaching the prescribed NAWMP population goal. Using banding data from 1951 to 2011 and breeding population survey data from 1990 to 2014, we developed a full annual cycle population model for the American black duck. This model uses the seven management units as set by the Black Duck Joint Venture, allows movement into and out of each unit during each season, and models survival and fecundity for each region separately. We compare model population trajectories with observed population data and abundance estimates from the breeding season counts to show the accuracy of this full annual cycle model. With this model, we then show how to simulate the effects of habitat management on the continental black duck population.

  12. A fuzzy rule based framework for noise annoyance modeling.

    PubMed

    Botteldooren, Dick; Verkeyn, Andy; Lercher, Peter

    2003-09-01

    Predicting the effect of noise on individual people and small groups is an extremely difficult task due to the influence of a multitude of factors that vary from person to person and from context to context. Moreover, noise annoyance is inherently a vague concept. That is why, in this paper, it is argued that noise annoyance models should identify a fuzzy set of possible effects rather than seek a very accurate crisp prediction. Fuzzy rule based models seem ideal candidates for this task. This paper provides the theoretical background for building these models. Existing empirical knowledge is used to extract a few typical rules that allow making the model more specific for small groups of individuals. The resulting model is tested on two large-scale social surveys augmented with exposure simulations. The testing demonstrates how this new way of thinking about noise effect modeling can be used in practice both in management support as a "noise annoyance adviser" and in social science for testing hypotheses such as the effect of noise sensitivity or the degree of urbanization.
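    The core idea of returning a fuzzy set of possible effects rather than a crisp prediction can be sketched with a few Mamdani-style rules. All membership functions, labels, and thresholds below are illustrative assumptions, not the survey-calibrated rules of the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def annoyance_possibilities(lden_db, sensitivity):
    """Return a fuzzy set over annoyance labels, not a crisp score.

    Inputs: a day-evening-night noise level in dB and a 0..1 noise-sensitivity
    score (both ranges are illustrative, not survey-derived).
    """
    exposure = {
        "low": tri(lden_db, 30, 40, 55),
        "medium": tri(lden_db, 45, 57, 70),
        "high": tri(lden_db, 60, 75, 90),
    }
    sens = {"low": 1.0 - sensitivity, "high": sensitivity}
    # Mamdani-style rules: firing strength = min of antecedent memberships,
    # aggregated across rules with max.
    rules = [
        (("low", "low"), "not annoyed"),
        (("low", "high"), "slightly annoyed"),
        (("medium", "low"), "slightly annoyed"),
        (("medium", "high"), "annoyed"),
        (("high", "low"), "annoyed"),
        (("high", "high"), "highly annoyed"),
    ]
    out = {}
    for (e, s), label in rules:
        strength = min(exposure[e], sens[s])
        out[label] = max(out.get(label, 0.0), strength)
    return out

fuzzy_set = annoyance_possibilities(lden_db=65, sensitivity=0.7)
print(fuzzy_set)
```

    The output assigns a possibility degree to each annoyance label, so a noise-sensitive person at the same exposure gets a different fuzzy set rather than a single adjusted number.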

  13. Next Generation Framework for Aquatic Modeling of the Earth System (NextFrAMES)

    NASA Astrophysics Data System (ADS)

    Fekete, B. M.; Wollheim, W. M.; Lakhankar, T.; Vorosmarty, C. J.

    2008-12-01

    Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the surrounding IT infrastructure needed to carry out these detailed model computations is growing increasingly complex as well. To be accurate and useful, Earth System models must manage a vast amount of data in heterogeneous computing environments ranging from single-CPU systems to Beowulf-type computer clusters. Scientists developing Earth System models increasingly confront obstacles associated with IT infrastructure. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves. Over the course of the last fifteen years, the University of New Hampshire developed several modeling frameworks independently of the above-mentioned efforts (Data Assembler, Framework for Aquatic Modeling of the Earth System, and NextFrAMES, which is continued at CCNY). While the UNH modeling frameworks have numerous similarities to those developed by other teams, these frameworks, in particular the latest NextFrAMES, represent a novel model development paradigm. While other modeling frameworks focus on providing services to modelers to perform various tasks, NextFrAMES strives to hide all of those services and provide a new approach for modelers to express their scientific thoughts.
From a scientific perspective, most models have two core elements: the overall model structure (defining the linkages between the simulated processes

  14. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with haptics update rates above 1,000 Hz and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  15. Development of a practical modeling framework for estimating the impact of wind technology on bird populations

    SciTech Connect

    Morrison, M.L.; Pollock, K.H.

    1997-11-01

    One of the most pressing environmental concerns related to wind project development is the potential for avian fatalities caused by the turbines. The goal of this project is to develop a useful, practical modeling framework for evaluating potential wind power plant impacts that can be generalized to most bird species. This modeling framework could be used to get a preliminary understanding of the likelihood of significant impacts to birds, in a cost-effective way. The authors accomplish this by (1) reviewing the major factors that can influence the persistence of a wild population; (2) briefly reviewing various models that can aid in estimating population status and trend, including methods of evaluating model structure and performance; (3) reviewing survivorship and population projections; and (4) developing a framework for using models to evaluate the potential impacts of wind development on birds.

  16. Model Components of the Certification Framework for Geologic Carbon Sequestration Risk Assessment

    SciTech Connect

    Oldenburg, Curtis M.; Bryant, Steven L.; Nicot, Jean-Philippe; Kumar, Navanit; Zhang, Yingqi; Jordan, Preston; Pan, Lehua; Granvold, Patrick; Chow, Fotini K.

    2009-06-01

    We have developed a framework for assessing the leakage risk of geologic carbon sequestration sites. This framework, known as the Certification Framework (CF), emphasizes wells and faults as the primary potential leakage conduits. Vulnerable resources are grouped into compartments, and impacts due to leakage are quantified by the leakage flux or concentrations that could potentially occur in compartments under various scenarios. The CF utilizes several model components to simulate leakage scenarios. One model component is a catalog of results of reservoir simulations that can be queried to estimate plume travel distances and times, rather than requiring CF users to run new reservoir simulations for each case. Other model components developed for the CF and described here include fault characterization using fault-population statistics; fault connection probability using fuzzy rules; well-flow modeling with a drift-flux model implemented in TOUGH2; and atmospheric dense-gas dispersion using a mesoscale weather prediction code.

  17. An Integrated Object Model and Method Framework for Subject-Centric e-Research Applications

    PubMed Central

    Lohrey, Jason M.; Killeen, Neil E.B.; Egan, Gary F.

    2009-01-01

    A framework that integrates an object model, research methods (workflows), the capture of experimental data sets and the provenance of those data sets for subject-centric research is presented. The design of the Framework object model draws on and extends pre-existing object models in the public domain. In particular the Framework tracks the state and life cycle of a subject during an experimental method, provides for reusable subjects, primary, derived and recursive data sets of arbitrary content types, and defines a user-friendly and practical scheme for citably identifying information in a distributed environment. The Framework is currently used to manage neuroscience Magnetic Resonance and microscopy imaging data sets in both clinical and basic neuroscience research environments. The Framework facilitates multi-disciplinary and collaborative subject-based research, and extends earlier object models used in the research imaging domain. Whilst the Framework has been explicitly validated for neuroimaging research applications, it has broader application to other fields of subject-centric research. PMID:19636389

  18. Building an Open Source Framework for Integrated Catchment Modeling

    NASA Astrophysics Data System (ADS)

    Jagers, B.; Meijers, E.; Villars, M.

    2015-12-01

    In order to develop effective strategies and associated policies for environmental management, we need to understand the dynamics of the natural system as a whole and the human role therein. This understanding is gained by comparing our mental model of the world with observations from the field. However, to properly understand the system we should look at the dynamics of water, sediments, water quality, and ecology throughout the whole system from catchment to coast, both at the surface and in the subsurface. Numerical models are indispensable in helping us understand the interactions of the overall system, but we need to be able to update and adjust them to improve our understanding and test our hypotheses. To support researchers around the world with this challenging task, we began developing a new open source modeling environment, DeltaShell, a few years ago. It integrates distributed hydrological models with 1D, 2D, and 3D hydraulic models, together with generic components for tracking sediment, water quality, and ecological quantities throughout the hydrological cycle. The open source approach, combined with a modular design based on open standards that allows easy adjustment and expansion as demands and knowledge grow, provides an ideal starting point for addressing challenging integrated environmental questions.

  19. A Flexible Atmospheric Modeling Framework for the CESM

    SciTech Connect

    Randall, David; Heikes, Ross; Konor, Celal

    2014-11-12

    We have created two global dynamical cores based on the unified system of equations and Z-grid staggering on an icosahedral grid, which are collectively called UZIM (Unified Z-grid Icosahedral Model). The z-coordinate version (UZIM-height) can be run in hydrostatic and nonhydrostatic modes. The sigma-coordinate version (UZIM-sigma) runs in only hydrostatic mode. The super-parameterization has been included as a physics option in both models. The UZIM versions with the super-parameterization are called SUZI. With SUZI-height, we have completed aquaplanet runs. With SUZI-sigma, we are making aquaplanet runs and realistic climate simulations. SUZI-sigma includes realistic topography and a SiB3 model to parameterize the land-surface processes.

  20. Introducing a boreal wetland model within the Earth System model framework

    NASA Astrophysics Data System (ADS)

    Getzieh, R. J.; Brovkin, V.; Reick, C.; Kleinen, T.; Raddatz, T.; Raivonen, M.; Sevanto, S.

    2009-04-01

    Wetlands of the northern high latitudes, with their low temperatures and waterlogged conditions, are prerequisite for peat accumulation. They store at least 25% of the global soil organic carbon and currently constitute the largest natural source of methane. These boreal and subarctic peat carbon pools are sensitive to climate change, since the balance of carbon sequestration and emission depends closely on hydrology and temperature. Global biogeochemistry models used for simulations of CO2 dynamics in past and future climates usually ignore changes in peat storage. Our approach aims at evaluating the boreal wetland feedback to climate through CO2 and CH4 fluxes on decadal to millennial time scales. A generic model of organic matter accumulation and decay in boreal wetlands is under development at the MPI for Meteorology in cooperation with the University of Helsinki. Our approach is to develop a wetland model consistent with the physical and biogeochemical components of the land surface module JSBACH, part of the Earth System model framework ECHAM5-MPIOM-JSBACH. As prototypes, we use the modelling approach of Frolking et al. (2001) for the peat dynamics and the wetland model of Wania (2007) for vegetation cover and plant productivity. An initial distribution of wetlands follows the GLWD-3 map by Lehner and Döll (2004). First results of the modelling approach will be presented. References: Frolking, S. E., N. T. Roulet, T. R. Moore, P. J. H. Richard, M. Lavoie and S. D. Muller (2001): Modeling Northern Peatland Decomposition and Peat Accumulation, Ecosystems, 4, 479-498. Lehner, B., Döll, P. (2004): Development and validation of a global database of lakes, reservoirs and wetlands. Journal of Hydrology 296 (1-4), 1-22. Wania, R. (2007): Modelling northern peatland land surface processes, vegetation dynamics and methane emissions. PhD thesis, University of Bristol, 122 pp.

  1. Improvement of sediment transport models using the shallow water framework

    NASA Astrophysics Data System (ADS)

    Morales de Luna, Tomás; Castro Diaz, Manuel J.; Fernandez Nieto, Enrique D.; Narbona Reina, Gladys

    2016-04-01

    Sediment can be transported in several ways by the action of a river. During low transport stages, particles move by sliding and rolling over the surface of the bed. This type of transport is usually called bedload transport. With the increase of the velocity, the sediment is entrained into suspension and travels significant distances before being deposited again. One can observe a continuous exchange between sediment at the riverbed and sediment in suspension. One possible approach to model these phenomena is to use a shallow water model coupled with transport equations for sediment in suspension and a morphodynamical component for the bedload transport, which depends on an empirical flux. Nevertheless, this approach presents some drawbacks: for instance, the vertical distribution of the sediment in suspension is lost, gravitational effects on bedload transport are neglected, and the models are usually too simplified for practical situations. We present here some recent advances in sediment transport modeling that aim to overcome the difficulties present in classic models. In particular, for suspended transport, a multilayer approach is a promising tool. This allows one to keep track of the vertical distribution of sediment, and the computational cost is lower than that of a fully 3D approach. Regarding bedload transport, a new general formulation will be introduced that recovers classic formulae as particular cases but incorporates more information on the physics of the problem. This makes the model more suitable for practical applications. ACKNOWLEDGMENTS This research has been partially supported by the Junta de Andalucía research project TESELA (P11-RNM7069) and by the Spanish Government and FEDER through the research project DAIFLUID (MTM2012-38383-C02-01 and MTM2012-38383-C02-02)

  2. Towards uncertainty quantification and parameter estimation for Earth system models in a component-based modeling framework

    NASA Astrophysics Data System (ADS)

    Peckham, Scott D.; Kelbert, Anna; Hill, Mary C.; Hutton, Eric W. H.

    2016-05-01

    Component-based modeling frameworks make it easier for users to access, configure, couple, run and test numerical models. However, they do not typically provide tools for uncertainty quantification or data-based model verification and calibration. To better address these important issues, modeling frameworks should be integrated with existing, general-purpose toolkits for optimization, parameter estimation and uncertainty quantification. This paper identifies and then examines the key issues that must be addressed in order to make a component-based modeling framework interoperable with general-purpose packages for model analysis. As a motivating example, one of these packages, DAKOTA, is applied to a representative but nontrivial surface process problem of comparing two models for the longitudinal elevation profile of a river to observational data. Results from a new mathematical analysis of the resulting nonlinear least squares problem are given and then compared to results from several different optimization algorithms in DAKOTA.
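    The model-comparison step can be illustrated without DAKOTA itself: fit each candidate profile form to elevation data by minimizing the sum of squared residuals and compare the minima. In the sketch below, a brute-force grid search stands in for DAKOTA's optimization algorithms, and both the profile forms and the synthetic data are hypothetical, not the paper's observations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic longitudinal profile: elevation (m) vs. downstream distance (km),
# generated from an exponential form plus noise.
x = np.linspace(0.0, 100.0, 50)
z_obs = 500.0 * np.exp(-0.03 * x) + rng.normal(0.0, 3.0, x.size)

def best_sse(model, grid):
    """Brute-force nonlinear least squares over a parameter grid."""
    best = (np.inf, None)
    for params in grid:
        r = z_obs - model(x, *params)
        best = min(best, (float(np.sum(r * r)), params), key=lambda t: t[0])
    return best

expo = lambda x, z0, k: z0 * np.exp(-k * x)       # candidate model 1
power = lambda x, z0, c, p: z0 - c * x**p         # candidate model 2

grid_e = [(z0, k) for z0 in np.linspace(400, 600, 41)
          for k in np.linspace(0.01, 0.05, 41)]
grid_p = [(z0, c, p) for z0 in np.linspace(400, 600, 21)
          for c in np.linspace(1, 20, 20)
          for p in np.linspace(0.5, 1.5, 21)]

sse_e, _ = best_sse(expo, grid_e)
sse_p, _ = best_sse(power, grid_p)
print(sse_e < sse_p)  # exponential form should fit its own data better
```

    In the framework-integration setting the paper describes, the grid search would be replaced by a call to an external optimizer, with the component-based model supplying the residual evaluations.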

  3. A Trust Framework of Ubiquitous Healthcare with Advanced Petri Net Model

    NASA Astrophysics Data System (ADS)

    Nam, Jaechang

    Ubiquitous healthcare, which enables patients to access medical services anywhere, anytime, is still immature, and trust issues in particular require further research. In this sense, how to design a trustworthy ubiquitous healthcare system is an interesting problem to be resolved. In this paper, we propose a trust framework for ubiquitous healthcare systems using an advanced Petri net, and verify how this trust framework evaluates trust properties when developing ubiquitous healthcare systems. The outcome of the research, the trust framework, can make it easy to understand how trust relationships can be built in ubiquitous healthcare systems via mathematical and graphical models.

  4. A framework for modelling kinematic measurements in gravity field applications

    NASA Technical Reports Server (NTRS)

    Schwarz, K. P.; Wei, M.

    1989-01-01

    To assess the resolution of the local gravity field from kinematic measurements, a state model for motion in the gravity field of the earth is formulated. The resulting set of equations can accommodate gravity gradients, specific force, acceleration, velocity and position as input data and can take into account approximation errors as well as sensor errors.

  5. Models of Continuing Professional Development: A Framework for Analysis

    ERIC Educational Resources Information Center

    Kennedy, Aileen

    2014-01-01

    The area of teachers' continuing professional development (CPD) is of growing interest internationally. However, while an increasing range of literature focuses on particular aspects of CPD, there is a paucity of literature addressing the spectrum of CPD models in a comparative manner. This article therefore considers a wide range of…

  6. COMPASS: A Framework for Automated Performance Modeling and Prediction

    SciTech Connect

    Lee, Seyong; Meredith, Jeremy S; Vetter, Jeffrey S

    2015-01-01

    Flexible, accurate performance predictions offer numerous benefits, such as gaining insight into and optimizing applications and architectures. However, the development and evaluation of such performance predictions has been a major research challenge due to architectural complexity. To address this challenge, we have designed and implemented a prototype system, named COMPASS, for automated performance model generation and prediction. COMPASS generates a structured performance model from the target application's source code using automated static analysis, and then it evaluates this model using various performance prediction techniques. As we demonstrate on several applications, the results of these predictions can be used for a variety of purposes, such as design space exploration, identifying performance tradeoffs for applications, and understanding sensitivities of important parameters. COMPASS can generate these predictions across several types of applications, from traditional, sequential CPU applications to GPU-based, heterogeneous, parallel applications. Our empirical evaluation demonstrates a maximum overhead of 4%, the flexibility to generate models for 9 applications, speed and ease of model creation, and very low relative errors across a diverse set of architectures.

  7. A Framework for Modelling Connective Tissue Changes in VIIP Syndrome

    NASA Technical Reports Server (NTRS)

    Ethier, C. R.; Best, L.; Gleason, R.; Mulugeta, L.; Myers, J. G.; Nelson, E. S.; Samuels, B. C.

    2014-01-01

    Insertion of astronauts into microgravity induces a cascade of physiological adaptations, notably including a cephalad fluid shift. Longer-duration flights carry an increased risk of developing Visual Impairment and Intracranial Pressure (VIIP) syndrome, a spectrum of ophthalmic changes including posterior globe flattening, choroidal folds, distension of the optic nerve sheath, kinking of the optic nerve and potentially permanent degradation of visual function. The slow onset of changes in VIIP, their chronic nature, and the similarity of certain clinical features of VIIP to ophthalmic findings in patients with raised intracranial pressure strongly suggest that: (i) biomechanical factors play a role in VIIP, and (ii) connective tissue remodeling must be accounted for if we wish to understand the pathology of VIIP. Our goal is to elucidate the pathophysiology of VIIP and suggest countermeasures based on biomechanical modeling of ocular tissues, suitably informed by experimental data, and followed by validation and verification. We specifically seek to understand the quasi-homeostatic state that evolves over weeks to months in space, during which ocular tissue remodeling occurs. This effort is informed by three bodies of work: (i) modeling of cephalad fluid shifts; (ii) modeling of ophthalmic tissue biomechanics in glaucoma; and (iii) modeling of connective tissue changes in response to biomechanical loading.

  8. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  9. Models for Indicator Development: A Framework for Policy Analysis.

    ERIC Educational Resources Information Center

    Garn, Harvey A.; And Others

    Aspects are summarized of a current approach to social indicator research and related problems in policy analysis generated by an interest in isolating major sources of variability in the generation of human welfare and developing indicators associated with welfare-generating processes. A set of models being developed for indicator research is…

  10. Understanding cardiac alternans: A piecewise linear modeling framework

    NASA Astrophysics Data System (ADS)

    Thul, R.; Coombes, S.

    2010-12-01

    Cardiac alternans is a beat-to-beat alternation in action potential duration (APD) and intracellular calcium (Ca2+) cycling seen in cardiac myocytes under rapid pacing that is believed to be a precursor to fibrillation. The cellular mechanisms of these rhythms and the coupling between cellular Ca2+ and voltage dynamics have been extensively studied leading to the development of a class of physiologically detailed models. These have been shown numerically to reproduce many of the features of myocyte response to pacing, including alternans, and have been analyzed mathematically using various approximation techniques that allow for the formulation of a low dimensional map to describe the evolution of APDs. The seminal work by Shiferaw and Karma is of particular interest in this regard [Shiferaw, Y. and Karma, A., "Turing instability mediated by voltage and calcium diffusion in paced cardiac cells," Proc. Natl. Acad. Sci. U.S.A. 103, 5670-5675 (2006)]. Here, we establish that the key dynamical behaviors of the Shiferaw-Karma model are arranged around a set of switches. These are shown to be the main elements for organizing the nonlinear behavior of the model. Exploiting this observation, we show that a piecewise linear caricature of the Shiferaw-Karma model, with a set of appropriate switching manifolds, can be constructed that preserves the physiological interpretation of the original model while being amenable to a systematic mathematical analysis. In illustration of this point, we formulate the dynamics of Ca2+ cycling (in response to pacing) and compute the properties of periodic orbits in terms of a stroboscopic map that can be constructed without approximation. Using this, we show that alternans emerge via a period-doubling instability and track this bifurcation in terms of physiologically important parameters. We also show that when coupled to a spatially extended model for Ca2+ transport, the model supports spatially varying patterns of alternans. We analyze
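The low-dimensional map idea above can be caricatured with a standard APD restitution map, a far simpler construction than the Shiferaw-Karma model: the next action potential duration is a function of the preceding diastolic interval, and alternans appears as a period-doubled orbit under rapid pacing. The restitution curve and all parameter values below are illustrative assumptions, not taken from the paper.

```python
# Minimal 1D APD restitution map illustrating alternans (NOT the
# Shiferaw-Karma model): APD_{n+1} = f(DI_n) with DI_n = BCL - APD_n.
# Period-2 behaviour (alternans) emerges when the slope of f at the
# fixed point exceeds one in magnitude.
import math

def restitution(di, a=300.0, b=100.0, tau=60.0):
    # Hypothetical exponential restitution curve (all parameters illustrative).
    return a - b * math.exp(-di / tau)

def pace(bcl, apd0=200.0, beats=300):
    """Iterate the map at basic cycle length `bcl` and return the APD history."""
    apd = apd0
    history = []
    for _ in range(beats):
        di = max(bcl - apd, 1.0)   # diastolic interval; floor avoids negative DI
        apd = restitution(di)
        history.append(apd)
    return history

slow = pace(bcl=500.0)[-2:]   # slow pacing: successive APDs converge (period-1)
fast = pace(bcl=260.0)[-2:]   # rapid pacing: successive APDs alternate (period-2)
print(abs(slow[0] - slow[1]), abs(fast[0] - fast[1]))
```

Lowering the basic cycle length past the point where the restitution slope at the fixed point exceeds one is precisely the period-doubling instability the authors track in terms of physiologically important parameters.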

  11. Computational fluid dynamics framework for aerodynamic model assessment

    NASA Astrophysics Data System (ADS)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive some disagreement was seen, particularly during periods of large rates of change in attitudes. Finally, the Ranger 2000 model was used on a flight simulator.
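The tabular-model replay described above rests on simple interpolation of steady-state CFD results. A minimal one-dimensional sketch with hypothetical table values (lift coefficient against angle of attack only; real tables also span Mach number, sideslip, and control deflections):

```python
# Sketch of a tabular aerodynamic model: steady-state CFD results are stored
# as (angle-of-attack, lift-coefficient) pairs and linearly interpolated when
# replaying a manoeuvre. All table values are invented for illustration.
from bisect import bisect_right

alpha_table = [0.0, 4.0, 8.0, 12.0, 16.0]      # angle of attack, deg
cl_table    = [0.10, 0.45, 0.80, 1.05, 1.15]   # hypothetical lift coefficients

def lookup_cl(alpha):
    """Piecewise-linear interpolation, clamped at the table ends."""
    if alpha <= alpha_table[0]:
        return cl_table[0]
    if alpha >= alpha_table[-1]:
        return cl_table[-1]
    i = bisect_right(alpha_table, alpha) - 1
    t = (alpha - alpha_table[i]) / (alpha_table[i + 1] - alpha_table[i])
    return cl_table[i] + t * (cl_table[i + 1] - cl_table[i])

# Replay a slow pitch-up manoeuvre through the table.
manoeuvre = [lookup_cl(a) for a in (2.0, 6.0, 10.0)]
print(manoeuvre)
```

Because each entry is a steady-state solution, such a lookup reproduces slow manoeuvres well but misses unsteady effects during rapid attitude changes, which is exactly the disagreement the paper reports.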

  12. An efficient framework for modeling clouds from Landsat8 images

    NASA Astrophysics Data System (ADS)

    Yuan, Chunqiang; Guo, Jing

    2015-03-01

    Cloud plays an important role in creating realistic outdoor scenes for video game and flight simulation applications. Classic methods have been proposed for cumulus cloud modeling. However, these methods are not flexible for modeling large cloud scenes with hundreds of clouds in that the user must repeatedly model each cloud and adjust its various properties. This paper presents a meteorologically based method to reconstruct cumulus clouds from high resolution Landsat8 satellite images. From these input satellite images, the clouds are first segmented from the background. Then, the cloud top surface is estimated from the temperature of the infrared image. After that, under a mild assumption of flat base for cumulus cloud, the base height of each cloud is computed by averaging the top height for pixels on the cloud edge. Then, the extinction is generated from the visible image. Finally, we enrich the initial shapes of clouds using a fractal method and represent the recovered clouds as a particle system. The experimental results demonstrate our method can yield realistic cloud scenes resembling those in the satellite images.
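The flat-base step of the pipeline above reduces to a simple image operation: average the retrieved top heights over a cloud's edge pixels. A toy sketch on a hand-made grid (the mask and heights are invented; the paper works on segmented Landsat8 imagery):

```python
# Sketch of the flat-base assumption: estimate a cloud's base height by
# averaging the retrieved top heights of its edge pixels, i.e. cloud pixels
# adjacent to the background. 1 marks cloud, 0 marks background.

def cloud_base_height(top_height, mask):
    """Average top height over cloud pixels that touch the background."""
    rows, cols = len(mask), len(mask[0])
    edge_heights = []
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            on_edge = any(
                nr < 0 or nr >= rows or nc < 0 or nc >= cols or not mask[nr][nc]
                for nr, nc in neighbours
            )
            if on_edge:
                edge_heights.append(top_height[r][c])
    return sum(edge_heights) / len(edge_heights)

mask = [[0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
tops = [[0.0, 1.2, 1.3, 0.0],   # top heights in km (invented)
        [0.0, 1.1, 1.4, 0.0],
        [0.0, 0.0, 0.0, 0.0]]
print(cloud_base_height(tops, mask))  # every cloud pixel here is an edge pixel
```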

  13. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-01-01

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed using a methodology based on design patterns that eases the development of new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain highly precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future uses for the new tool are outlined. PMID:27323045

  14. Regional scale framework for modeling water resources and health risk problems

    NASA Astrophysics Data System (ADS)

    Pelmulder, Susan D.; Yeh, William W.-G.; Kastenberg, William E.

    A framework of simulation models for including human exposure to contaminants in regional scale aquifer management problems is presented. The framework includes horizontal flow and transport of contaminant plumes in the aquifer and multiple-pathway human exposure. Well water from the aquifer simulation model is used as the source of contaminant in the human exposure model. The exposure pathways considered for regional analysis are ingestion of foods grown using well water as part of the irrigation supply; ingestion and dermal absorption of contaminants in tap water; and inhalation of vaporized contaminants while bathing. An environmental compartment model is used to track the contaminant in irrigation water into the soil layers in contact with food products. The simulation framework is demonstrated in a study of the sensitivity of exposure to various aquifer and water supply parameters. The region used is hypothetical; however, the parameters are typical of California.
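The multiple-pathway exposure idea can be sketched as a single dose equation in which each pathway contributes an effective water-intake term. The pathway factors below are hypothetical placeholders, not the compartment-model parameters used in the study:

```python
# Illustrative multiple-pathway dose calculation in the spirit of the
# framework above: a well-water concentration drives ingestion, dermal,
# inhalation and food-chain doses. All parameter values are hypothetical.

def daily_dose(conc_mg_per_l,
               water_intake_l=2.0,        # tap-water ingestion, L/day
               dermal_factor=0.005,       # effective L/day absorbed while bathing
               inhalation_factor=0.001,   # effective L/day equivalent inhaled
               food_transfer=0.01,        # crop uptake from irrigation, L/day eq.
               body_weight_kg=70.0):
    """Total daily dose in mg per kg body weight per day."""
    intake_eq = water_intake_l + dermal_factor + inhalation_factor + food_transfer
    return conc_mg_per_l * intake_eq / body_weight_kg

print(daily_dose(0.05))   # dose for a 0.05 mg/L well-water concentration
```

Summing pathways as equivalent intakes is the simplest formulation; the paper's compartment model additionally tracks the contaminant through irrigation water and soil layers before it reaches food.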

  15. Periodic model of LTA framework containing various non-tetrahedral cations

    NASA Astrophysics Data System (ADS)

    Koleżyński, A.; Mikuła, A.; Król, M.

    2016-03-01

    A simplified periodic model of the Linde Type A zeolite (LTA) structure with various selected mono- and di-valent extra-framework cations was formulated. Ab initio calculations (geometry optimization and vibrational spectra calculations) using the proposed model were carried out by means of the Crystal09 program. The resulting structures and simulated spectra were analyzed in detail and compared with the experimental ones. The presented results show that in most cases the proposed model agrees well with experimental results. Individual bands were assigned to respective normal modes of vibration, and the changes resulting from the selective substitution of extra-framework cations were described and explained.

  16. A review of the quantification and communication of uncertainty associated with geological framework models

    NASA Astrophysics Data System (ADS)

    Mathers, Steve; Lark, Murray

    2015-04-01

    Digital Geological Framework Models show geology in three dimensions; they can most easily be thought of as 3D geological maps. The volume of the model is divided into distinct geological units using a suitable rock classification, in the same way that geological maps are. Like geological maps, the models are generic and many are intended to be fit for any geoscience purpose. Over the last decade many Geological Survey Organisations (GSOs) worldwide have begun to communicate their geological understanding of the subsurface through Geological Framework Models and themed derivatives, and the traditional printed geological map has been increasingly phased out. Building Geological Framework Models entails the assembly of all the known geospatial information into a single workspace for interpretation. The calculated models are commonly displayed either as a stack of geological surfaces or boundaries (unit tops, bases, unconformities) or as solid calculated blocks of 3D geology with the unit volumes infilled with colour or symbols. The studied volume, however, must be completely populated, so decisions on the subsurface distribution of units must be made even where considerable uncertainty exists. There is naturally uncertainty associated with any Geological Framework Model, and it has two main components: the uncertainty in the geospatial data used to constrain the model, and the uncertainty related to the model construction, which includes factors such as the choice of modeller(s), choice of software, and modelling workflow. Uncertainty is the inverse of confidence, reliability or certainty; other closely related terms include risk, commonly used in preference to uncertainty where financial or safety matters are presented, and probability, used as a statistical measure of uncertainty. We can consider uncertainty in geological framework models to be of two main types: uncertainty in the geospatial data used to constrain the model; this differs with the distinct

  17. Parameter Estimation for Differential Equation Models Using a Framework of Measurement Error in Regression Models

    PubMed Central

    Liang, Hua

    2008-01-01

    Differential equation (DE) models are widely used in many scientific fields, including engineering, physics and the biomedical sciences. The so-called “forward problem”, the problem of simulations and predictions of state variables for given parameter values in the DE models, has been extensively studied by mathematicians, physicists, engineers and other scientists. However, the “inverse problem”, the problem of parameter estimation based on the measurements of output variables, has not been well explored using modern statistical methods, although some least squares-based approaches have been proposed and studied. In this paper, we propose parameter estimation methods for ordinary differential equation (ODE) models based on the local smoothing approach and a pseudo-least squares (PsLS) principle under a framework of measurement error in regression models. The asymptotic properties of the proposed PsLS estimator are established. We also compare the PsLS method to the corresponding SIMEX method and evaluate their finite sample performances via simulation studies. We illustrate the proposed approach using an application example from an HIV dynamic study. PMID:19956350
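The two-stage flavour of the PsLS idea — smooth the trajectory, estimate derivatives, then solve a least-squares problem in the parameters — can be sketched on the simplest linear ODE. This sketch uses crude central differences in place of the paper's local smoothing and omits the measurement-error and SIMEX machinery entirely:

```python
# Two-stage sketch of smoothing-then-least-squares parameter estimation:
# recover the decay rate theta in dx/dt = -theta * x from sampled states by
# (1) approximating derivatives with central differences and
# (2) regressing dx/dt on x. Not the paper's local-polynomial estimator.
import math

theta_true = 0.5
t = [0.1 * i for i in range(50)]                 # sampling times
x = [math.exp(-theta_true * ti) for ti in t]     # noiseless trajectory

# Stage 1: central-difference derivative estimates at interior points.
dxdt = [(x[i + 1] - x[i - 1]) / (t[i + 1] - t[i - 1])
        for i in range(1, len(t) - 1)]
xi = x[1:-1]

# Stage 2: least squares for dx/dt = -theta * x  =>  theta = -sum(x*dx)/sum(x^2)
theta_hat = -sum(a * b for a, b in zip(xi, dxdt)) / sum(a * a for a in xi)
print(theta_hat)
```

With noisy measurements the derivative step amplifies error, which is why the paper replaces finite differences with local smoothing and treats the smoothed states within a measurement-error regression framework.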

  18. Evaluation of Hydrometeor Occurrence Profiles in the Multiscale Modeling Framework Climate Model using Atmospheric Classification

    SciTech Connect

    Marchand, Roger T.; Beagley, Nathaniel; Ackerman, Thomas P.

    2009-09-01

    Vertical profiles of hydrometeor occurrence from the Multiscale Modeling Framework (MMF) climate model are compared with profiles observed by a vertically pointing millimeter-wavelength cloud radar (located in the U.S. Southern Great Plains) as a function of the large-scale atmospheric state. The atmospheric state is determined by classifying (or clustering) the large-scale (synoptic) fields produced by the MMF and a numerical weather prediction model using a neural network approach. The comparison shows that for cold frontal and post-cold frontal conditions the MMF produces profiles of hydrometeor occurrence that compare favorably with radar observations, while for warm frontal conditions the model tends to produce hydrometeor fractions that are too large, with too much cloud (non-precipitating hydrometeors) above 7 km and too much precipitating hydrometeor coverage below 7 km. We also find that the MMF has difficulty capturing the formation of low clouds and that, for all atmospheric states that occur during June, July, and August, the MMF produces too much high and thin cloud, especially above 10 km.

  19. A Distributed Multi-User Role-Based Model Integration Framework

    SciTech Connect

    Dorow, Kevin E.; Gorton, Ian; Thurman, David A.

    2004-06-14

    Integrated computational modeling can be very useful in making quick, yet informed decisions related to environmental issues including Brownfield assessments. Unfortunately, the process of creating meaningful information using this methodology is fraught with difficulties, particularly when multiple computational models are required. Common problems include the inability to seamlessly transfer information between models, the difficulty of incorporating new models and integrating heterogeneous data sources, executing large numbers of model runs in a reasonable time frame, and adequately capturing pedigree information that describes the specific computational steps and data required to reproduce results. While current model integration frameworks have successfully addressed some of these problems, none have addressed all of them. Building on existing work at Pacific Northwest National Laboratory (PNNL), we have created an extensible software architecture for the next generation of model integration frameworks that addresses these issues. This paper describes this architecture that is being developed to support integrated water resource modeling in a metropolitan area.

  20. Development of a framework for reporting health service models for managing rheumatoid arthritis.

    PubMed

    O'Donnell, Siobhan; Li, Linda C; King, Judy; Lauzon, Chantal; Finn, Heather; Vliet Vlieland, Theodora P M

    2010-02-01

    The purpose of this study was to develop a framework for reporting health service models for managing rheumatoid arthritis (RA). We conducted a search of the health sciences literature for primary studies that described interventions which aimed to improve the implementation of health services in adults with RA. Thereafter, a nominal group consensus process was used to synthesize the evidence for the development of the reporting framework. Of the 2,033 citations screened, 68 primary studies were included which described 93 health service models for RA. The origin and meaning of the labels given to these health service delivery models varied widely and, in general, the reporting of their components lacked detail or was absent. The six dimensions underlying the framework for reporting RA health service delivery models are: (1) Why was it founded? (2) Who was involved? (3) What were the roles of those participating? (4) When were the services provided? (5) Where were the services provided/received? (6) How were the services/interventions accessed and implemented, how long was the intervention, how did individuals involved communicate, and how was the model supported/sustained? The proposed framework has the potential to facilitate knowledge exchange among clinicians, researchers, and decision makers in the area of health service delivery. Future work includes the validation of the framework with national and international stakeholders such as clinicians, health care administrators, and health services researchers. PMID:19865842

  1. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" between Physical Experiments and Virtual Models in Biology

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-01-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…

  2. A framework of fuzzy hybrid systems for modelling and control

    NASA Astrophysics Data System (ADS)

    Cheng, Shu; Dong, Ruijun; Pedrycz, Witold

    2010-02-01

    This paper presents a new approach to modelling and control of hybrid systems with both continuous variables and discrete events. Applying the fuzzy set theory, a hierarchical fuzzy hybrid structure consisting of a fuzzy discrete event dynamic system and a continuous variable dynamic system is constructed, which not only captures the hybrid continuous/discrete dynamics but also handles the uncertainties in states and state transitions. The identification of continuous and discrete components is developed, and the hybrid control is then synthesised by fuzzy IF-THEN rules embedded in the fuzzy interface. An example of the optimisation of a production line in manufacturing shows the efficacy of the proposed approach.

  3. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…
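At its core, an odds-ratio approach to DDF compares the odds of selecting a particular distractor (versus any other wrong answer) between focal and reference groups of examinees who answered an item incorrectly. A bare-bones sketch with invented counts; the nested-logit and nominal-response machinery of the paper is not reproduced:

```python
# Bare-bones odds ratio for distractor selection, the quantity underlying an
# ORA-style DDF check. Counts are invented for illustration.
import math

def odds_ratio(a, b, c, d):
    """a,b: focal group (target distractor, other wrong answers);
    c,d: reference group (same split). Returns (OR, SE of log-OR)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf standard error
    return or_, se_log

or_, se = odds_ratio(40, 60, 25, 75)
print(or_, se)   # OR substantially above 1 flags the distractor for DDF
```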

  4. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  5. A modeling framework for characterizing near-road air pollutant concentration at community scales

    EPA Science Inventory

    In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  6. A Theoretical Framework for Research in Algebra: Modification of Janvier's "Star" Model of Function Understanding.

    ERIC Educational Resources Information Center

    Bowman, Anita H.

    A pentagonal model, based on the star model of function understanding of C. Janvier (1987), is presented as a framework for the design and interpretation of research in the area of learning the concept of mathematical function. The five vertices of the pentagon correspond to five common representations of mathematical function: (1) graph; (2)…

  7. PARCC Model Content Frameworks: English Language Arts/Literacy--Grades 3-11

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    As part of its proposal to the U.S. Department of Education, the Partnership for Assessment of Readiness for College and Careers (PARCC) committed to developing model content frameworks for English language arts/literacy (ELA/Literacy) to serve as a bridge between the Common Core State Standards and the PARCC assessments. The PARCC Model Content…

  8. The Dimensions of Social Justice Model: Transforming Traditional Group Work into a Socially Just Framework

    ERIC Educational Resources Information Center

    Ratts, Manivong J.; Anthony, Loni; Santos, KristiAnna Nicole T.

    2010-01-01

    Social justice is a complex and abstract concept that can be difficult to discuss and integrate within group work. To address this concern, this article introduces readers to the Dimensions of Social Justice Model. The model provides group leaders with a conceptual framework for understanding the degree to which social justice is integrated within…

  9. Modeling of features of slow earthquakes in a dynamical framework

    NASA Astrophysics Data System (ADS)

    Yamashita, T.

    2010-12-01

    Slow earthquakes exhibit a striking contrast with ordinary earthquakes. Rupture speeds of slow slip events are four orders of magnitude smaller than those of ordinary earthquakes. Ide et al. (2007) found that the seismic moment of slow earthquakes is linearly proportional to the characteristic duration, which differs from the relation for ordinary earthquakes. It is also known that slow slip events are frequently coupled with tremor. We now simulate such features of slow earthquakes on the basis of the fault model developed by Suzuki and Yamashita (2009, 2010). Key ingredients of the model are fluid flow, shear heating and inelastic pore creation. We assume a fault in a thermoporoelastic medium saturated with fluid. The inelastic porosity is assumed to increase with evolving slip. The shear heating builds up the fluid pressure on the fault, whereas the pore creation lowers it. Since slip is promoted by high fluid pressure according to the Coulomb law of friction, the relative dominance of these two effects determines the nature of slip. Our 1D analysis showed that slip-weakening and -strengthening emerge in the ranges Su < -P0 and Su > -P0 (Suzuki and Yamashita, 2010); shear heating and pore creation are dominant in the former and latter ranges, respectively. Here, Su is a parameter proportional to the rate of pore creation; Su' and P0 are proportional to the permeability and to the initial fluid pressure, respectively. We found in the 2D modeling that slow fault growth can be simulated if we assume Su >> -P0 (Suzuki and Yamashita, 2009). Suzuki and Yamashita (2009) showed that the fluid inflow triggered by the pore creation tends to weaken the degree of slip-strengthening in the range Su >> -P0, which causes slow fault growth whose speed is dependent on the fluid inflow rate. However, if the value of Su is large enough, a nucleated event stops its growth soon after the nucleation because of intense slip-strengthening. Suzuki and Yamashita (2009) assumed that slip is

  10. A Bayesian modelling framework for tornado occurrences in North America.

    PubMed

    Cheng, Vincent Y S; Arhonditsis, George B; Sills, David M L; Gough, William A; Auld, Heather

    2015-01-01

    Tornadoes represent one of nature's most hazardous phenomena that have been responsible for significant destruction and devastating fatalities. Here we present a Bayesian modelling approach for elucidating the spatiotemporal patterns of tornado activity in North America. Our analysis shows a significant increase in the Canadian Prairies and the Northern Great Plains during the summer, indicating a clear transition of tornado activity from the United States to Canada. The linkage between monthly-averaged atmospheric variables and likelihood of tornado events is characterized by distinct seasonality; the convective available potential energy is the predominant factor in the summer; vertical wind shear appears to have a strong signature primarily in the winter and secondarily in the summer; and storm relative environmental helicity is most influential in the spring. The present probabilistic mapping can be used to draw inference on the likelihood of tornado occurrence in any location in North America within a selected time period of the year. PMID:25807465
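The paper's hierarchical spatiotemporal model is far richer than this, but the simplest Bayesian treatment of occurrence counts — a conjugate Gamma-Poisson update for a regional tornado rate — conveys the basic idea. The prior and counts below are invented:

```python
# Minimal conjugate Gamma-Poisson update for a regional tornado occurrence
# rate, the simplest Bayesian treatment of count data. NOT the paper's
# hierarchical model; prior and counts are invented for illustration.

def gamma_poisson_update(alpha, beta, counts):
    """Posterior (alpha, beta) of a Poisson rate with a Gamma(alpha, beta) prior."""
    return alpha + sum(counts), beta + len(counts)

alpha0, beta0 = 2.0, 1.0            # weakly informative prior (hypothetical)
yearly_counts = [3, 5, 2, 4, 6]     # observed tornado counts for one grid cell
a_post, b_post = gamma_poisson_update(alpha0, beta0, yearly_counts)
posterior_mean = a_post / b_post    # expected tornadoes per year
print(posterior_mean)
```

Mapping such posterior rates over a grid of cells, with atmospheric covariates entering the rate, is the spirit of the probabilistic mapping the authors describe.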

  12. HART: An Efficient Modeling Framework for Simulated Solar Imaging

    NASA Astrophysics Data System (ADS)

    Benkevitch, L. V.; Oberoi, D.; Benjamin, M. D.; Sokolov, I. V.

    2012-09-01

    The Haystack & AOSS Ray Tracer (HART) is a software tool for modeling propagation of electromagnetic radiation through a realistic description of the magnetized solar corona, along with the associated radiative transfer effects. Its primary outputs are solar brightness temperature (or flux density) images corresponding to a user-specified coronal description and radio frequency. HART is based on native high-efficiency algorithms coded in the C language, and provides convenient command-line (Python) and graphical user interfaces. HART is a necessary tool for enabling the extraction of solar physics from the images that will be produced by the new generation of low radio frequency arrays like the Murchison Widefield Array (MWA), Low Frequency Array (LOFAR) and Long Wavelength Array (LWA).

  13. The Modular Modeling System (MMS): A modeling framework for water- and environmental-resources management

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.

    2004-01-01

    The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for

  14. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have identified that a good standard framework for model testing called the Klemes Crash Tests (KCTs), which are the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) rename as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCT and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing if the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests or DCTs. Key DCT elements are i) the model purpose (i.e., decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model building case study on the Grand River in Ontario, Canada. A hypothetical binary decision scenario is analysed (upgrade or not upgrade the existing flood control structure) under two different sets of model building

  15. Spatiotemporal nonpoint source pollution water quality management framework using bi-directional model-GIS linkage

    SciTech Connect

    Faizullabhoy, M.S.; Yoon, J.

    1999-07-01

    A framework for water quality assessment and management purposes was developed. In this framework, a bilateral linkage was implemented between the distributed model, the Agricultural Nonpoint Source Pollution Model (AGNPS), and a Geographic Information System (GIS) to investigate a spatiotemporal nonpoint source pollution problem from a 750-acre watershed in the NSGA (Naval Security Group Activity) Northwest base at the Virginia/North Carolina border. AGNPS is an event-based, distributed-parameter model that simulates runoff and the transport of sediment and nutrients (nitrogen and phosphorus) from predominantly agricultural watersheds. In this study, rather than implementing the AGNPS simulation manually, extracted data are integrated in an automated fashion through a direct bilateral linkage between the AGNPS model engine and the GIS. This bilateral linkage resulted in a powerful, up-to-date tool capable of monitoring and instantaneously visualizing the transport of any pollutant, as well as effectively identifying critical areas of nonpoint source (NPS) pollution. The framework also allowed various what-if scenarios to be run in support of decision-making processes. Best Management Practices (BMPs) for the watershed can be generated in a closed-loop iterative scheme until predefined management objectives are achieved. Simulated results showed that the optimal BMP scenario achieved an average reduction of about 41% in soluble and sediment-attached nitrogen and about a 62% reduction in soluble and sediment phosphorus from current NPS pollution levels.
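The closed-loop BMP iteration described above can be sketched as a loop that applies candidate practices until a load target is met. The load model and reduction fractions below are invented placeholders standing in for AGNPS runs:

```python
# Hedged sketch of a closed-loop BMP search: apply candidate management
# practices one at a time until the simulated nitrogen load drops below a
# target. The multiplicative load model and reduction fractions are invented
# placeholders for actual AGNPS simulations.

def simulate_n_load(base_load, practices):
    """Toy load model: each applied BMP removes a fraction of the load."""
    load = base_load
    for reduction in practices:
        load *= (1.0 - reduction)
    return load

candidate_bmps = [0.15, 0.20, 0.10, 0.25]   # hypothetical reduction fractions
target = 0.59                               # retain <= 59% of current load (41% cut)

applied = []
for bmp in candidate_bmps:
    if simulate_n_load(1.0, applied) <= target:
        break                               # management objective achieved
    applied.append(bmp)

final = simulate_n_load(1.0, applied)
print(applied, final)
```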

  16. A scalable framework for the global offline community land model ensemble simulation

    DOE PAGES

    Wang, Dali; Domke, Jens; Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.

    2016-01-01

    Current earth system models have a large range of uncertainty, owing to differences in the simulation of feedbacks and insufficient information to constrain model parameters. Parameter disturbance experiments provide a straightforward method to quantify the variation (uncertainty) in outputs caused by model inputs. Owing to the software complexity and computational intensity of earth system models, a large-scale simulation framework is needed to support the ensemble simulations required by parameter disturbance experiments. This paper presents a parallel framework for the community land model ensemble simulation. After a software structure review of the community land model simulation, a single-factor parameter disturbance experiment of a reference computational experiment design is used to demonstrate the software design principles, the computational characteristics of an individual application, the parallel ensemble simulation implementation, as well as the weak scalability of this simulation framework on a high-end computer. Lastly, the paper discusses some preliminary diagnostic analysis results of the single-factor parameter disturbance experiments. The framework design considerations and implementation details described in this paper can be beneficial to many other research programmes involving large-scale, legacy modelling systems.
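The single-factor parameter disturbance idea above can be illustrated with a minimal sketch: vary one parameter at a time, run one ensemble member per perturbed value, and summarize the spread of the outputs as an uncertainty estimate. The model function and parameter names here are invented stand-ins, not the actual Community Land Model code.

```python
import statistics

def toy_land_model(params):
    """Stand-in for one offline land-model run returning a scalar flux."""
    return params["leaf_albedo"] * 2.0 + params["root_depth"] * 0.5

def single_factor_ensemble(base, factor, values):
    """Disturb only `factor`; run one independent member per value."""
    outputs = []
    for v in values:
        p = dict(base)
        p[factor] = v
        outputs.append(toy_land_model(p))
    return outputs

base = {"leaf_albedo": 0.2, "root_depth": 1.0}
runs = single_factor_ensemble(base, "leaf_albedo", [0.1, 0.2, 0.3])
print(runs, statistics.pstdev(runs))       # spread = uncertainty estimate
```

In the paper's setting each member is a full global simulation, so the loop body is what gets parallelized across nodes.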

  17. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Houborg, R.; Izaurralde, R. C.

    2014-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes, because croplands constitute a large land area and have a large impact on the global carbon cycle. A Satellite-based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved, crop-specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop-specific leaf area index (LAI) using downscaled MODIS reflectance data, and the LAI estimates are subsequently integrated into the Environmental Policy Integrated Climate (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). Firstly, we evaluate the performance of this modeling framework at three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount site, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for 2007 and 2008 at the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Secondly, we apply the SDCC framework to estimate regional NPP and NEP for corn, soybean and sorghum crops in Nebraska during 2007 and 2008. The methods and results will be presented.
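The two-stage pipeline described above (satellite reflectance to LAI, then LAI driving a daily productivity model) can be sketched as follows. All functions and coefficients are invented for illustration; they are not the REGFLEC or EPIC formulations.

```python
import math

def reflectance_to_lai(ndvi):
    """Toy stand-in for the REGFLEC step: map a vegetation index to LAI."""
    return max(6.0 * (ndvi - 0.2), 0.0)

def daily_npp(lai, par=10.0, lue=0.8):
    """Toy light-use-efficiency stand-in for the crop-model step."""
    fapar = 1.0 - math.exp(-0.5 * lai)     # fraction of absorbed PAR
    return lue * par * fapar               # g C / m^2 / day

# A short "season" of vegetation-index observations for one pixel.
season_ndvi = [0.3, 0.5, 0.7, 0.6, 0.4]
season_npp = [daily_npp(reflectance_to_lai(n)) for n in season_ndvi]
print([round(v, 2) for v in season_npp])
```

The point of the structure is the hand-off: the remote-sensing model constrains canopy state, and the process model converts that state into carbon fluxes.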

  18. A Satellite Based Modeling Framework for Estimating Seasonal Carbon Fluxes Over Agricultural Lands

    NASA Astrophysics Data System (ADS)

    Bandaru, V.; Izaurralde, R. C.; Sahajpal, R.; Houborg, R.; Milla, Z.

    2013-12-01

    Croplands are typically characterized by fine-scale heterogeneity, which makes it difficult to accurately estimate cropland carbon fluxes over large regions given the fairly coarse spatial resolution of high-frequency satellite observations. It is, however, important that we improve our ability to estimate spatially and temporally resolved carbon fluxes, because croplands constitute a large land area and have a large impact on the global carbon cycle. A Satellite-based Dynamic Cropland Carbon (SDCC) modeling framework was developed to estimate spatially resolved, crop-specific daily carbon fluxes over large regions. This modeling framework uses the REGularized canopy reFLECtance (REGFLEC) model to estimate crop-specific leaf area index (LAI) using downscaled MODIS reflectance data, and the LAI estimates are subsequently integrated into the Environmental Policy Integrated Climate (EPIC) model to determine daily net primary productivity (NPP) and net ecosystem productivity (NEP). Firstly, we evaluate the performance of this modeling framework at three eddy covariance flux tower sites (Bondville, IL; Fermi Agricultural Site, IL; and Rosemount site, MN). Daily NPP and NEP of corn and soybean crops are estimated (based on REGFLEC LAI) for 2007 and 2008 at the flux tower sites and compared against flux tower observations and model estimates based on in-situ LAI. Secondly, we apply the SDCC framework to estimate regional NPP and NEP for corn, soybean and sorghum crops in Nebraska during 2007 and 2008. The methods and results will be presented.

  19. Clinical Interdisciplinary Collaboration Models and Frameworks From Similarities to Differences: A Systematic Review

    PubMed Central

    Mahdizadeh, Mousa; Heydari, Abbas; Moonaghi, Hossien Karimi

    2015-01-01

    Introduction: So far, various models of interdisciplinary collaboration in clinical nursing have been presented; however, a comprehensive model is not yet available. The purpose of this study is to review the evidence from studies that, using a qualitative approach, presented a model or framework of interdisciplinary collaboration in clinical nursing. Methods: All articles and theses published from 1990 to 10 June 2014, in either English or Persian, that presented a model or framework of clinical collaboration were searched using the databases ProQuest, Scopus, PubMed, Science Direct, and the Iranian databases SID, Magiran, and Iranmedex. In this review, keywords consistent with MeSH, such as nurse-physician relations, care team, collaboration, and interdisciplinary relations, and their Persian equivalents were used. Results: Contexts, processes and outcomes of interdisciplinary collaboration were extracted as findings. One of the major components affecting collaboration that most of the models emphasized was the background of collaboration. Most studies suggested that the outcomes of collaboration were improved care, physician and nurse satisfaction, cost control, reduced clinical errors and improved patient safety. Conclusion: Models and frameworks had different structures, backgrounds, and conditions, but the outcomes were similar. Organizational structure, culture and social factors are important aspects of clinical collaboration, so these factors must be considered in order to improve the quality and effectiveness of clinical collaboration. PMID:26153158

  20. A scalable framework for the global offline community land model ensemble simulation

    SciTech Connect

    Wang, Dali; Domke, Jens; Mao, Jiafu; Shi, Xiaoying; Ricciuto, Daniel M.

    2016-01-01

    Current earth system models have a large range of uncertainty, owing to differences in the simulation of feedbacks and insufficient information to constrain model parameters. Parameter disturbance experiment provides a straightforward method to quantify the variation (uncertainty) outputs caused by model inputs. Owing to the software complexity and computational intensity of earth system models, a large-scale simulation framework is needed to support ensemble simulation required by parameter disturbance experiment. This paper presents a parallel framework for the community land model ensemble simulation. After a software structure review of the community land model simulation, a single factor parameter disturbance experiment of a reference computational experiment design is used to demonstrate the software design principles, computational characteristics of individual application, parallel ensemble simulation implementation, as well as the weak scalability of this simulation framework on a high-end computer. Lastly, the paper discusses some preliminary diagnostic analysis results of the single factor parameter disturbance experiments. The framework design considerations and implementation details described in this paper can be beneficial to many other research programmes involving large scale, legacy modelling system.

  1. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    SciTech Connect

    Trebotich, D

    2006-06-24

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short-range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-set method within the APDEC Framework for extracting surfaces from volume renderings of medical image data, and has been used to simulate cardiovascular and pulmonary flows in critical anatomies.

  2. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    SciTech Connect

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.; Nozick, Linda Karen; Prince, Michael

    2013-09-01

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  3. A Fuzzy Logic Framework for Integrating Multiple Learned Models

    SciTech Connect

    Bobi Kai Den Hartog

    1999-03-01

    The Artificial Intelligence field of Integrating Multiple Learned Models (IMLM) explores ways to combine results from sets of trained programs. Aroclor Interpretation is an ill-conditioned problem in which trained programs must operate in scenarios outside their training ranges because it is intractable to train them completely. Consequently, they fail in ways related to the scenarios. We developed a general-purpose IMLM solution, the Combiner, and applied it to Aroclor Interpretation. The Combiner's first step, Scenario Identification (S1), learns rules from very sparse, synthetic training data consisting of results from a suite of trained programs called Methods. S1 produces fuzzy belief weights for each scenario by approximately matching the rules. The Combiner's second step, Aroclor Presence Detection (AP), classifies each of three Aroclors as present or absent in a sample. The third step, Aroclor Quantification (AQ), produces quantitative values for the concentration of each Aroclor in a sample. Through fuzzy logic, AP and AQ combine the scenario weights, empirical biases automatically learned for each of the Methods in each scenario, and the Methods' results to determine results for a sample.
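The combination step described above can be sketched generically: each method's result is bias-corrected per scenario, then weighted by the fuzzy scenario beliefs. The data layout, numbers, and function below are illustrative only, not the Combiner's actual implementation.

```python
def combine(scenario_weights, method_results, biases):
    """Weight each method's bias-corrected result by scenario belief."""
    total, norm = 0.0, 0.0
    for scenario, w in scenario_weights.items():
        for method, result in method_results.items():
            corrected = result - biases[scenario][method]
            total += w * corrected
            norm += w
    return total / norm

weights = {"low_conc": 0.25, "high_conc": 0.75}  # fuzzy scenario beliefs
results = {"m1": 12.0, "m2": 10.0}               # raw method outputs
biases = {"low_conc": {"m1": 2.0, "m2": 0.0},    # learned per-scenario biases
          "high_conc": {"m1": 1.0, "m2": -1.0}}
print(combine(weights, results, biases))
```

The key idea is that no single method is trusted everywhere; the scenario weights decide how much each method's (corrected) answer counts for a given sample.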

  4. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate-proofing the water management of regional catchments can benefit greatly from the availability of a climate-robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and the development of groundwater-dependent terrestrial ecosystems. These uncertainties add to the uncertainties in the climate change projections themselves. In order to create an integrated, climate-robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e., they exchange information on a time step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision-making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change
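The online (per-time-step) coupling described above can be illustrated with a toy loop in which crop growth, vadose-zone moisture and groundwater recharge exchange state every step, so the feedbacks are captured rather than prescribed. All three process functions are invented stand-ins, not the MODFLOW/MetaSWAP/WOFOST interfaces.

```python
def crop_growth(lai, soil_moisture):
    return lai + 0.1 * soil_moisture       # wetter soil, faster growth

def vadose_zone(soil_moisture, lai, rain):
    uptake = 0.05 * lai                    # transpiration rises with LAI
    return max(soil_moisture + rain - uptake, 0.0)

def recharge(soil_moisture):
    return 0.2 * soil_moisture             # moisture feeds the aquifer

lai, moisture, head = 1.0, 0.5, 10.0
for rain in [0.3, 0.0, 0.2]:               # one state exchange per time step
    lai = crop_growth(lai, moisture)       # crop model sees current moisture
    moisture = vadose_zone(moisture, lai, rain)  # soil sees updated canopy
    head += recharge(moisture)             # groundwater sees updated soil
print(round(lai, 3), round(moisture, 3), round(head, 3))
```

Running the three models offline (each with fixed inputs) would miss exactly these interactions; the loop is the point of an online coupling.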

  5. A Physics-Based Modeling Framework for Prognostic Studies

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system-level maintenance as part of a busy operations schedule, and for lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics, on the other hand, is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, and interpretation of environmental, operational, and performance-related parameters to indicate a system's health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, the continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly, with electric unmanned aerial vehicles, electric hybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop electrochemistry-based models of Li-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable

  6. Empirical assessment of the uncertainty in a 3-D geological framework model

    NASA Astrophysics Data System (ADS)

    Lark, Murray; Mathers, Steve; Thorpe, Steve; Arkley, Sarah; Morgan, Dave; Lawrence, Dave

    2013-04-01

    Three-dimensional framework models are the state of the art to present geologists' understanding of a region in a form that can be used to support planning and decision making. However, there is little information on the uncertainty of such framework models. We report a statistically-designed experiment in which each of five geologists independently produced a framework model of a single region in the east of England. Each geologist used a unique set of borehole observations from which to make their model. Each set was made by withholding five unique validation boreholes from the set of all available boreholes. The models were then compared with the validation observations. Between-modeller differences were not a significant source of variation in framework model error. There was no evidence of systematic bias in the modelled depth for any unit, but there was a statistically significant but small tendency for the mean error to increase with depth below the surface. The confidence interval for the predicted height of a surface at a point ranged from ±5.6 m to ±6.4 m. There was some evidence that the variance of the model error increased with depth, but no evidence that it differed between modellers or varied with the number of close-neighbouring boreholes or distance to the outcrop. These results are specific to the area that has been modelled, with relatively simple geology, and they must also reflect the relatively dense set of boreholes available for modelling. The method should be applied under a range of conditions to derive more general conclusions, and benchmark quality measures for three-dimensional models of contrasting terranes.
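The validation design described above can be sketched generically: withhold boreholes, rebuild the surface from the rest, and compute errors and a simple confidence half-width at the withheld locations. The 1-D inverse-distance interpolator and data below are invented for illustration; the study used full 3-D geological modelling by expert geologists, not an interpolator.

```python
import statistics

def idw(x, known):
    """Inverse-distance-weighted depth estimate from (position, depth) pairs."""
    num = sum(d / abs(x - xi) for xi, d in known)
    den = sum(1 / abs(x - xi) for xi, _ in known)
    return num / den

boreholes = [(0.0, 10.0), (1.0, 12.0), (2.0, 11.0), (3.0, 14.0), (4.0, 13.0)]
errors = []
for i, (x, depth) in enumerate(boreholes):       # leave-one-out validation
    rest = boreholes[:i] + boreholes[i + 1:]
    errors.append(idw(x, rest) - depth)          # model minus observation

half_width = 1.96 * statistics.pstdev(errors)    # ~95% interval half-width
print([round(e, 2) for e in errors], round(half_width, 2))
```

The half-width plays the role of the ±5.6 m to ±6.4 m confidence intervals reported in the abstract: it quantifies how far a predicted surface may sit from a truly observed one.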

  7. Modeling overland flow-driven erosion across a watershed DEM using the Landlab modeling framework.

    NASA Astrophysics Data System (ADS)

    Adams, J. M.; Gasparini, N. M.; Tucker, G. E.; Hobley, D. E. J.; Hutton, E. W. H.; Nudurupati, S. S.; Istanbulluoglu, E.

    2015-12-01

    Many traditional landscape evolution models assume steady-state hydrology when computing discharge, and generally route flow in a single direction, along the path of steepest descent. Previous work has demonstrated that, for larger watersheds or short-duration storms, hydrologic steady-state may not be achieved. In semiarid regions, often dominated by convective summertime storms, landscapes are likely heavily influenced by these short-duration but high-intensity periods of rainfall. To capture these geomorphically significant bursts of rain, a new overland flow method has been implemented in the Landlab modeling framework. This overland flow method routes a hydrograph across a landscape, and allows flow to travel in multiple directions out of a given grid node. This study compares traditional steady-state flow routing and incision methods to the new, hydrograph-driven overland flow and erosion model in Landlab. We propose that for short-duration, high-intensity precipitation events, steady-state, single-direction flow routing models will significantly overestimate discharge and erosion when compared with non-steady, multiple flow direction model solutions. To test this hypothesis, discharge and erosion are modeled using both steady-state and hydrograph methods. A stochastic storm generator is used to generate short-duration, high-intensity precipitation intervals, which drive modeled discharge and erosion across a watershed imported from a digital elevation model, highlighting Landlab's robust raster-gridding library and watershed modeling capabilities. For each storm event in this analysis, peak discharge at the outlet, incision rate at the outlet, as well as total discharge and erosion depth are compared between methods. Additionally, these results are organized by storm duration and intensity to understand how erosion rates scale with precipitation between both flow routing methods. Results show that in many cases traditional steady-state methods overestimate

  8. A framework for personalization of computational models of the human atria.

    PubMed

    Dössel, Olaf; Krueger, Martin W; Weber, Frank M; Schilling, Christopher; Schulze, Walther H W; Seemann, Gunnar

    2011-01-01

    A framework for step-by-step personalization of a computational model of the human atria is presented. Beginning with anatomical modeling based on CT or MRI data, fiber structure is next superimposed using a rule-based method. If available, late-enhancement MRI images can be considered in order to mark fibrotic tissue. A first estimate of individual electrophysiology is gained from BSPM data by solving the inverse problem of ECG. A final adjustment of electrophysiology is realized using intracardiac measurements. The framework is applied to data from several patients. A first clinical application will be computer-assisted planning of RF ablation for the treatment of atrial flutter and atrial fibrillation. PMID:22255296

  9. The Melanoma MAICare Framework: A Microsimulation Model for the Assessment of Individualized Cancer Care

    PubMed Central

    van der Meijde, Elisabeth; van den Eertwegh, Alfons J. M.; Linn, Sabine C.; Meijer, Gerrit A.; Fijneman, Remond J. A.; Coupé, Veerle M. H.

    2016-01-01

    Recently, new but expensive treatments have become available for metastatic melanoma. These improve survival, but in view of the limited funds available, cost-effectiveness needs to be evaluated. Most cancer cost-effectiveness models are based on observed clinical events such as recurrence-free and overall survival. Times at which events are recorded depend not only on the effectiveness of treatment but also on the timing of examinations and the types of tests performed. Our objective was to construct a microsimulation model framework that describes the melanoma disease process using a description of underlying tumor growth as well as its interaction with diagnostics, treatments, and surveillance. The framework should allow for exploration of the impact of simultaneously altering curative treatment approaches in different phases of the disease as well as altering diagnostics. The developed framework consists of two components, namely, the disease model and the clinical management module. The disease model consists of a tumor level, describing growth and metastasis of the tumor, and a patient level, describing clinically observed states, such as recurrence and death. The clinical management module consists of the care patients receive. This module interacts with the disease process, influencing the rate of transition between tumor growth states at the tumor level and the rate of detecting a recurrence at the patient level. We describe the framework as well as the required input and the model output. Furthermore, we illustrate model calibration using registry data and data from the literature. PMID:27346945
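The two-component structure described above (an underlying disease process plus a clinical management module that determines when events are observed) can be sketched as a toy microsimulation. States, rates, and the surveillance schedule are invented for illustration and are not the MAICare model.

```python
import random

def simulate_patient(p_progress, surveillance_every, horizon, rng):
    """Disease process advances monthly; recurrence is only seen at exams."""
    state, detected_at = 0, None           # 0 = occult, 1 = recurrence
    for month in range(1, horizon + 1):
        if state == 0 and rng.random() < p_progress:
            state = 1                      # disease model: tumor progresses
        if state == 1 and month % surveillance_every == 0:
            detected_at = month            # management module: exam detects it
            break
    return detected_at

rng = random.Random(42)
delays = [simulate_patient(0.05, 6, 60, rng) for _ in range(1000)]
detected = [d for d in delays if d is not None]
print(len(detected), sum(detected) / len(detected))
```

Note how the recorded event time depends on the exam schedule, not just on the biology; tightening `surveillance_every` shifts observed recurrence times earlier even though the disease process is unchanged, which is exactly the interaction the framework is built to capture.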

  10. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
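The chance-constraint idea above can be sketched with a toy terminating simulation: a resource level is feasible only if the estimated probability that the simulated completion time exceeds a deadline stays below a risk threshold, and the optimizer searches for the smallest feasible level. The "simulation" is a hypothetical stochastic stand-in, not the launch-vehicle model.

```python
import random

def simulate(servers, rng):
    """Stand-in terminating simulation: more servers, shorter times."""
    return sum(rng.expovariate(servers) for _ in range(5))

def min_feasible_servers(deadline, alpha, replications=2000, seed=1):
    """Smallest resource level with P(time > deadline) <= alpha (estimated)."""
    rng = random.Random(seed)
    for servers in range(1, 20):
        exceed = sum(simulate(servers, rng) > deadline
                     for _ in range(replications))
        if exceed / replications <= alpha:   # the chance constraint
            return servers
    return None

print(min_feasible_servers(deadline=2.0, alpha=0.05))
```

Because the constraint is evaluated from replications, it is itself a random estimate; the statistical analysis techniques mentioned in the abstract are what control that estimation error.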

  11. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  12. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  14. Building a Framework Earthquake Cycle Deformational Model for Subduction Megathrust Zones: Integrating Observations with Numerical Models

    NASA Astrophysics Data System (ADS)

    Furlong, Kevin P.; Govers, Rob; Herman, Matthew

    2016-04-01

    last for decades after a major event (e.g. Alaska 1964). We have integrated the observed patterns of upper-plate displacements (and deformation) with models of subduction zone evolution that allow us to incorporate both the transient behavior associated with post-earthquake viscous re-equilibration and the underlying long-term, relatively constant elastic strain accumulation. Modeling the earthquake cycle with a visco-elastic numerical model over numerous earthquake cycles, we have developed a framework model for the megathrust cycle that is constrained by observations made at a variety of plate boundary zones at different stages in their earthquake cycle (see paper by Govers et al., this meeting). Our results indicate that the observed patterns of co-, post- and inter-seismic deformation are largely controlled by the interplay between elastic and viscous processes. Observed displacements represent the competition between steady elastic-strain accumulation driven by plate boundary coupling, and post-earthquake viscous behavior in response to the coseismic loading of the system by the rapid elastic rebound. The application of this framework model to observations from subduction zone observatories points up the dangers of simply extrapolating current deformation observations to the overall strain accumulation state of the subduction zone, and allows us to develop improved assessments of the slip deficit accumulating within the seismogenic zone and the near-future earthquake potential of different segments of the subduction plate boundary.

  15. A framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps

    NASA Astrophysics Data System (ADS)

    Xu, Jin; Li, Zheng; Li, Shuliang; Zhang, Yanyan

    2015-07-01

    There is still a lack of effective paradigms and tools for analysing and discovering the contents and relationships of project knowledge contexts in the field of project management. In this paper, a new framework for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps under big data environments is proposed and developed. The conceptual paradigm, theoretical underpinning, extended topic model, and illustration examples of the ontology model for project knowledge maps are presented, with further research work envisaged.

  16. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity-based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help with parameter setting: the Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular, the framework provides a novel, flexible implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098
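
    The AIC/BIC parameter-setting step can be illustrated with a small stand-alone sketch (this is not ToPS code; the Markov-chain fitting below is a simplified stand-in, used only to show how the criteria trade likelihood against parameter count):

```python
import math
from collections import Counter

def aic(log_l, k):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * log_l

def bic(log_l, k, n):
    """Bayesian information criterion: k ln n - 2 ln L (lower is better)."""
    return k * math.log(n) - 2 * log_l

def markov_log_likelihood(seq, order):
    """Maximum-likelihood log-likelihood of seq under a Markov chain of given order."""
    ctx, trans = Counter(), Counter()
    for i in range(order, len(seq)):
        c = tuple(seq[i - order:i])
        ctx[c] += 1
        trans[c, seq[i]] += 1
    return sum(cnt * math.log(cnt / ctx[c]) for (c, _), cnt in trans.items())

seq = "ACGT" * 16                # a perfectly periodic toy sequence
for order in (0, 1, 2):
    k = (4 ** order) * 3         # free parameters over a 4-letter alphabet
    ll = markov_log_likelihood(seq, order)
    print(order, round(aic(ll, k), 1), round(bic(ll, k, len(seq)), 1))
```

    On this toy input both criteria prefer order 1: it already fits the sequence perfectly, and higher orders only add parameters.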

  17. Predicting the resilience and recovery of aquatic systems: A framework for model evolution within environmental observatories

    NASA Astrophysics Data System (ADS)

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C.; Coletti, Janaine Z.; Read, Jordan S.; Ibelings, Bas W.; Valesini, Fiona J.; Brookes, Justin D.

    2015-09-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management, however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework is comprised of a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  18. Alternative Model-Based and Design-Based Frameworks for Inference from Samples to Populations: From Polarization to Integration

    ERIC Educational Resources Information Center

    Sterba, Sonya K.

    2009-01-01

    A model-based framework, due originally to R. A. Fisher, and a design-based framework, due originally to J. Neyman, offer alternative mechanisms for inference from samples to populations. We show how these frameworks can utilize different types of samples (nonrandom or random vs. only random) and allow different kinds of inference (descriptive vs.…

  19. The Ecology of Human Performance Framework: A Model for Identifying and Designing Appropriate Accommodations for Adult Learners.

    ERIC Educational Resources Information Center

    Dunn, Winnie; Gilbert, Mary Pat; Parker, Kathy

    This paper proposes the Ecology of Human Performance (EHP) framework as a model for organizing adult basic education to utilize the skills of occupational therapists. The paper also includes two responses to the proposed framework, by Janet S. Stotts and Cheryl Keenan. Reasons for the inclusion of occupational therapy in adult education…

  20. The universal fuzzy logical framework of neural circuits and its application in modeling primary visual cortex.

    PubMed

    Hu, Hong; Li, Su; Wang, YunJiu; Qi, XiangLin; Shi, ZhongZhi

    2008-10-01

    Analytical study of large-scale nonlinear neural circuits is a difficult task. Here we analyze the function of neural systems by probing the fuzzy logical framework of the neural cells' dynamical equations. Although there is a close relation between the theories of fuzzy logical systems and neural systems, and many papers investigate this subject, most investigations focus on finding new functions of neural systems by hybridizing fuzzy logical and neural systems. In this paper, the fuzzy logical framework of individual neural cells is abstracted and used to understand the nonlinear dynamic attributes of a common neural system. Our analysis enables the educated design of network models for classes of computation. As an example, a recurrent network model of the primary visual cortex has been built and tested using this approach.

  1. A structured continuum modelling framework for martensitic transformation and reorientation in shape memory materials.

    PubMed

    Bernardini, Davide; Pence, Thomas J

    2016-04-28

    Models for shape memory material behaviour can be posed in the framework of a structured continuum theory. We study such a framework in which a scalar phase fraction field and a tensor field of martensite reorientation describe the material microstructure, in the context of finite strains. Gradients of the microstructural descriptors naturally enter the formulation and offer the possibility to describe and resolve phase transformation localizations. The constitutive theory is thoroughly described by a single free energy function in conjunction with a path-dependent dissipation function. Balance laws in the form of differential equations are obtained and contain both bulk and surface terms, the latter in terms of microstresses. A natural constraint on the tensor field for martensite reorientation gives rise to reactive fields in these balance laws. Conditions ensuring objectivity as well as the relation of this framework to that provided by currently used models for shape memory alloy behaviour are discussed.

  2. A structured continuum modelling framework for martensitic transformation and reorientation in shape memory materials.

    PubMed

    Bernardini, Davide; Pence, Thomas J

    2016-04-28

    Models for shape memory material behaviour can be posed in the framework of a structured continuum theory. We study such a framework in which a scalar phase fraction field and a tensor field of martensite reorientation describe the material microstructure, in the context of finite strains. Gradients of the microstructural descriptors naturally enter the formulation and offer the possibility to describe and resolve phase transformation localizations. The constitutive theory is thoroughly described by a single free energy function in conjunction with a path-dependent dissipation function. Balance laws in the form of differential equations are obtained and contain both bulk and surface terms, the latter in terms of microstresses. A natural constraint on the tensor field for martensite reorientation gives rise to reactive fields in these balance laws. Conditions ensuring objectivity as well as the relation of this framework to that provided by currently used models for shape memory alloy behaviour are discussed. PMID:27002064

  3. Alternative Model-Based and Design-Based Frameworks for Inference From Samples to Populations: From Polarization to Integration

    PubMed Central

    Sterba, Sonya K.

    2010-01-01

    A model-based framework, due originally to R. A. Fisher, and a design-based framework, due originally to J. Neyman, offer alternative mechanisms for inference from samples to populations. We show how these frameworks can utilize different types of samples (nonrandom or random vs. only random) and allow different kinds of inference (descriptive vs. analytic) to different kinds of populations (finite vs. infinite). We describe the extent of each framework's implementation in observational psychology research. After clarifying some important limitations of each framework, we describe how these limitations are overcome by a newer hybrid model/design-based inferential framework. This hybrid framework allows both kinds of inference to both kinds of populations, given a random sample. We illustrate implementation of the hybrid framework using the High School and Beyond data set. PMID:20411042

  4. Framework for non-coherent interface models at finite displacement jumps and finite strains

    NASA Astrophysics Data System (ADS)

    Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn

    2016-05-01

    This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, excludes anisotropic effects. A novel, extended constitutive framework which is consistent with the above-mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration, or equivalently, the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
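
    The "fiber model" restriction can be made concrete with a schematic traction-separation relation (notation assumed for illustration, not taken from the paper): if the Helmholtz energy depends on the displacement jump only through its magnitude, then

```latex
\psi = \hat{\psi}(\delta), \qquad
\delta = \lVert \llbracket \mathbf{u} \rrbracket \rVert, \qquad
\mathbf{t} = \frac{\partial \psi}{\partial \llbracket \mathbf{u} \rrbracket}
           = \hat{\psi}'(\delta)\, \frac{\llbracket \mathbf{u} \rrbracket}{\delta},
```

    so the traction is automatically collinear with the jump, and neither shear response nor anisotropy can be represented; the additional membrane and out-of-plane shear tractions of the proposed framework remove exactly this limitation.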

  5. Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning

    NASA Technical Reports Server (NTRS)

    Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael

    2011-01-01

    Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.

  6. Predicting the resilience and recovery of aquatic systems: a framework for model evolution within environmental observatories

    USGS Publications Warehouse

    Hipsey, Matthew R.; Hamilton, David P.; Hanson, Paul C.; Carey, Cayelan C; Coletti, Janaine Z; Read, Jordan S.; Ibelings, Bas W; Valensini, Fiona J; Brookes, Justin D

    2015-01-01

    Maintaining the health of aquatic systems is an essential component of sustainable catchment management, however, degradation of water quality and aquatic habitat continues to challenge scientists and policy-makers. To support management and restoration efforts aquatic system models are required that are able to capture the often complex trajectories that these systems display in response to multiple stressors. This paper explores the abilities and limitations of current model approaches in meeting this challenge, and outlines a strategy based on integration of flexible model libraries and data from observation networks, within a learning framework, as a means to improve the accuracy and scope of model predictions. The framework is comprised of a data assimilation component that utilizes diverse data streams from sensor networks, and a second component whereby model structural evolution can occur once the model is assessed against theoretically relevant metrics of system function. Given the scale and transdisciplinary nature of the prediction challenge, network science initiatives are identified as a means to develop and integrate diverse model libraries and workflows, and to obtain consensus on diagnostic approaches to model assessment that can guide model adaptation. We outline how such a framework can help us explore the theory of how aquatic systems respond to change by bridging bottom-up and top-down lines of enquiry, and, in doing so, also advance the role of prediction in aquatic ecosystem management.

  7. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its time-stepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework
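
    The standardized interface described above (control functions plus self-description functions) can be sketched minimally in Python; the class and function names here are illustrative assumptions loosely inspired by such interfaces, not the actual CSDMS or ESMF API:

```python
class LinearReservoir:
    """Toy process model exposing a framework-callable interface."""

    # --- control functions: the caller, not the model, drives time stepping ---
    def initialize(self, k=0.1, storage=100.0, dt=1.0):
        self._k, self._s, self._dt, self._t = k, storage, dt, 0.0

    def update(self):
        self._s -= self._k * self._s * self._dt   # advance state one time step
        self._t += self._dt

    def finalize(self):
        self._s = None

    # --- description functions: the model is self-describing ---
    def get_output_var_names(self):
        return ("reservoir_water__volume",)

    def get_value(self, name):
        if name != "reservoir_water__volume":
            raise KeyError(name)
        return self._s

    def get_time_step(self):
        return self._dt

    def get_current_time(self):
        return self._t

# A framework can now run any such model without knowing its internals:
m = LinearReservoir()
m.initialize(k=0.2, storage=50.0)
while m.get_current_time() < 3.0:
    m.update()
print(m.get_output_var_names()[0], round(m.get_value("reservoir_water__volume"), 3))
```

    Because every model answers the same control and description calls, a framework can query grids, units and time steps, and mediate between models automatically.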

  8. A Novel, Physics-Based Data Analytics Framework for Reducing Systematic Model Errors

    NASA Astrophysics Data System (ADS)

    Wu, W.; Liu, Y.; Vandenberghe, F. C.; Knievel, J. C.; Hacker, J.

    2015-12-01

    Most climate and weather models exhibit systematic biases, such as underpredicted diurnal temperatures in the WRF (Weather Research and Forecasting) model. General approaches to alleviate the systematic biases include improving model physics and numerics, improving data assimilation, and bias correction through post-processing. In this study, we developed a novel, physics-based data analytics framework in post-processing by taking advantage of ever-growing high-resolution (spatial and temporal) observational and modeling data. In the framework, a spatiotemporal PCA (Principal Component Analysis) is first applied on the observational data to filter out noise and information on scales that a model may not be able to resolve. The filtered observations are then used to establish regression relationships with archived model forecasts in the same spatiotemporal domain. The regressions along with the model forecasts predict the projected observations in the forecasting period. The pre-regression PCA procedure strengthens regressions, and enhances predictive skills. We then combine the projected observations with the past observations to apply PCA iteratively to derive the final forecasts. This post-regression PCA reconstructs variances and scales of information that are lost in the regression. The framework was examined and validated with 24 days of 5-minute observational data and archives from the WRF model at 27 stations near Dugway Proving Ground, Utah. The validation shows significant bias reduction in the diurnal cycle of predicted surface air temperature compared to the direct output from the WRF model. Additionally, unlike other post-processing bias correction schemes, the data analytics framework does not require long-term historic data and model archives. A week or two of the data is enough to take into account changes in weather regimes. The program, written in Python, is also computationally efficient.
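
    The pre-regression PCA filtering and regression steps can be sketched schematically with numpy on synthetic data (an illustrative reconstruction of the described pipeline, not the authors' code; the iterative post-regression PCA step is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_filter(X, n_modes):
    """Keep only the leading principal components of X (rows: times, cols: stations)."""
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean + (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]

# Synthetic truth: a diurnal temperature cycle at 5 stations; observations add
# noise, and the "model" has a systematic amplitude and offset bias.
t = np.arange(240.0)
truth = np.sin(2 * np.pi * t / 24)[:, None] * np.linspace(1.0, 2.0, 5)
obs = truth + 0.3 * rng.standard_normal(truth.shape)
model = 0.6 * truth + 1.5

# 1. Spatiotemporal PCA filters unresolvable noise from the observations.
obs_f = pca_filter(obs, n_modes=1)

# 2. Regress filtered observations on archived forecasts (station 0 shown).
A = np.column_stack([model[:, 0], np.ones_like(t)])
coef, *_ = np.linalg.lstsq(A, obs_f[:, 0], rcond=None)
corrected = A @ coef

raw_bias = np.abs(model[:, 0] - truth[:, 0]).mean()
corrected_bias = np.abs(corrected - truth[:, 0]).mean()
print(round(raw_bias, 3), round(corrected_bias, 3))
```

    The PCA filter removes most of the station noise before the regression is fit, which is what makes the short training window workable in the described scheme.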

  9. Adapting to Students' Social and Health Needs: Suggested Framework for Building Inclusive Models of Practice

    ERIC Educational Resources Information Center

    Schwitzer, Alan M.

    2009-01-01

    Objective: This article builds on earlier discussions about college health research. The author suggests a 5-step framework that research practitioners can use to build models of practice that accurately address the needs of diverse campus populations. Methods: The author provides 3 illustrations, drawn from published research examining college…

  10. A Model Driven Framework to Address Challenges in a Mobile Learning Environment

    ERIC Educational Resources Information Center

    Khaddage, Ferial; Christensen, Rhonda; Lai, Wing; Knezek, Gerald; Norris, Cathie; Soloway, Elliot

    2015-01-01

    In this paper a review of the pedagogical, technological, policy and research challenges and concepts underlying mobile learning is presented, followed by a brief description of categories of implementations. A model mobile learning framework and dynamic criteria for mobile learning implementations are proposed, along with a case study of one site…

  11. A Quality Framework for Continuous Improvement of e-Learning: The e-Learning Maturity Model

    ERIC Educational Resources Information Center

    Marshall, Stephen

    2010-01-01

    The E-Learning Maturity Model (eMM) is a quality improvement framework designed to help institutional leaders assess their institution's e-learning maturity. This paper reviews the eMM, drawing on examples of assessments conducted in New Zealand, Australia, the UK and the USA to show how it helps institutional leaders assess and compare their…

  12. Toward a Model Framework of Generalized Parallel Componential Processing of Multi-Symbol Numbers

    ERIC Educational Resources Information Center

    Huber, Stefan; Cornelsen, Sonja; Moeller, Korbinian; Nuerk, Hans-Christoph

    2015-01-01

    In this article, we propose and evaluate a new model framework of parallel componential multi-symbol number processing, generalizing the idea of parallel componential processing of multi-digit numbers to the case of negative numbers by considering the polarity signs similar to single digits. In a first step, we evaluated this account by defining…

  13. MODELING FRAMEWORK FOR EVALUATING SEDIMENTATION IN STREAM NETWORKS: FOR USE IN SEDIMENT TMDL ANALYSIS

    EPA Science Inventory

    A modeling framework that can be used to evaluate sedimentation in stream networks is described. This methodology can be used to determine sediment Total Maximum Daily Loads (TMDLs) in sediment impaired waters, and provide the necessary hydrodynamic and sediment-related data t...

  14. Beyond a Definition: Toward a Framework for Designing and Specifying Mentoring Models

    ERIC Educational Resources Information Center

    Dawson, Phillip

    2014-01-01

    More than three decades of mentoring research has yet to converge on a unifying definition of mentoring; this is unsurprising given the diversity of relationships classified as mentoring. This article advances beyond a definition toward a common framework for specifying mentoring models. Sixteen design elements were identified from the literature…

  15. Instructional Dissent in the College Classroom: Using the Instructional Beliefs Model as a Framework

    ERIC Educational Resources Information Center

    LaBelle, Sara; Martin, Matthew M.; Weber, Keith

    2013-01-01

    We examined the impact of instructor characteristics and student beliefs on students' decisions to enact instructional dissent using the Instructional Beliefs Model (IBM) as a framework. Students (N = 244) completed survey questionnaires assessing their perceptions of instructors' clarity, nonverbal immediacy, and affirming style, as well as their…

  16. An Analytical Framework for Evaluating E-Commerce Business Models and Strategies.

    ERIC Educational Resources Information Center

    Lee, Chung-Shing

    2001-01-01

    Considers electronic commerce as a paradigm shift, or a disruptive innovation, and presents an analytical framework based on the theories of transaction costs and switching costs. Topics include business transformation process; scale effect; scope effect; new sources of revenue; and e-commerce value creation model and strategy. (LRW)

  17. Applying the adult Education Framework to ESP Curriculum Development: An Integrative Model.

    ERIC Educational Resources Information Center

    Sifakis, N. C.

    2003-01-01

    Presents recent work in English for specific purposes (ESP)/languages for specific purposes (LSP) and adult education and puts forward an integrative model for ESP curriculum design. Outlines a set of characteristics that identify the ESP learner within the general adult learning framework. (Author/VWL)

  18. Map Resource Packet: Course Models for the History-Social Science Framework, Grade Seven.

    ERIC Educational Resources Information Center

    California State Dept. of Education, Sacramento.

    This packet of maps is an auxiliary resource to the "World History and Geography: Medieval and Early Modern Times. Course Models for the History-Social Science Framework, Grade Seven." The set includes: outline, precipitation, and elevation maps; maps for locating key places; landform maps; and historical maps. The maps are grouped under…

  19. Argumentation, Dialogue Theory, and Probability Modeling: Alternative Frameworks for Argumentation Research in Education

    ERIC Educational Resources Information Center

    Nussbaum, E. Michael

    2011-01-01

    Toulmin's model of argumentation, developed in 1958, has guided much argumentation research in education. However, argumentation theory in philosophy and cognitive science has advanced considerably since 1958. There are currently several alternative frameworks of argumentation that can be useful for both research and practice in education. These…

  20. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework

    EPA Science Inventory

    Best management practices (BMPs) are perceived as being effective in reducing nutrient loads transported from non-point sources (NPS) to receiving water bodies. The objective of this study was to develop a modeling-optimization framework that can be used by watershed management p...

  1. A Supervisory Issue When Utilizing the ASCA National Model Framework in School Counseling

    ERIC Educational Resources Information Center

    Bryant-Young, Necole; Bell, Catherine A.; Davis, Kalena M.

    2014-01-01

    The authors discuss a supervisory issue: the ASCA National Model: A Framework for School Counseling Programs does not emphasize on-going supervision in which the ethical expectations of supervisors and supervisees in a school setting are clearly defined. Subsequently, the authors highlight supervisor expectations stated within the ASCA National…

  2. PIRPOSAL Model of Integrative STEM Education: Conceptual and Pedagogical Framework for Classroom Implementation

    ERIC Educational Resources Information Center

    Wells, John G.

    2016-01-01

    The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…

  3. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    SciTech Connect

    Ackerman, Thomas P.

    2015-03-01

    The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a 2-moment microphysics scheme rather than the single-moment scheme used in all the MMF runs to date. The technical report and associated documents describe the results of testing the cloud resolving model with fixed boundary conditions and evaluation of model results with data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  4. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth curve models when change…

  5. A New Object-Oriented MODFLOW Framework for Coupling Multiple Hydrologic Models

    NASA Astrophysics Data System (ADS)

    Langevin, C.; Hughes, J. D.; Panday, S. M.; Banta, E. R.; Niswonger, R. G.

    2014-12-01

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. For 30 years, the MODFLOW program has been widely used by academic researchers, private consultants, and government scientists to accurately, reliably, and efficiently simulate groundwater flow. With time, growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion, has led to the development of numerous MODFLOW versions. Although these MODFLOW versions are often based on the core version (presently MODFLOW-2005), there are often incompatibilities that restrict their use with one another. In many cases, development of these alternative versions has been challenging due to the underlying MODFLOW structure, which was designed for simulation with a single groundwater flow model using a rectilinear grid. A new object-oriented framework is being developed for MODFLOW to provide a platform for supporting multiple models and multiple types of models within the same simulation. In the new design, any number of numerical models can be tightly coupled at the matrix level by adding them to the same numerical solution, or they can be iteratively coupled until there is convergence between them. Transfer of information between models is isolated to exchange objects, which allow models to be developed and used independently. For existing MODFLOW users, this means that the program can function in the same way it always has for a single groundwater flow model. Within this new framework, a regional-scale groundwater model may be coupled with multiple local-scale groundwater models. Or, a surface water flow model can be coupled to multiple groundwater flow models. The framework naturally allows for the simulation of solute transport. Presently, unstructured control-volume finite-difference models have been implemented in the framework for three-dimensional groundwater
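
    The exchange-object idea (all transfer of information between models is isolated in a separate object, so the models themselves stay independent) can be sketched with two toy one-cell "models" coupled iteratively; everything below is an illustrative assumption, not MODFLOW code:

```python
class OneCellModel:
    """Toy groundwater model: a single head h tied to a boundary head hb
    through conductance c, plus whatever flux an exchange injects."""

    def __init__(self, boundary_head, conductance):
        self.hb, self.c = boundary_head, conductance
        self.h = boundary_head

    def solve(self, exchange_flux):
        # steady balance: c * (hb - h) + exchange_flux = 0
        self.h = self.hb + exchange_flux / self.c

class Exchange:
    """Isolates all inter-model communication between models a and b."""

    def __init__(self, a, b, conductance):
        self.a, self.b, self.c = a, b, conductance

    def flux(self):
        return self.c * (self.a.h - self.b.h)   # positive: flow from a to b

a = OneCellModel(boundary_head=10.0, conductance=2.0)
b = OneCellModel(boundary_head=4.0, conductance=1.0)
ex = Exchange(a, b, conductance=0.5)

# Iterative (Picard-style) coupling: repeat until the interface flux settles.
for _ in range(100):
    q = ex.flux()
    a.solve(-q)   # the flux leaves model a ...
    b.solve(+q)   # ... and enters model b
print(round(ex.flux(), 6), round(a.h, 4), round(b.h, 4))
```

    Each model only ever sees a flux; the tightly coupled option the abstract mentions would instead assemble both models into one matrix and replace the iteration with a single solution.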

  6. TLS and photogrammetry for the modeling of a historic wooden framework

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, M.

    2012-04-01

    The building that is the object of this study is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the XVIth century until the French Revolution. Its architecture represents the Renaissance style of the XVIth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005, the municipality of Andlau became the owner of this Seigneury, which is intended to house the future Heritage Interpretation Center (HIC); a museum is also going to be created there. Three levels of attic of this building are going to be restored and insulated; the historic framework will thereby be masked and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework in order to support diagnostics and to build and consolidate knowledge of this type of historic framework. Beyond virtual visualization, we provided other applications, in particular an accurate 3D model of the framework for animations, as a foundation for a historical information system, and as digital data for the future museum and HIC. The project contains different phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration in a historic information system. All levels of the attic were acquired: a Trimble GX 3D scanner and, in part, a Trimble CX scanner were used, in particular for the acquisition of data in the highest part of the framework. The various scans were directly georeferenced in the field using control points, then merged into a single point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury. The purpose of the project was to supply a 3D model allowing the creation of scenographies and interactive

  7. The ObjECTS: Framework for Integrated Assessment: Hybrid Modeling of Transportation

    SciTech Connect

    Kim, Son H.; Edmonds, James A.; Lurz, Joshua; Smith, Steven J.; Wise, Marshall A.

    2006-09-01

    Technology is a central issue for the global climate change problem, requiring analysis tools that can examine the impact of specific technologies within a long-term, global context. This paper describes the architecture of the ObjECTS-MiniCAM integrated assessment model, which implements a long-term, global model of energy, economy, agriculture, land-use, atmosphere, and climate change in a framework that allows the flexible incorporation of explicit technology detail. We describe the implementation of a "bottom-up" representation of the transportation sector as an illustration of this approach, in which the resulting hybrid model is fully integrated, internally consistent, and theoretically compatible with the regional and global modeling framework. The analysis of the transportation sector presented here supports and clarifies the need for a comprehensive strategy promoting advanced vehicle technologies and an economy-wide carbon policy to cost-effectively reduce carbon emissions from the transportation sector in the long term.

  8. An approximate framework for quantum transport calculation with model order reduction

    SciTech Connect

    Chen, Quan; Li, Jun; Yam, Chiyung; Zhang, Yu; Wong, Ngai; Chen, Guanhua

    2015-04-01

    A new approximate computational framework is proposed for computing the non-equilibrium charge density in the context of the non-equilibrium Green's function (NEGF) method for quantum mechanical transport problems. The framework consists of a new formulation, called the X-formulation, for single-energy density calculation based on the solution of sparse linear systems, and a projection-based nonlinear model order reduction (MOR) approach to address the large number of energy points required for large applied biases. The advantages of the new methods are confirmed by numerical experiments.
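
    As a rough illustration of the projection-based MOR step, the sketch below compresses repeated solves of a parameterized linear system (A + E·I)x = b, an assumed stand-in for the paper's single-energy calculations; the actual X-formulation and NEGF equations are not reproduced here, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the single-energy linear systems: solve (A + E*I) x = b
# over many energies E (surrogate problem, not the actual NEGF equations).
n = 200
A = np.diag(np.arange(1.0, n + 1.0))
P = 0.01 * rng.normal(size=(n, n))
A = A + P @ P.T                      # symmetric positive definite operator
b = rng.normal(size=n)
I = np.eye(n)

def full_solve(E):
    return np.linalg.solve(A + E * I, b)

# 1) snapshots from a few expensive full solves
train_E = np.linspace(0.0, 1.0, 8)
S = np.column_stack([full_solve(E) for E in train_E])

# 2) orthonormal projection basis via SVD (the projection-based MOR step)
U, s, _ = np.linalg.svd(S, full_matrices=False)
V = U[:, s > 1e-10 * s[0]]

# 3) cheap reduced solves at any other energy (Galerkin projection)
def reduced_solve(E):
    Ar = V.T @ (A + E * I) @ V
    return V @ np.linalg.solve(Ar, V.T @ b)
```

In a practical implementation the reduced operator would be assembled once per parameter-affine term rather than reprojected at every energy; the sketch keeps the projection explicit for clarity.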

  9. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot a first investigation of how changing framework variables alters perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft Systems Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element includes three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the

  10. Representing natural and manmade drainage systems in an earth system modeling framework

    SciTech Connect

    Li, Hongyi; Wu, Huan; Huang, Maoyi; Leung, Lai-Yung R.

    2012-08-27

    Drainage systems can be categorized into natural or geomorphological drainage systems, agricultural drainage systems and urban drainage systems. They interact closely among themselves and with climate and human society, particularly under extreme climate and hydrological events such as floods. This editorial articulates the need to holistically understand and model drainage systems in the context of climate change and human influence, and discusses the requirements and examples of feasible approaches to representing natural and manmade drainage systems in an earth system modeling framework.

  11. Development of an integrated modelling framework: comparing client-server and demand-driven control flow for model execution

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Karssenberg, Derek; de Jong, Kor; de Kok, Jean-Luc; de Jong, Steven M.

    2014-05-01

    The construction of hydrological models at the catchment or global scale depends on the integration of component models representing various environmental processes, often operating at different spatial and temporal discretisations. A flexible construction of spatio-temporal model components, a means to specify aggregation or disaggregation to bridge discretisation discrepancies, ease of coupling these into complex integrated models, and support for stochastic modelling and the assessment of model outputs are the desired functionalities for the development of integrated models. These functionalities are preferably combined into one modelling framework such that domain specialists can perform exploratory model development without the need to change their working environment. We implemented an integrated modelling framework in the Python programming language, providing support for 1) model construction and 2) model execution. The framework enables modellers to represent spatio-temporal processes or to specify spatio-temporal (dis)aggregation with map algebra operations provided by the PCRaster library. Model algebra operations can be used by the modeller to specify the exchange of data and therefore the coupling of components. The framework determines the control flow for the ordered execution based on the time steps and couplings of the model components given by the modeller. We implemented two different control flow mechanisms. First, a client-server approach is used with a central entity controlling the execution of the component models and steering the data exchange. Second, a demand-driven approach is used that triggers the execution of a component model when data is requested by a coupled component model. We show that both control flow mechanisms allow for the execution of stochastic, multi-scale integrated models. 
We examine the implications of each control flow mechanism on the terminology used by the modeller to specify integrated models, and illustrate the
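
    The two control flow mechanisms can be caricatured in a few lines. The component names, couplings, and dynamics below are invented; couplings read the previous time step, so both mechanisms produce identical output:

```python
from functools import lru_cache

# Three toy coupled components and their input couplings (assumed names).
components = {"rain": [], "runoff": ["rain"], "routing": ["runoff"]}

def run_client_server(components, n_steps):
    # a central entity controls execution order and steers the data exchange
    prev = {name: 0.0 for name in components}
    for _ in range(n_steps):
        cur = {name: prev[name] + 1.0 + sum(prev[i] for i in inputs)
               for name, inputs in components.items()}
        prev = cur
    return prev

def run_demand_driven(components, n_steps):
    # a component executes only when its output is requested downstream
    @lru_cache(maxsize=None)
    def out(name, t):
        if t < 0:
            return 0.0
        return out(name, t - 1) + 1.0 + sum(out(i, t - 1) for i in components[name])
    return {name: out(name, n_steps - 1) for name in components}

print(run_client_server(components, 5))
print(run_demand_driven(components, 5))
```

With couplings on the current rather than the previous time step, the client-server result would depend on the execution order fixed by the central entity, which is the kind of implication for the modeller's terminology that the paper examines.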

  12. Framework for microbial food-safety risk assessments amenable to Bayesian modeling.

    PubMed

    Williams, Michael S; Ebel, Eric D; Vose, David

    2011-04-01

    Regulatory agencies often perform microbial risk assessments to evaluate the change in the number of human illnesses as the result of a new policy that reduces the level of contamination in the food supply. These agencies generally have regulatory authority over the production and retail sectors of the farm-to-table continuum. Any predicted change in contamination that results from new policy that regulates production practices occurs many steps prior to consumption of the product. This study proposes a framework for conducting microbial food-safety risk assessments; this framework can be used to quantitatively assess the annual effects of national regulatory policies. Advantages of the framework are that estimates of human illnesses are consistent with national disease surveillance data (which are usually summarized on an annual basis) and some of the modeling steps that occur between production and consumption can be collapsed or eliminated. The framework leads to probabilistic models that include uncertainty and variability in critical input parameters; these models can be solved using a number of different Bayesian methods. The Bayesian synthesis method performs well for this application and generates posterior distributions of parameters that are relevant to assessing the effect of implementing a new policy. An example, based on Campylobacter and chicken, estimates the annual number of illnesses avoided by a hypothetical policy; this output could be used to assess the economic benefits of a new policy. Empirical validation of the policy effect is also examined by estimating the annual change in the numbers of illnesses observed via disease surveillance systems.
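
    A minimal sketch of the kind of calculation the framework supports, with every number invented: a Beta posterior for contamination prevalence is propagated to an annual count of illnesses avoided by a hypothetical policy, anchored to a surveillance-based illness estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative numbers only -- not from the paper.
pos, n_tested = 30, 1000                               # baseline contamination tests
prev = rng.beta(1 + pos, 1 + n_tested - pos, 50000)    # Beta posterior draws

illnesses = 120000                  # annual illnesses from surveillance (assumed)
k = illnesses / prev.mean()         # illnesses per unit prevalence, anchored
reduction = 0.40                    # hypothetical policy effect on prevalence

avoided = k * prev * reduction      # posterior distribution of avoided cases
lo, hi = np.percentile(avoided, [2.5, 97.5])
print(f"mean {avoided.mean():.0f}, 95% CI [{lo:.0f}, {hi:.0f}]")
```

Anchoring the illness count to surveillance data, as here, is what lets the intermediate farm-to-table modeling steps collapse; a full Bayesian synthesis would treat `k` and the surveillance count as uncertain as well.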

  13. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    SciTech Connect

    Hadjidoukas, P.E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
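
    A toy, single-parameter sketch of the TMCMC idea (tempered importance resampling followed by a Metropolis move), using an assumed Gaussian prior/likelihood pair with a known posterior; the parallel scheduling, CMA-ES, and subset simulation parts of Π4U are not represented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy 1-D problem with a known answer: prior N(0,1) and a Gaussian likelihood
# centred at 2 with sigma 0.5 give a posterior N(1.6, 0.2).
def log_prior(th):
    return -0.5 * th ** 2

def log_like(th):
    return -0.5 * ((th - 2.0) / 0.5) ** 2

n = 5000
theta = rng.normal(0.0, 1.0, n)   # stage 0: samples from the prior
p = 0.0                            # tempering exponent: 0 = prior, 1 = posterior

while p < 1.0:
    logL = log_like(theta)

    def weight_cov(q):
        w = np.exp((q - p) * (logL - logL.max()))
        return w.std() / w.mean()

    # pick the next exponent so the weight coefficient of variation is ~1
    if weight_cov(1.0) <= 1.0:
        q = 1.0
    else:
        lo, hi = p, 1.0
        for _ in range(60):       # bisection on the weight COV
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if weight_cov(mid) > 1.0 else (mid, hi)
        q = hi

    w = np.exp((q - p) * (logL - logL.max()))
    theta = theta[rng.choice(n, size=n, p=w / w.sum())]   # resampling

    # one vectorised Metropolis move targeting the tempered posterior
    prop = theta + 0.5 * rng.normal(size=n)
    log_t = lambda th: log_prior(th) + q * log_like(th)
    accept = np.log(rng.random(n)) < log_t(prop) - log_t(theta)
    theta = np.where(accept, prop, theta)
    p = q

print(theta.mean(), theta.std())   # posterior mean 1.6, sd ~0.447
```

The per-sample likelihood evaluations and Metropolis moves are embarrassingly parallel, which is what the framework's task-based load balancing exploits.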

  14. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  15. A Modeling Framework to Quantify Dilution Enhancement in Spatially Heterogeneous Aquifers

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe; Fiori, Aldo; Boso, Francesca; Bellin, Alberto

    2016-04-01

    Solute dilution rates are strongly affected by the spatial fluctuations of the permeability. Current challenges consist of establishing a quantitative link between the statistical properties of the heterogeneous porous medium and the concentration field. Proper quantification of solute dilution is crucial for the success of a remediation campaign and for risk assessment. In this work, we provide a modeling framework to quantify the dilution of a non-reactive solute. More precisely, we model the heterogeneity-induced dilution enhancement within a steady-state flow. Adopting the Lagrangian framework, we obtain semi-analytical solutions for the dilution index as a function of the structural parameters characterizing the permeability field. The solutions provided are valid for uniform-in-the-mean steady flow fields, a small injection source, and weak-to-mild heterogeneity in the log-permeability. Results show how the dilution enhancement of the solute plume depends on the statistical anisotropy ratio and the heterogeneity level of the porous medium. The modeling framework also captures the temporal evolution of the dilution rate at distinct time regimes, thus recovering previous results from the literature. Finally, the performance of the framework is verified with high resolution numerical results and successfully tested against the Cape Cod field data.
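
    The dilution index itself is straightforward to compute numerically. The sketch below evaluates Kitanidis' dilution index for an assumed 1-D Gaussian plume at three widths; for a Gaussian profile the exact value is σ√(2πe) ≈ 4.13σ, so the index grows as the plume dilutes:

```python
import numpy as np

# Kitanidis' dilution index for a 1-D concentration profile; all numbers
# below are illustrative, not the paper's semi-analytical solutions.
def dilution_index(x, c):
    dx = x[1] - x[0]
    p = c / np.sum(c * dx)               # concentration normalised to a pdf
    p = np.where(p > 0.0, p, 1e-300)     # guard the log
    return float(np.exp(-np.sum(p * np.log(p) * dx)))

x = np.linspace(-50.0, 50.0, 4001)
Es = []
for sigma in (1.0, 2.0, 4.0):
    c = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    Es.append(dilution_index(x, c))
    print(sigma, Es[-1])
```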

  16. Discrete Element Framework for Modelling Extracellular Matrix, Deformable Cells and Subcellular Components

    PubMed Central

    Gardiner, Bruce S.; Wong, Kelvin K. L.; Joldes, Grand R.; Rich, Addison J.; Tan, Chin Wee; Burgess, Antony W.; Smith, David W.

    2015-01-01

    This paper presents a framework for modelling biological tissues based on discrete particles. Cell components (e.g. cell membranes, cell cytoskeleton, cell nucleus) and extracellular matrix (e.g. collagen) are represented using collections of particles. Simple particle to particle interaction laws are used to simulate and control complex physical interaction types (e.g. cell-cell adhesion via cadherins, integrin basement membrane attachment, cytoskeletal mechanical properties). Particles may be given the capacity to change their properties and behaviours in response to changes in the cellular microenvironment (e.g., in response to cell-cell signalling or mechanical loadings). Each particle is in effect an ‘agent’, meaning that the agent can sense local environmental information and respond according to pre-determined or stochastic events. The behaviour of the proposed framework is exemplified through several biological problems of ongoing interest. These examples illustrate how the modelling framework allows enormous flexibility for representing the mechanical behaviour of different tissues, and we argue this is a more intuitive approach than that offered by traditional continuum methods. Because of this flexibility, we believe the discrete modelling framework provides an avenue for biologists and bioengineers to explore the behaviour of tissue systems in a computational laboratory. PMID:26452000
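
    A minimal sketch of the particle-to-particle interaction idea: two particles joined by a damped linear spring, integrated with semi-implicit Euler. All parameters are assumed, and real cell models use much richer interaction laws (adhesion, attachment, signalling-dependent behaviour):

```python
import numpy as np

# Pairwise linear spring law: attractive when stretched past the rest
# length, repulsive when compressed (toy stand-in for a DEM interaction).
def spring_forces(pos, pairs, k=1.0, rest=1.0):
    f = np.zeros_like(pos)
    for i, j in pairs:
        d = pos[j] - pos[i]
        r = np.linalg.norm(d)
        fij = k * (r - rest) * d / r
        f[i] += fij
        f[j] -= fij
    return f

# two particles 1.5 apart relax toward the rest separation of 1.0
pos = np.array([[0.0, 0.0], [1.5, 0.0]])
vel = np.zeros_like(pos)
pairs = [(0, 1)]
dt, gamma = 0.05, 0.5                       # time step, viscous damping
for _ in range(2000):
    acc = spring_forces(pos, pairs) - gamma * vel
    vel += dt * acc                          # semi-implicit Euler
    pos += dt * vel
print(np.linalg.norm(pos[1] - pos[0]))
```

Agent-like behaviour would enter by letting `k`, `rest`, or the pair list change per particle in response to the local microenvironment.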

  17. A framework for analyzing the robustness of movement models to variable step discretization.

    PubMed

    Schlägel, Ulrike E; Lewis, Mark A

    2016-10-01

    When sampling animal movement paths, the frequency at which location measurements are attempted is a critical feature for data analysis. Important quantities derived from raw data, e.g. travel distance or sinuosity, can differ substantially depending on the temporal resolution of the data. Likewise, when movement models are fitted to data, parameter estimates have been demonstrated to vary with sampling rate. Thus, biological statements derived from such analyses can only be made with respect to the resolution of the underlying data, limiting extrapolation of results and comparison between studies. To address this problem, we investigate whether there are models that are robust against changes in temporal resolution. First, we propose a mathematically rigorous framework, in which we formally define robustness as a model property. We then use the framework for a thorough assessment of a range of basic random walk models, in which we also show how robustness relates to other probabilistic concepts. While we found robustness to be a strong condition met by only a few models, we suggest a new method to extend models so as to make them robust. Our framework provides a new systematic, mathematically founded approach to the question of whether, and how, the sampling rate of movement paths affects statistical inference.
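
    The core phenomenon is easy to reproduce: thinning a simulated Brownian path changes the apparent travel distance, while a mean-squared-displacement estimator of the diffusion coefficient is nearly unchanged (a robust quantity in the paper's sense). All parameters below are assumed:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a 2-D Brownian path at fine resolution, then thin it to mimic
# coarser sampling rates.
D, dt, n = 1.0, 0.01, 200000
path = np.cumsum(rng.normal(0.0, np.sqrt(2 * D * dt), size=(n, 2)), axis=0)

results = {}
for thin in (1, 10, 100):
    p = path[::thin]
    step_dt = dt * thin
    lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)
    travel = lengths.sum()                          # resolution-dependent
    D_hat = np.mean(lengths ** 2) / (4 * step_dt)   # robust MSD estimator
    results[thin] = (travel, D_hat)
    print(thin, round(travel), round(D_hat, 3))
```

For Brownian motion the apparent travel distance shrinks roughly as the square root of the thinning factor, so any sinuosity-type statistic inherits the sampling rate.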

  18. Evaluation framework for nursing education programs: application of the CIPP model.

    PubMed

    Singh, Mina D

    2004-01-01

    It is advised that all nursing education programs conduct program evaluations to address accountability requirements and to provide information for planning and guiding the delivery of the programs. Stufflebeam's CIPP Model, supported by triangulation of multiple modes of data collection, provides such a theoretical framework for evaluations. This article proposes a total CIPP evaluation framework for nursing education programs. While this evaluation framework is applicable to any nursing evaluation program, it is practically useful for collaborative nursing programs, as it allows a full assessment of each partner in its context. Under the direction of this author, the York-Seneca-Georgian-Durham collaborative BScN Program Evaluation Committee in Ontario developed and utilized a CIPP process evaluation.

  19. An Automated Application Framework to Model Disordered Materials Based on a High Throughput First Principles Approach

    NASA Astrophysics Data System (ADS)

    Oses, Corey; Yang, Kesong; Curtarolo, Stefano; Duke Univ Collaboration; UC San Diego Collaboration

    Predicting material properties of disordered systems remains a long-standing and formidable challenge in rational materials design. To address this issue, we introduce an automated software framework capable of modeling partial occupation within disordered materials using a high-throughput (HT) first principles approach. At the heart of the approach is the construction of supercells containing a virtually equivalent stoichiometry to the disordered material. All unique supercell permutations are enumerated and material properties of each are determined via HT electronic structure calculations. In accordance with a canonical ensemble of supercell states, the framework evaluates ensemble average properties of the system as a function of temperature. As proof of concept, we examine the framework's final calculated properties of a zinc chalcogenide (ZnS1-xSex), a wide-gap oxide semiconductor (MgxZn1-xO), and an iron alloy (Fe1-xCux) at various stoichiometries.
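
    The canonical-ensemble averaging step can be sketched directly; the energies, band gaps, and degeneracies below are made-up placeholders for the enumerated unique supercells:

```python
import numpy as np

# Boltzmann-weighted ensemble average over enumerated supercell states.
k_B = 8.617333e-5                      # eV/K
E   = np.array([0.00, 0.03, 0.08])     # supercell energies above ground state (eV)
gap = np.array([2.10, 2.25, 2.40])     # property of each unique supercell (eV)
g   = np.array([4, 8, 2])              # number of equivalent permutations

def ensemble_average(prop, E, g, T):
    w = g * np.exp(-(E - E.min()) / (k_B * T))
    return float(np.sum(w * prop) / np.sum(w))

for T in (300.0, 1000.0):
    print(T, ensemble_average(gap, E, g, T))
```

At low temperature the average approaches the ground-state value; at high temperature it approaches the degeneracy-weighted mean over all permutations.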

  20. A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad D.; Robbins, Jeanne C.

    1999-01-01

    As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997. Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina

  1. Implementation and validation of a modeling framework to assess personal exposure to black carbon.

    PubMed

    Dons, Evi; Van Poppel, Martine; Kochan, Bruno; Wets, Geert; Int Panis, Luc

    2014-01-01

    Because people tend to move from one place to another during the day, their exposure to air pollution will be determined by the concentration at each location combined with the exposure encountered in transport. In order to estimate the exposure of individuals in a population more accurately, the activity-based modeling framework for Black Carbon exposure assessment, AB(2)C, was developed. An activity-based traffic model was applied to model the whereabouts of individual agents. Exposure to black carbon (BC) in different microenvironments is assessed with a land use regression model, combined with a fixed indoor/outdoor factor for exposure in indoor environments. To estimate exposure in transport, a separate model was used that takes into account transport mode, timing of the trip, and degree of urbanization. The modeling framework is validated using weeklong time-activity diaries and BC exposure as revealed by a personal monitoring campaign with 62 participants. For each participant in the monitoring campaign, a synthetic population of 100 model-agents per day was constructed, with all agents meeting the same preconditions as the corresponding real-life agent. When these model-agents pass through every stage of the modeling framework, the result is a distribution of potential exposures for each individual. The AB(2)C model estimates average personal exposure slightly more accurately than ambient concentrations predicted for the home subzone; however, the added value of a dynamic model lies in its potential for detecting short-term peak exposures rather than modeling average exposures. The latter may bring new opportunities to epidemiologists: studying the effect of frequently repeated but short exposure peaks on long-term exposure and health. PMID:24161448
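
    The bookkeeping at the heart of such an exposure model is a time-weighted average over microenvironments. Every concentration and diary entry below is an assumed stand-in for the land use regression, indoor/outdoor, and transport sub-models:

```python
# Time-weighted BC exposure across microenvironments (all numbers invented).
io_factor = 0.7                               # fixed indoor/outdoor factor
conc = {
    "home_indoor": 1.8 * io_factor,           # LUR outdoor estimate x I/O factor
    "work_indoor": 2.5 * io_factor,
    "car": 6.0,                               # mode-specific transport exposure
    "bike": 4.5,
}
diary = [("home_indoor", 14.0), ("work_indoor", 8.0), ("car", 1.5), ("bike", 0.5)]

total_h = sum(h for _, h in diary)
exposure = sum(conc[env] * h for env, h in diary) / total_h
print(f"time-weighted exposure: {exposure:.2f} ug/m3")
```

Running many synthetic diaries through this calculation, rather than one, is what yields the per-person exposure distribution described above.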

  2. Performance of default risk model with barrier option framework and maximum likelihood estimation: Evidence from Taiwan

    NASA Astrophysics Data System (ADS)

    Chou, Heng-Chih; Wang, David

    2007-11-01

    We investigate the performance of a default risk model based on the barrier option framework with maximum likelihood estimation. We provide empirical validation of the model by showing that implied default barriers are statistically significant for a sample of construction firms in Taiwan over the period 1994-2004. We find that our model dominates the commonly adopted models: the Merton model, the Z-score model, and the ZETA model. Moreover, we test the n-year-ahead prediction performance of the model and find evidence that the prediction accuracy of the model improves as the forecast horizon decreases. Finally, we assess the effect of estimated default risk on equity returns and find that default risk is able to explain equity returns and that default risk is a variable worth considering in asset-pricing tests, above and beyond size and book-to-market.
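
    For orientation, the Merton model used as a benchmark in the paper can be sketched in a few lines; the barrier-option variant additionally estimates an implied default barrier by maximum likelihood, which is omitted here, and all inputs are illustrative:

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_prob(V, D, mu, sigma, T):
    """Physical default probability P(V_T < D) under the Merton model.

    V: asset value, D: debt face value, mu: asset drift,
    sigma: asset volatility, T: horizon in years.
    """
    d2 = (log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return NormalDist().cdf(-d2)

print(round(merton_default_prob(V=100.0, D=70.0, mu=0.05, sigma=0.30, T=1.0), 4))
```

In the barrier framework the firm defaults the first time assets touch the barrier rather than only at maturity, which is why the implied barrier becomes an additional estimated parameter.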

  3. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework

    PubMed Central

    Fletcher, Rachel E.; Wells, Stephen A.; Leung, Ka Ming; Edwards, Peter P.; Sartbaeva, Asel

    2015-01-01

    Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves – as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si—O—Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the ‘rigid unit mode’ (RUM) model, and in real-space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the ‘flexibility window’ phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework. PMID:26634720

  4. Intrinsic flexibility of porous materials; theory, modelling and the flexibility window of the EMT zeolite framework.

    PubMed

    Fletcher, Rachel E; Wells, Stephen A; Leung, Ka Ming; Edwards, Peter P; Sartbaeva, Asel

    2015-12-01

    Framework materials have structures containing strongly bonded polyhedral groups of atoms connected through their vertices. Typically the energy cost for variations of the inter-polyhedral geometry is much less than the cost of distortions of the polyhedra themselves - as in the case of silicates, where the geometry of the SiO4 tetrahedral group is much more strongly constrained than the Si-O-Si bridging angle. As a result, framework materials frequently display intrinsic flexibility, and their dynamic and static properties are strongly influenced by low-energy collective motions of the polyhedra. Insight into these motions can be obtained in reciprocal space through the `rigid unit mode' (RUM) model, and in real-space through template-based geometric simulations. We briefly review the framework flexibility phenomena in energy-relevant materials, including ionic conductors, perovskites and zeolites. In particular we examine the `flexibility window' phenomenon in zeolites and present novel results on the flexibility window of the EMT framework, which shed light on the role of structure-directing agents. Our key finding is that the crown ether, despite its steric bulk, does not limit the geometric flexibility of the framework.

  5. The Framework for 0-D Atmospheric Modeling (F0AM) v3.1

    NASA Astrophysics Data System (ADS)

    Wolfe, Glenn M.; Marvin, Margaret R.; Roberts, Sandra J.; Travis, Katherine R.; Liao, Jin

    2016-09-01

    The Framework for 0-D Atmospheric Modeling (F0AM) is a flexible and user-friendly MATLAB-based platform for simulation of atmospheric chemistry systems. The F0AM interface incorporates front-end configuration of observational constraints and model setups, making it readily adaptable to simulation of photochemical chambers, Lagrangian plumes, and steady-state or time-evolving solar cycles. Six different chemical mechanisms and three options for calculation of photolysis frequencies are currently available. Example simulations are presented to illustrate model capabilities and, more generally, highlight some of the advantages and challenges of 0-D box modeling.
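
    A box model in this spirit reduces to integrating chemical rate equations in zero dimensions. The toy below integrates the NO/NO2/O3 photostationary system with forward Euler; F0AM itself couples full mechanisms to MATLAB solvers, and the rate constants and initial concentrations here are rough assumed values:

```python
# Toy 0-D box model of the NO/NO2/O3 photostationary state.
j_no2 = 8.0e-3          # NO2 photolysis frequency (1/s), assumed midday value
k_no_o3 = 1.8e-14       # NO + O3 rate constant (cm3/s), approximate

no, no2, o3 = 2.5e10, 2.5e10, 7.5e11   # molecules/cm3, assumed
dt = 0.1                                # s
for _ in range(int(3600 / dt)):         # one hour of simulation
    r1 = j_no2 * no2                    # NO2 + hv -> NO + O3
    r2 = k_no_o3 * no * o3              # NO + O3 -> NO2
    no, no2, o3 = no + dt * (r1 - r2), no2 + dt * (r2 - r1), o3 + dt * (r1 - r2)

print(no, no2, o3)
print("Leighton ratio:", no * o3 / no2, "vs j/k:", j_no2 / k_no_o3)
```

At steady state the Leighton ratio [NO][O3]/[NO2] approaches j(NO2)/k, and total NOx is conserved, which makes a convenient sanity check for any integrator choice.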

  6. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  7. A general science-based framework for dynamical spatio-temporal models

    USGS Publications Warehouse

    Wikle, C.K.; Hooten, M.B.

    2010-01-01

    Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, coherent and sensible model parameterizations are not only helpful, they are essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic
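
    A minimal instance of a science-based linear dynamical spatio-temporal model: the propagator M comes from an explicit finite difference of the diffusion PDE u_t = D u_xx, and the process evolves as Y_t = M Y_{t-1} + eta_t. The grid, parameters, and noise level are all assumed:

```python
import numpy as np

rng = np.random.default_rng(4)

# Propagator from an explicit finite difference of the 1-D diffusion PDE.
n, D, dx, dt = 50, 1.0, 1.0, 0.2
r = D * dt / dx ** 2                     # r <= 0.5 keeps the scheme stable
M = (1 - 2 * r) * np.eye(n) + r * np.eye(n, k=1) + r * np.eye(n, k=-1)

Y = np.zeros(n)
Y[n // 2] = 10.0                         # point release in the domain centre
for _ in range(100):
    Y = M @ Y + rng.normal(0.0, 0.01, n) # dynamics plus small process noise

print(Y.max(), Y.sum())
```

Rather than estimating all n-squared entries of M, the PDE reduces the dynamics to a single physical parameter D, which is the dimension-reduction argument made above.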

  8. Combining the Strengths of Physically Based Models with Statistical Modelling Tools Using a Hierarchical Mixture of Experts Framework

    NASA Astrophysics Data System (ADS)

    Marshall, L. A.; Sharma, A.; Nott, D.

    2005-12-01

    Rigidity in a modelling framework has been known to result in considerable bias in cases where the system behaviour is closely linked to the catchment antecedent conditions. An alternative to accommodate such variations in the system makeup is to enable the model to be flexible enough to evolve as antecedent conditions change. We present a framework that incorporates such flexibility by expressing the model through the combination of a number of different model structures. Each structure is adopted at a given time with a probability that depends on the current hydrologic state of the catchment. This framework is known as a Hierarchical Mixture of Experts (HME). When applied in a hydrological context, the HME approach has two major functions. It can act as a powerful predictive tool where simulation is extended beyond the calibration period. It also offers a basis for model development and building based on interpretation of the final model architecture in calibration. The probabilistic nature of HME means that it is ideally specified using Bayesian inference. The Bayesian approach also formalises the incorporation of uncertainty in the model specification. The interpretability of the overall HME framework is largely influenced by the individual model structures. One model which can be applied in the HME context is the popular Topmodel. Topmodel is a modelling tool that allows the simulation of distributed catchment response to rainfall. Many different versions of the basic model structure exist as the underlying concepts are challenged by different catchment studies. One modification often made is to the description of the baseflow recession. This study will investigate the predictive capability of Topmodel when the model is specified using both a Bayesian and HME approach. The specification of the distribution of model errors is investigated by definition of several different probability distributions. 
The HME approach is applied in a framework that compares two
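As an illustrative sketch (not drawn from the paper), a mixture-of-experts prediction combines candidate model structures with weights given by a gating function of the catchment state; all function forms and parameter values below are hypothetical:

```python
import math

def expert_wet(rain):
    """Hypothetical runoff response favoured under wet antecedent conditions."""
    return 0.8 * rain

def expert_dry(rain):
    """Hypothetical runoff response favoured under dry antecedent conditions."""
    return 0.2 * rain

def gate(soil_moisture, midpoint=0.5, steepness=10.0):
    """Probability of adopting the 'wet' expert, given the current hydrologic state."""
    return 1.0 / (1.0 + math.exp(-steepness * (soil_moisture - midpoint)))

def hme_predict(rain, soil_moisture):
    """Mixture prediction: experts weighted by state-dependent gate probabilities."""
    p_wet = gate(soil_moisture)
    return p_wet * expert_wet(rain) + (1.0 - p_wet) * expert_dry(rain)
```

Here the gate is a logistic function of soil moisture, so the "wet" expert dominates under wet antecedent conditions; in a full HME the gate and expert parameters would be inferred jointly, e.g. by the Bayesian methods the abstract describes.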

  9. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key to improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) are described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed, and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics, and scale decomposition techniques that provide novel ways of performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS, including data assimilation, optimization, and uncertainty estimation, is supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of the LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
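The traditional accuracy-based measures mentioned above reduce to a few lines of arithmetic; the following is a generic illustration of such metrics, not LVT's actual implementation:

```python
import math

def evaluation_metrics(model, obs):
    """Bias, RMSE, and Nash-Sutcliffe efficiency of a model series against observations."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mean_o = sum(obs) / n
    # Nash-Sutcliffe efficiency: 1 is a perfect fit; <= 0 means no better than the obs mean
    nse = 1.0 - (sum((m - o) ** 2 for m, o in zip(model, obs))
                 / sum((o - mean_o) ** 2 for o in obs))
    return {"bias": bias, "rmse": rmse, "nse": nse}
```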

  10. A general ecophysiological framework for modelling the impact of pests and pathogens on forest ecosystems.

    PubMed Central

    Dietze, Michael C; Matthes, Jaclyn Hatala

    2014-01-01

    Forest insects and pathogens (FIPs) have enormous impacts on community dynamics, carbon storage and ecosystem services; however, ecosystem modelling of FIPs is limited due to their variability in severity and extent. We present a general framework for modelling FIP disturbances through their impacts on tree ecophysiology. Five pathways are identified as the basis for functional groupings: increases in leaf, stem and root turnover, and reductions in phloem and xylem transport. A simple ecophysiological model was used to explore the sensitivity of forest growth, mortality and ecosystem fluxes to varying outbreak severity. Across all pathways, low infection was associated with growth reduction but limited mortality. Moderate infection led to individual tree mortality, whereas high levels led to stand-level die-offs delayed over multiple years. Delayed mortality is consistent with observations and critical for capturing biophysical, biogeochemical and successional responses. This framework enables novel predictions under present and future global change scenarios. PMID:25168168

  11. Personalized in vitro cancer models to predict therapeutic response: Challenges and a framework for improvement.

    PubMed

    Morgan, Molly M; Johnson, Brian P; Livingston, Megan K; Schuler, Linda A; Alarid, Elaine T; Sung, Kyung E; Beebe, David J

    2016-09-01

    Personalized cancer therapy focuses on characterizing the relevant phenotypes of the patient, as well as the patient's tumor, to predict the most effective cancer therapy. Historically, these methods have not proven predictive of therapeutic response. Emerging culture platforms are designed to better recapitulate the in vivo environment; thus, there is renewed interest in integrating patient samples into in vitro cancer models to assess therapeutic response. Successful examples of translating in vitro response to clinical relevance are limited due to issues with patient sample acquisition, variability and culture. We will review traditional and emerging in vitro models for personalized medicine, focusing on the technologies, microenvironmental components, and readouts utilized. We will then offer our perspective on how to apply a framework derived from toxicology and ecology towards designing improved personalized in vitro models of cancer. The framework serves as a tool for identifying optimal readouts and culture conditions, thus maximizing the information gained from each patient sample.

  12. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    of plant transpiration by root-zone produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root zone simulation models that do not resolve small-scale processes.

  13. An interactive framework for developing simulation models of hospital accident and emergency services.

    PubMed

    Codrington-Virtue, Anthony; Whittlestone, Paul; Kelly, John; Chaussalet, Thierry

    2005-01-01

    Discrete-event simulation can be a valuable tool in modelling health care systems. This paper describes an interactive framework to model and simulate a hospital accident and emergency department. An interactive spreadsheet (Excel) facilitated the user-friendly input of data such as patient pathways, arrival times, service times and resources into the discrete-event simulation package (SIMUL8). The framework was further enhanced by configuring SIMUL8 to visually show patient flow and activity on a schematic plan of an A&E. The patient flow and activity information included patient icons flowing along A&E corridors and pathways, processes undertaken in A&E work areas, and queue activity. One major benefit of visually showing patient flow and activity was that modellers and decision makers could gain a dynamic insight into the performance of the overall system and watch changes unfold over the model run cycle. Another key benefit of the interactive framework was the ability to quickly and easily change model parameters to trial, test and compare different scenarios.
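The event logic behind such a model can be sketched without any simulation package. The following minimal single-server queue (exponential arrivals and service; all parameter values are hypothetical, and a real A&E model would have many servers and pathways) illustrates the discrete-event idea:

```python
import random

def simulate_ae(n_patients=1000, mean_interarrival=6.0, mean_service=5.0, seed=1):
    """Minimal discrete-event sketch of a single-cubicle A&E (FIFO queue).

    Returns the mean patient waiting time (same units as the inputs, e.g. minutes).
    """
    random.seed(seed)
    # Generate arrival times from exponential interarrival gaps
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    server_free_at = 0.0
    waits = []
    for arr in arrivals:
        start = max(arr, server_free_at)   # service starts when the cubicle is free
        waits.append(start - arr)
        server_free_at = start + random.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)
```

With a fixed seed the run is reproducible, so scenarios (e.g. faster service) can be compared directly, which is the "trial, test and compare" use the abstract describes.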

  14. The Importance of Communicating Uncertainty to the 3D Geological Framework Model of Alberta

    NASA Astrophysics Data System (ADS)

    MacCormack, Kelsey

    2015-04-01

    The Alberta Geological Survey (AGS) has been tasked with developing a 3-dimensional (3D) geological framework for Alberta (660,000 km²). Our goal is to develop 'The Framework' as a sophisticated platform capable of integrating a variety of data types from multiple sources, enabling the development of multi-scale, interdisciplinary models with built-in feedback mechanisms that allow the individual components of the model to adapt and evolve over time as our knowledge and understanding of the subsurface increase. The geoscience information within these models is often taken at face value, with the attribute accuracy assumed to be equivalent to the numerical precision recorded by the computer, which can lead to overconfidence in the model results. We need to make sure that decision makers understand that models are simply versions of reality, and all contain a certain amount of error and uncertainty. More importantly, it is necessary to convey that error and uncertainty are not bad, and should be quantified and understood rather than ignored. This presentation will focus on how the AGS is quantifying and communicating uncertainty within the Geological Framework to decision makers and the general public, as well as utilizing uncertainty results to strategically prioritize future work.

  15. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    PubMed

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  17. A Biophysical Modeling Framework for Assessing the Environmental Impact of Biofuel Production

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Izaurralde, C.; Manowitz, D.; West, T. O.; Post, W. M.; Thomson, A. M.; Nichols, J.; Bandaru, V.; Williams, J. R.

    2009-12-01

    Long-term sustainability of a biofuel economy necessitates environmentally friendly biofuel production systems. We describe a biophysical modeling framework developed to understand and quantify the environmental value and impact (e.g. water balance, nutrient balance, carbon balance, and soil quality) of different biomass cropping systems. This modeling framework consists of three major components: 1) a Geographic Information System (GIS) based data processing system, 2) a spatially explicit biophysical modeling approach, and 3) a user-friendly information distribution system. First, we developed a GIS to manage the large amount of geospatial data (e.g. climate, land use, soil, and hydrography) and extract input information for the biophysical model. Second, the Environmental Policy Integrated Climate (EPIC) biophysical model is used to predict the impact of various cropping systems and management intensities on productivity, water balance, and biogeochemical variables. Finally, a geo-database is developed to distribute the results of ecosystem service variables (e.g. net primary productivity, soil carbon balance, soil erosion, nitrogen and phosphorus losses, and N2O fluxes) simulated by EPIC for each spatial modeling unit online using PostgreSQL. We applied this framework in a Regional Intensive Management Area (RIMA) of 9 counties in Michigan. A total of 4,833 spatial units with relatively homogeneous biophysical properties were derived using SSURGO, Crop Data Layer, County, and 10-digit watershed boundaries. For each unit, EPIC was executed from 1980 to 2003 under 54 cropping scenarios (e.g. corn, switchgrass, and hybrid poplar). The simulation results were compared with historical crop yields from USDA NASS. Spatial mapping of the results shows high variability among different cropping scenarios in terms of the simulated ecosystem services variables.
Overall, the framework developed in this study enables the incorporation of environmental factors into economic and

  18. Modelling framework for dynamic interaction between multiple pedestrians and vertical vibrations of footbridges

    NASA Astrophysics Data System (ADS)

    Venuti, Fiammetta; Racic, Vitomir; Corbetta, Alessandro

    2016-09-01

    After 15 years of active research on the interaction between moving people and civil engineering structures, there is still a lack of reliable models and adequate design guidelines pertinent to vibration serviceability of footbridges due to multiple pedestrians. There are three key issues that a new generation of models should urgently address: pedestrian "intelligent" interaction with the surrounding people and environment, the effect of human bodies on the dynamic properties of the unoccupied structure, and inter-subject and intra-subject variability of pedestrian walking loads. This paper presents a modelling framework for human-structure interaction in the vertical direction which addresses all three issues. The framework comprises two main models: (1) a microscopic model of multiple pedestrian traffic that simulates the time-varying position and velocity of each individual pedestrian on the footbridge deck, and (2) a coupled dynamic model of a footbridge and multiple walking pedestrians. The footbridge is modelled as an SDOF system having the dynamic properties of the unoccupied structure. Each walking pedestrian in a group or crowd is modelled as an SDOF system with an adjacent stochastic vertical force that moves along the footbridge following the trajectory and the gait pattern simulated by the microscopic model of pedestrian traffic. Performance of the suggested modelling framework is illustrated by a series of simulated vibration responses of a virtual footbridge due to light, medium and dense pedestrian traffic. Moreover, the Weibull distribution is shown to fit well the probability density function of the local peaks in the acceleration response. Considering the inherent randomness of the crowd, this makes it possible to determine the probability of exceeding any given acceleration value of the occupied bridge.
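As a rough illustration of the structural side of such a framework (not the authors' model), an SDOF bridge mode driven by a harmonic pedestrian force can be time-stepped directly; the modal mass, damping ratio, and force amplitude below are hypothetical but order-of-magnitude plausible:

```python
import math

def sdof_peak_acceleration(m=2.0e5, f_n=2.0, zeta=0.005, F0=280.0, f_walk=2.0,
                           duration=60.0, dt=0.001):
    """Peak acceleration (m/s^2) of an SDOF footbridge mode under a harmonic
    pedestrian force F0*sin(2*pi*f_walk*t), via semi-implicit Euler stepping.

    m: modal mass (kg), f_n: natural frequency (Hz), zeta: damping ratio.
    """
    k = m * (2.0 * math.pi * f_n) ** 2        # modal stiffness
    c = 2.0 * zeta * math.sqrt(k * m)         # modal damping coefficient
    x = v = 0.0
    peak = 0.0
    t = 0.0
    while t < duration:
        F = F0 * math.sin(2.0 * math.pi * f_walk * t)
        a = (F - c * v - k * x) / m
        v += a * dt                            # semi-implicit Euler: update v first
        x += v * dt
        peak = max(peak, abs(a))
        t += dt
    return peak
```

At resonance (f_walk = f_n) the steady-state acceleration amplitude approaches F0 / (2·zeta·m), which for these numbers is about 0.14 m/s²; away from resonance the response is orders of magnitude smaller, which is why crowd pacing rates near the natural frequency dominate serviceability checks.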

  19. A model framework for identifying genes that guide the evolution of heterochrony.

    PubMed

    Sun, Lidan; Ye, Meixia; Hao, Han; Wang, Ningtao; Wang, Yaqun; Cheng, Tangren; Zhang, Qixiang; Wu, Rongling

    2014-08-01

    Heterochrony, the phylogenic change in the time of developmental events or rate of development, has been thought to play an important role in producing phenotypic novelty during evolution. Increasing evidence suggests that specific genes are implicated in heterochrony, guiding the process of developmental divergence, but no quantitative models have been implemented to map such heterochrony genes. Here, we present a computational framework for genetic mapping by which to characterize and locate quantitative trait loci (QTLs) that govern heterochrony described by four parameters: the timing of the inflection point, the timing of maximum acceleration of growth, the timing of maximum deceleration of growth, and the length of linear growth. The framework was developed from functional mapping, a dynamic model derived to map QTLs for the overall process and pattern of development. By integrating an optimality algorithm, the framework allows the so-called heterochrony QTLs (hQTLs) to be tested and quantified. Specific pipelines are given for testing how hQTLs control the onset and offset of developmental events, the rate of development, and the duration of a particular developmental stage. Computer simulation was performed to examine the statistical properties of the model and demonstrate its utility in characterizing the effect of hQTLs on population diversification due to heterochrony. By analyzing genetic mapping data in rice, the framework identified an hQTL that controls the timing of maximum growth rate and the duration of the linear growth stage in plant height growth. The framework provides a tool to study how genetic variation translates into phenotypic innovation through heterochrony, leading a lineage to evolve. PMID:24817546
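For a logistic growth curve g(t) = A / (1 + exp(-r(t - t0))), the four heterochrony parameters named above have closed forms: the inflection (maximum growth rate) is at t0, and the acceleration extrema lie at t0 ∓ ln(2 + √3)/r. A small sketch (illustrative, not the authors' code, which handles general growth laws):

```python
import math

def growth_landmarks(A, r, t0):
    """Heterochrony timing parameters for logistic growth g(t) = A/(1+exp(-r(t-t0))).

    Returns (inflection time, max-acceleration time, max-deceleration time,
    length of the near-linear growth stage between the acceleration extrema).
    """
    offset = math.log(2.0 + math.sqrt(3.0)) / r  # where g'' is extremal
    t_inflection = t0                            # timing of maximum growth rate
    t_max_accel = t0 - offset                    # timing of maximum acceleration
    t_max_decel = t0 + offset                    # timing of maximum deceleration
    linear_phase = 2.0 * offset                  # duration of the linear growth stage
    return t_inflection, t_max_accel, t_max_decel, linear_phase
```

These are the quantities an hQTL analysis would compare between genotype classes: a QTL shifting r or t0 shifts the timing and duration of the growth stages.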

  20. An analytical framework to assist decision makers in the use of forest ecosystem model predictions

    USGS Publications Warehouse

    Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.

    2011-01-01

    The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada. © 2010.

  1. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy

    NASA Astrophysics Data System (ADS)

    Ramos-Méndez, J.; Perl, J.; Schümann, J.; Shin, J.; Paganetti, H.; Faddegon, B.

    2015-07-01

    The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose response models were implemented: the Lyman-Kutcher-Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson models for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV) and 52.1% (in Triangle) were found for the critical element, critical volume and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and the distal region. The DVHs, DVH point spacing, and results of the organ effect models are provided.
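The Lyman-Kutcher-Burman model named above reduces to a short computation: a generalized equivalent uniform dose (gEUD) from the DVH, mapped through a normal CDF. A hedged sketch, with hypothetical parameter values for n, m, and TD50 (a real calculation would use published organ-specific fits):

```python
import math

def lkb_ntcp(dvh, n=0.1, m=0.15, td50=70.0):
    """Lyman-Kutcher-Burman normal tissue complication probability.

    dvh: differential DVH as (dose_Gy, fractional_volume) pairs, volumes summing to 1.
    n: volume-effect parameter, m: slope parameter, td50: dose (Gy) for 50% complication.
    """
    # Generalized equivalent uniform dose (gEUD)
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    # Standard normal CDF of the Lyman t-variable, via the error function
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

By construction, a uniform dose equal to TD50 yields NTCP = 0.5, and NTCP increases monotonically with dose, which is a quick sanity check when wiring such a model into a verification framework.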

  2. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    PubMed

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
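The SAM chaining of management factors through actions to system risk can be caricatured as a law-of-total-probability sum over possible actions; the action set and probabilities below are hypothetical, purely to show the structure:

```python
def system_failure_probability(p_action_given_mgmt, p_failure_given_action):
    """SAM-style chaining (sketch): P(failure) = sum_a P(a | management) * P(failure | a).

    p_action_given_mgmt: dict of action -> probability under a given management regime.
    p_failure_given_action: dict of action -> conditional system failure probability.
    """
    return sum(p_action_given_mgmt[a] * p_failure_given_action[a]
               for a in p_action_given_mgmt)
```

Changing the management regime changes the action distribution (e.g. better training lowers the probability of skipping a check), which propagates to a lower system failure probability without redesigning the physical system, which is the point the abstract makes.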

  3. Integrating physical vulnerability models in a holistic framework-a tool for practitioners

    NASA Astrophysics Data System (ADS)

    Papathoma-Koehle, Maria; Liliana Ciurean, Roxana

    2014-05-01

    The cost of hydro-meteorological hazards is increasing globally not only due to the influence of climate change upon the intensity and frequency of various natural processes but also due to worldwide socio-economic changes that alter the spatial and temporal patterns of vulnerability to natural hazards. During the past decades, much more information has become available on the role of vulnerability assessment in decreasing risk levels. However, few attempts have been made to develop and implement a standardised procedure for assessing physical vulnerability to hydro-meteorological hazards which considers integrating different qualitative and quantitative models in a complementary manner. Moreover, to date, it is not clear to which extent the transferability of different models to different spatio-temporal contexts is feasible, and how practitioners and decision-makers can use these models in a dynamic environment. The objective of this research is to develop a physical vulnerability assessment framework that integrates different vulnerability models (vulnerability indicators, functions and matrices) and scenario analysis in order to investigate the temporal evolution of physical vulnerability of elements at risk to hydro-meteorological hazards. This study will first analyse and discuss the role of vulnerability assessment in reducing risk levels, in particular, how different methods of physical vulnerability modelling are currently applied in various stages of disaster risk management; what are their benefits and limitations; and, to which extent they can be used complementary in an integrated framework. The conceptual framework will make use of two case study areas to enable validation and comparison of results in two different socio-economic contexts. The resulting framework will contribute to the improvement of the risk assessment process and the development of risk reduction strategies.

  4. Short-term Forecasting Ground Magnetic Perturbations with the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, Daniel; Toth, Gabor; Gombosi, Tamas; Singer, Howard; Millward, George

    2016-04-01

    Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high-voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be examined to illustrate predictive capabilities. Early data products, such as a regional K index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
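The ground-perturbation products mentioned (dB/dt and a regional K-like index) amount to simple reductions of a simulated magnetometer time series. A sketch follows; the threshold table is illustrative only, since real K scales are calibrated per station:

```python
def db_dt_max(b, dt=60.0):
    """Largest |dB/dt| (nT/s) from a field series b (nT) sampled every dt seconds."""
    return max(abs(b[i + 1] - b[i]) / dt for i in range(len(b) - 1))

def k_like_index(b, thresholds=(5, 10, 20, 40, 70, 120, 200, 330, 500)):
    """Crude local K-like index: the range of the disturbance (nT) mapped onto
    quasi-logarithmic thresholds. Hypothetical thresholds, not an official scale."""
    rng = max(b) - min(b)
    k = 0
    for th in thresholds:
        if rng >= th:
            k += 1
    return k
```

In an operational setting these reductions would be applied to each virtual magnetometer station on the prediction grid, giving both a GIC-relevant dB/dt map and a familiar activity index.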

  5. Tailored motivational message generation: A model and practical framework for real-time physical activity coaching.

    PubMed

    Op den Akker, Harm; Cabrita, Miriam; Op den Akker, Rieks; Jones, Valerie M; Hermens, Hermie J

    2015-06-01

    This paper presents a comprehensive and practical framework for the automatic generation of real-time tailored messages in behavior change applications. The basic aspects of motivational messages are time, intention, content and presentation. Tailoring messages to the individual user may involve all aspects of communication. A linear modular system is presented for generating such messages. It is explained how properties of the user and context are taken into account in each of the modules of the system and how they affect the linguistic presentation of the generated messages. The model of motivational messages presented is based on an analysis of the existing literature as well as the analysis of a corpus of motivational messages used in previous studies. The model extends existing 'ontology-based' approaches to message generation for real-time coaching systems found in the literature. Practical examples are given of how simple tailoring rules can be implemented throughout the various stages of the framework. Such examples can guide further research by clarifying what it means to use, for example, user targeting to tailor a message. As a primary example we look at the issue of promoting daily physical activity. Future work lies in applying the present model and framework, defining efficient ways of evaluating individual tailoring components, and improving effectiveness through the creation of accurate and complete user and context models. PMID:25843359

  6. ADM Analysis of gravity models within the framework of bimetric variational formalism

    SciTech Connect

    Golovnev, Alexey; Karčiauskas, Mindaugas; Nyrhinen, Hannu J. E-mail: mindaugas.karciauskas@helsinki.fi

    2015-05-01

    Bimetric variational formalism was recently employed to construct novel bimetric gravity models. In these models an affine connection is generated by an additional tensor field which is independent of the physical metric. In this work we demonstrate how the ADM decomposition can be applied to study such models and provide some technical intermediate details. Using the ADM decomposition we are able to prove that a linear model is unstable, as has previously been indicated by perturbative analysis. Moreover, we show that it is very difficult, if not impossible, to construct a non-linear model which is ghost-free within the framework of bimetric variational formalism. However, we demonstrate that viable models are possible along similar lines of thought. To this end, we consider a setup in which the affine connection is a variation of the Levi-Civita one. As a proof of principle we construct a gravity model with a massless scalar field obtained this way.

  7. A Catchment-Based Land Surface Model for GCMs and the Framework for its Evaluation

    NASA Technical Reports Server (NTRS)

    Ducharne, A.; Koster, R. D.; Suarez, M. J.; Kumar, P.

    1998-01-01

    A new GCM-scale land surface modeling strategy that explicitly accounts for subgrid soil moisture variability and its effects on evaporation and runoff is now being explored. In a break from traditional modeling strategies, the continental surface is disaggregated into a mosaic of hydrological catchments, with boundaries dictated not by a regular grid but by topography. Within each catchment, the variability of soil moisture is deduced from TOPMODEL equations with a special treatment of the unsaturated zone. This paper gives an overview of this new approach and presents the general framework for its off-line evaluation over North America.
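The TOPMODEL machinery referenced here centres on the topographic index ln(a / tan β), which ranks points in a catchment by their propensity to saturate (large upslope area and gentle slope saturate first). A minimal sketch:

```python
import math

def topographic_index(upslope_area, contour_length, slope_tan):
    """TOPMODEL topographic index ln(a / tan(beta)).

    upslope_area: area draining through the point (m^2)
    contour_length: contour width the flow crosses (m), giving specific area a
    slope_tan: local slope tan(beta) (dimensionless, > 0)
    """
    a = upslope_area / contour_length  # specific upslope area (m)
    return math.log(a / slope_tan)
```

Within a catchment, the distribution of this index (rather than a full 3D simulation) is what lets TOPMODEL-type schemes represent subgrid soil moisture variability cheaply, which is the idea the catchment-based GCM strategy builds on.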

  8. Framework for analyzing ecological trait-based models in multidimensional niche spaces

    NASA Astrophysics Data System (ADS)

    Biancalani, Tommaso; DeVille, Lee; Goldenfeld, Nigel

    2015-05-01

    We develop a theoretical framework for analyzing ecological models with a multidimensional niche space. Our approach relies on the fact that ecological niches are described by sequences of symbols, which allows us to include multiple phenotypic traits. Ecological drivers, such as competitive exclusion, are modeled by introducing the Hamming distance between two sequences. We show that a suitable transform diagonalizes the community interaction matrix of these models, making it possible to predict the conditions for niche differentiation and, close to the instability onset, the asymptotically long time population distributions of niches. We exemplify our method using the Lotka-Volterra equations with an exponential competition kernel.
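The construction described (niches as symbol sequences, competition decaying with Hamming distance, Lotka-Volterra dynamics) is compact enough to sketch directly; the niche strings, kernel width, and Euler step below are illustrative choices, not the paper's parameters:

```python
import math

def hamming(s1, s2):
    """Hamming distance between two equal-length niche sequences."""
    return sum(c1 != c2 for c1, c2 in zip(s1, s2))

def competition_matrix(niches, sigma=1.0):
    """Exponential competition kernel on Hamming distance: A_ij = exp(-d(i,j)/sigma)."""
    return [[math.exp(-hamming(a, b) / sigma) for b in niches] for a in niches]

def lotka_volterra_step(n, A, r=1.0, K=1.0, dt=0.01):
    """One explicit Euler step of dn_i/dt = r n_i (1 - sum_j A_ij n_j / K)."""
    return [n[i] + dt * r * n[i] * (1.0 - sum(A[i][j] * n[j] for j in range(len(n))) / K)
            for i in range(len(n))]
```

Iterating the step from a small initial population shows the competitive dynamics; niche differentiation questions then reduce to the spectrum of the matrix A, which is what the transform in the paper diagonalizes.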

  9. Modeling somite scaling in small embryos in the framework of Turing patterns.

    PubMed

    Signon, Laurence; Nowakowski, Bogdan; Lemarchand, Annie

    2016-04-01

    The adaptation of prevertebra size to embryo size is investigated in the framework of a reaction-diffusion model involving a Turing pattern. The reaction scheme and Fick's first law of diffusion are modified in order to take into account the departure from dilute conditions induced by confinement in smaller embryos. In agreement with the experimental observations of scaling in somitogenesis, our model predicts the formation of smaller prevertebrae or somites in smaller embryos. These results suggest that models based on Turing patterns cannot be automatically disregarded by invoking the question of maintaining proportions in embryonic development. Our approach highlights the nontrivial role that the solvent can play in biology.
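Whether a two-species reaction-diffusion system can form a Turing pattern is decided by its dispersion relation: the uniform state must be stable to spatially uniform perturbations (q = 0) yet unstable at some finite wavenumber. A generic sketch (the Jacobian and diffusivities below are hypothetical, not the paper's somitogenesis model):

```python
import math

def growth_rate(J, Du, Dv, q):
    """Max real part of the eigenvalues of M = J - q^2 * diag(Du, Dv),
    the linearization of a two-species reaction-diffusion system at wavenumber q."""
    a = J[0][0] - q * q * Du
    d = J[1][1] - q * q * Dv
    b, c = J[0][1], J[1][0]
    tr, det = a + d, a * d - b * c
    disc = tr * tr / 4.0 - det
    if disc >= 0.0:
        return tr / 2.0 + math.sqrt(disc)   # real eigenvalues
    return tr / 2.0                         # complex pair: real part is tr/2

def turing_unstable(J, Du, Dv, qs):
    """Turing instability: stable at q = 0, unstable at some finite wavenumber."""
    return (growth_rate(J, Du, Dv, 0.0) < 0.0
            and max(growth_rate(J, Du, Dv, q) for q in qs) > 0.0)
```

The classic requirement that the inhibitor diffuse much faster than the activator shows up directly: with equal diffusivities the same kinetics give no instability.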

  10. Modeling somite scaling in small embryos in the framework of Turing patterns

    NASA Astrophysics Data System (ADS)

    Signon, Laurence; Nowakowski, Bogdan; Lemarchand, Annie

    2016-04-01

    The adaptation of prevertebra size to embryo size is investigated in the framework of a reaction-diffusion model involving a Turing pattern. The reaction scheme and Fick's first law of diffusion are modified in order to take into account the departure from dilute conditions induced by confinement in smaller embryos. In agreement with the experimental observations of scaling in somitogenesis, our model predicts the formation of smaller prevertebrae or somites in smaller embryos. These results suggest that models based on Turing patterns cannot be automatically disregarded by invoking the question of maintaining proportions in embryonic development. Our approach highlights the nontrivial role that the solvent can play in biology.

  11. Framework for analyzing ecological trait-based models in multidimensional niche spaces.

    PubMed

    Biancalani, Tommaso; DeVille, Lee; Goldenfeld, Nigel

    2015-05-01

    We develop a theoretical framework for analyzing ecological models with a multidimensional niche space. Our approach relies on the fact that ecological niches are described by sequences of symbols, which allows us to include multiple phenotypic traits. Ecological drivers, such as competitive exclusion, are modeled by introducing the Hamming distance between two sequences. We show that a suitable transform diagonalizes the community interaction matrix of these models, making it possible to predict the conditions for niche differentiation and, close to the instability onset, the asymptotic long-time population distributions of niches. We exemplify our method using the Lotka-Volterra equations with an exponential competition kernel. PMID:26066119

  12. Probabilistic Boolean Network Modelling and Analysis Framework for mRNA Translation.

    PubMed

    Zhao, Yun-Bo; Krishnan, J

    2016-01-01

    mRNA translation is a complex process involving the progression of ribosomes on the mRNA, resulting in the synthesis of proteins, and is subject to multiple layers of regulation. This process has been modelled using different formalisms, both stochastic and deterministic. Recently, we introduced a Probabilistic Boolean modelling framework for mRNA translation, which possesses the advantage of tools for the numerically exact computation of the steady-state probability distribution, without requiring simulation. Here, we extend this model to incorporate both random sequential and parallel update rules, and demonstrate its effectiveness in various settings, including its flexibility in accommodating additional static and dynamic biological complexities and its role in parameter sensitivity analysis. In these applications, the results from the model analysis match those of TASEP model simulations. Importantly, the proposed modelling framework maintains the stochastic aspects of mRNA translation and provides a way to exactly calculate probability distributions, providing additional tools of analysis in this context. Finally, the proposed modelling methodology provides an alternative approach to the understanding of the mRNA translation process, by bridging the gap between existing approaches, providing new analysis tools, and contributing to a more robust platform for modelling and understanding translation.
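
The exact steady-state computation highlighted above can be illustrated on a toy probabilistic Boolean network: enumerate the state transition probabilities, then iterate the induced Markov chain's distribution to stationarity, with no trajectory simulation. The two-gene network and selection probabilities below are hypothetical, not the paper's translation model:

```python
from itertools import product

# Hypothetical two-gene PBN (synchronous update): gene 1 uses OR of both
# genes with probability 0.7 and NOT gene-2 with probability 0.3;
# gene 2 deterministically uses XOR.
def next_dist(x1, x2):
    """Map a state to a dict of next-state -> probability."""
    dist = {}
    for use_or, p in ((True, 0.7), (False, 0.3)):
        n1 = (x1 | x2) if use_or else (1 - x2)
        n2 = x1 ^ x2
        dist[(n1, n2)] = dist.get((n1, n2), 0.0) + p
    return dist

states = list(product((0, 1), repeat=2))
P = {s: next_dist(*s) for s in states}

# Numerically exact stationary distribution: power iteration on the
# induced 4-state Markov chain instead of simulating sample paths.
pi = {s: 0.25 for s in states}
for _ in range(200):
    new = {s: 0.0 for s in states}
    for s, mass in pi.items():
        for t, p in P[s].items():
            new[t] += mass * p
    pi = new
```

For this toy network the mass concentrates uniformly on the recurrent states while the transient state (0, 1) empties, exactly the kind of distribution-level answer the framework computes without simulation.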

  13. A New Open Data Open Modeling Framework for the Geosciences Community (Invited)

    NASA Astrophysics Data System (ADS)

    Liang, X.; Salas, D.; Navarro, M.; Liang, Y.; Teng, W. L.; Hooper, R. P.; Restrepo, P. J.; Bales, J. D.

    2013-12-01

    A prototype Open Hydrospheric Modeling Framework (OHMF), also called Open Data Open Modeling framework, has been developed to address two key modeling challenges faced by the broad research community: (1) accessing external data from diverse sources and (2) execution, coupling, and evaluation/intercomparison of various and complex models. The former is achieved via the Open Data architecture, while the latter is achieved via the Open Modeling architecture. The Open Data architecture adopts a common internal data model and representation, to facilitate the integration of various external data sources into OHMF, using Data Agents that handle remote data access protocols (e.g., OPeNDAP, Web services), metadata standards, and source-specific implementations. These Data Agents hide the heterogeneity of the external data sources and provide a common interface to the OHMF system core. The Open Modeling architecture allows different models or modules to be easily integrated into OHMF. The OHMF architectural design offers a general many-to-many connectivity between individual models and external data sources, instead of one-to-one connectivity from data access to model simulation results. OHMF adopts a graphical scientific workflow, offers tools to re-scale in space and time, and provides multi-scale data fusion and assimilation functionality. Notably, the OHMF system employs a strategy that does not require re-compiling or adding interface codes for a user's model to be integrated. Thus, a corresponding model agent can be easily developed by a user. Once an agent is available for a model, it can be shared and used by others. An example will be presented to illustrate the prototype OHMF system and the automatic flow from accessing data to model simulation results in a user-friendly workflow-controlled environment.
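
The Data Agent idea, a source-specific adapter hidden behind one common interface and internal data representation, can be sketched as follows (class and method names are illustrative, not OHMF's actual API):

```python
from abc import ABC, abstractmethod

class DataAgent(ABC):
    """Hides a source-specific access protocol behind one common
    interface, returning data in a shared internal representation
    (here, a dict of variable name -> list of values)."""
    @abstractmethod
    def fetch(self, variable, start, end):
        raise NotImplementedError

class OpendapAgent(DataAgent):
    def fetch(self, variable, start, end):
        # A real agent would issue an OPeNDAP request here; this stub
        # returns canned values in the common representation.
        return {variable: [1.0, 2.0, 3.0]}

class WebServiceAgent(DataAgent):
    def fetch(self, variable, start, end):
        # A real agent would call a web service; same output contract.
        return {variable: [4.0, 5.0]}

def run_model(agent):
    """The framework core sees only the DataAgent interface, giving the
    many-to-many connectivity between models and data sources."""
    series = agent.fetch("precip", "2013-01-01", "2013-01-31")["precip"]
    return sum(series) / len(series)
```

Swapping agents changes the data source without touching the model-facing code, which is the point of the common internal data model.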

  14. A framework for integrated, multi-scale model construction and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Kok, Jean-Luc; de Jong, Kor; Karssenberg, Derek

    2015-04-01

    The component-based software development practice promotes the construction of self-contained modules with defined input and output interfaces. In environmental modelling, we can adopt this development practice to construct more generic, reusable component models. Here, modellers need to implement a state transition function to describe a specific environmental process, and to specify the required external inputs and parameters to simulate the change of real-world processes over time. Depending on the usage of a component model, such as standalone execution or as part of an integrated model, the source of the external input needs to be specified. The required external inputs can thereby be obtained from disk by a file operation in case of a standalone execution; or inputs can be obtained from other component models, when the component model is used in an integrated model. Using different notations to specify input requirements, however, requires a modification of the state transition function per application case of a component model and therefore would reduce its generic nature. We propose the function object notation as a means to specify input sources of a component model and as a uniform syntax to express input requirements. At component initialisation, the function objects can be parametrised with different external sources. In addition to a uniform syntax, the function object notation allows modellers to specify a request-reply execution flow of the coupled models. We extended the request-reply execution approach to allow for Monte Carlo simulations, and implemented a software framework prototype in Python using the PCRaster module (http://www.pcraster.eu) for field-based modelling. 
We demonstrate the usage of the framework by building an exemplary integrated model by coupling components simulating land use change, hydrology and eucalyptus tree growth at different temporal discretisations to obtain the probability for bioenergy plantations in a hypothetical
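
The function object notation can be sketched as follows: the state transition function always calls the same input callable, and only the object bound at initialisation decides whether the value comes from "disk" or from another component. All names and numbers here are hypothetical, not the framework's API:

```python
class Hydrology:
    """Component model whose external input is a function object bound at
    initialisation, so the state transition function is identical whether
    the component runs standalone or coupled."""

    def __init__(self, precipitation):
        self.precipitation = precipitation  # callable: timestep -> value
        self.storage = 0.0

    def transition(self, t):
        self.storage += self.precipitation(t)  # request-reply: pull input
        runoff = 0.1 * self.storage
        self.storage -= runoff
        return runoff

# Standalone execution: the input source reads "disk" data
# (a canned table standing in for a file operation).
table = {0: 10.0, 1: 0.0}
standalone = Hydrology(lambda t: table[t])

# Integrated execution: the same component pulls its input from another
# component model instead, with no change to transition().
class LandUse:
    def precipitation(self, t):
        return 5.0

coupled = Hydrology(LandUse().precipitation)
```

Only the constructor argument differs between the two application cases, which is what keeps the component generic.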

  15. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota

    PubMed Central

    Kail, Jochem; Guse, Björn; Radinger, Johannes; Schröder, Maria; Kiesel, Jens; Kleinhans, Maarten; Schuurman, Filip; Fohrer, Nicola; Hering, Daniel; Wolter, Christian

    2015-01-01

    River biota are affected by pressures acting from the global scale down to the reach scale, but most approaches for predicting river biota focus on reach- or segment-scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and, finally, to compare habitat suitability with the ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open-source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates.
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact research as well

  16. Developing a clinical utility framework to evaluate prediction models in radiogenomics

    NASA Astrophysics Data System (ADS)

    Wu, Yirong; Liu, Jie; Munoz del Rio, Alejandro; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-03-01

    Combining imaging and genetic information to predict disease presence and behavior is being codified into an emerging discipline called "radiogenomics." Optimal evaluation methodologies for radiogenomics techniques have not been established. We aim to develop a clinical decision framework based on utility analysis to assess prediction models for breast cancer. Our data come from a retrospective case-control study, collecting Gail model risk factors, genetic variants (single nucleotide polymorphisms, SNPs), and mammographic features in the Breast Imaging Reporting and Data System (BI-RADS) lexicon. We first constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail+SNP, and (3) Gail+SNP+BI-RADS. Then, we generated ROC curves for the three models. After assigning utility values to each category of findings (true negative, false positive, false negative, and true positive), we identified the operating points on the ROC curves that achieve the maximum expected utility (MEU) of breast cancer diagnosis. We used McNemar's test to compare the predictive performance of the three models. We found that SNPs and BI-RADS features augmented the baseline Gail model in terms of the area under the ROC curve (AUC) and MEU. SNPs improved the sensitivity of the Gail model (0.276 vs. 0.147) and reduced its specificity (0.855 vs. 0.912). When mammographic features were added as well, sensitivity increased to 0.457 and specificity to 0.872. SNPs and mammographic features played a significant role in breast cancer risk estimation (p-value < 0.001). Our decision framework comprising utility analysis and McNemar's test provides a novel way to evaluate prediction models in the realm of radiogenomics.
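
The operating-point selection can be made concrete: for each ROC point, the expected utility combines the four outcome utilities weighted by prevalence, and the maximum-expected-utility point is chosen. The sketch below reuses the sensitivities and specificities quoted above, but the prevalence and utility values are illustrative, not the study's:

```python
def expected_utility(point, prevalence, u_tp, u_fp, u_tn, u_fn):
    """Expected utility of operating a classifier at one ROC point."""
    tpr, fpr = point
    p = prevalence
    return (p * (tpr * u_tp + (1.0 - tpr) * u_fn)
            + (1.0 - p) * (fpr * u_fp + (1.0 - fpr) * u_tn))

# ROC points as (sensitivity, 1 - specificity): the trivial corners plus
# the three models' quoted operating points.
roc = [(0.0, 0.0), (0.147, 0.088), (0.276, 0.145), (0.457, 0.128), (1.0, 1.0)]

# Illustrative prevalence and utilities: a missed cancer (u_fn) is far
# costlier than a false alarm (u_fp).
best = max(roc, key=lambda pt: expected_utility(
    pt, prevalence=0.01, u_tp=0.8, u_fp=-0.05, u_tn=0.0, u_fn=-1.0))
```

With these (made-up) utilities the Gail+SNP+BI-RADS operating point maximises expected utility, matching the paper's qualitative finding that the added features help.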

  17. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    DOE PAGES

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.
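
As a taste of why geometric estimation from volume fractions is delicate, here is the simplest possible normal estimate: a central-difference (Youngs-type) gradient of the volume fraction on a 3x3 block. Production volume-tracking codes refine this with wider stencils, PLIC reconstruction and height functions; the layout below is illustrative only:

```python
def interface_normal(f):
    """Unit interface normal in the centre cell of a 3x3 block of volume
    fractions f[row][col] (row 0 = bottom), via central differences: a
    bare-bones Youngs-type gradient estimate."""
    gx = (f[1][2] - f[1][0]) / 2.0
    gy = (f[2][1] - f[0][1]) / 2.0
    mag = (gx * gx + gy * gy) ** 0.5
    return (-gx / mag, -gy / mag)  # points from the fluid into the void

# Horizontal interface: fluid below, void above -> normal points straight up.
f = [[1.0, 1.0, 1.0],   # bottom row, full
     [0.5, 0.5, 0.5],   # interface row
     [0.0, 0.0, 0.0]]   # top row, empty
n = interface_normal(f)
```

Even this clean configuration only works because the stencil straddles the interface symmetrically; skewed or under-resolved interfaces are where the reviewed reconstruction and curvature methods earn their keep.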

  18. Recent numerical and algorithmic advances within the volume tracking framework for modeling interfacial flows

    SciTech Connect

    François, Marianne M.

    2015-05-28

    A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values and the balanced-force algorithm for surface tension are highlighted.

  19. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition (whether functional, physical, or discipline-based) that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  20. Spatial optimization of watershed management practices for nitrogen load reduction using a modeling-optimization framework.

    PubMed

    Yang, Guoxiang; Best, Elly P H

    2015-09-15

    Best management practices (BMPs) can be used effectively to reduce nutrient loads transported from non-point sources to receiving water bodies. However, methodologies of BMP selection and placement in a cost-effective way are needed to assist watershed management planners and stakeholders. We developed a novel modeling-optimization framework that can be used to find cost-effective solutions of BMP placement to attain nutrient load reduction targets. This was accomplished by integrating a GIS-based BMP siting method, a WQM-TMDL-N modeling approach to estimate total nitrogen (TN) loading, and a multi-objective optimization algorithm. Wetland restoration and buffer strip implementation were the two BMP categories used to explore the performance of this framework, both differing greatly in complexity of spatial analysis for site identification. Minimizing TN load and BMP cost were the two objective functions for the optimization process. The performance of this framework was demonstrated in the Tippecanoe River watershed, Indiana, USA. Optimized scenario-based load reduction indicated that the wetland subset selected by the minimum scenario had the greatest N removal efficiency. Buffer strips were more effective for load removal than wetlands. The optimized solutions provided a range of trade-offs between the two objective functions for both BMPs. This framework can be expanded conveniently to a regional scale because the NHDPlus catchment serves as its spatial computational unit. The present study demonstrated the potential of this framework to find cost-effective solutions to meet a water quality target, such as a 20% TN load reduction, under different conditions. PMID:26188990
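
The trade-off curve the optimisation produces is a Pareto front: portfolios for which no other portfolio is at least as good on both objectives. A minimal extraction routine over hypothetical (cost, TN load) portfolios:

```python
def pareto_front(solutions):
    """Non-dominated portfolios when minimising both objectives
    (implementation cost, remaining TN load)."""
    return [s for s in solutions
            if not any(o != s and o[0] <= s[0] and o[1] <= s[1]
                       for o in solutions)]

# Hypothetical BMP portfolios as (cost in M$, TN load in t/yr).
portfolios = [(1.0, 90.0), (2.0, 70.0), (2.5, 75.0), (4.0, 55.0), (4.5, 55.0)]
front = pareto_front(portfolios)
```

A multi-objective algorithm such as NSGA-II searches the space of BMP placements for exactly this non-dominated set, from which planners pick a point meeting a target like a 20% TN load reduction.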

  2. A Subbasin-based framework to represent land surface processes in an Earth System Model

    SciTech Connect

    Tesfa, Teklu K.; Li, Hongyi; Leung, Lai-Yung R.; Huang, Maoyi; Ke, Yinghai; Sun, Yu; Liu, Ying

    2014-05-20

    Realistically representing spatial heterogeneity and lateral land surface processes within and between modeling units in earth system models is important because of their implications for surface energy and water exchange. The traditional approach of using regular grids as computational units in land surface models and earth system models may lead to inadequate representation of lateral movements of water, energy, and carbon fluxes, especially as grid resolution increases. Here a new subbasin-based framework is introduced in the Community Land Model (CLM), the land component of the Community Earth System Model (CESM). Local processes are represented by treating each subbasin as a grid cell on a pseudo grid matrix, with no significant modifications to the existing CLM modeling structure. Lateral routing of water within and between subbasins is simulated with the subbasin version of a recently developed, physically based routing model, the Model for Scale Adaptive River Routing (MOSART). As an illustration, this new framework is implemented in the topographically diverse region of the U.S. Pacific Northwest. The modeling units (subbasins) are delineated from a high-resolution Digital Elevation Model (DEM), while atmospheric forcing and surface parameters are remapped from the corresponding high-resolution datasets. The impacts of this representation on simulating hydrologic processes are explored by comparing it with the default (grid-based) CLM representation. In addition, the effects of DEM resolution on parameterizing topography, and the subsequent effects on runoff processes, are investigated. Limited model evaluation and comparison showed that small differences in the averaged forcing can lead to significant differences in the simulated runoff and streamflow because of nonlinear horizontal processes.
Topographic indices derived from high resolution DEM may not improve the overall water balance, but affect the partitioning between surface and subsurface runoff

  3. A hybrid model-classifier framework for managing prediction uncertainty in expensive optimisation problems

    NASA Astrophysics Data System (ADS)

    Tenne, Yoel; Izui, Kazuhiro; Nishiwaki, Shinji

    2012-07-01

    Many real-world optimisation problems rely on computationally expensive simulations to evaluate candidate solutions. Often, such problems will contain candidate solutions for which the simulation fails, for example, due to limitations of the simulation. Such candidate solutions can hinder the effectiveness of the optimisation since they may consume a large portion of the optimisation budget without providing new information to the optimiser, leading to search stagnation and a poor final result. Existing approaches to handle such designs either discard them altogether, or assign them a penalised fitness. However, this results in loss of beneficial information, or in a model with a severely deformed landscape. To address these issues, this study proposes a hybrid classifier-model framework. The role of the classifier is to predict which candidate solutions are likely to crash the simulation, and this prediction is then used to bias the search towards valid solutions. Furthermore, the proposed framework employs a trust-region approach, and several other procedures, to manage the model and classifier, and to ensure the progress of the optimisation. Performance analysis using an engineering application of airfoil shape optimisation shows the efficacy of the proposed framework, and the possibility to use the knowledge accumulated in the classifier to gain new insights into the problem being solved.
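
The core idea, predicting crashes from past evaluations and steering sampling away from them, fits in a few lines. A 1-nearest-neighbour classifier stands in for whichever classifier the framework actually uses, and the 2-D search space and evaluation history are made up:

```python
import math
import random

# Past evaluations: candidate design -> did the expensive simulation crash?
# (hypothetical 2-D search space and history)
history = [((0.1, 0.2), False), ((0.9, 0.8), True),
           ((0.2, 0.1), False), ((0.8, 0.9), True)]

def predicts_crash(x):
    """1-nearest-neighbour crash prediction from past evaluations (a
    stand-in for the framework's classifier component)."""
    nearest = min(history, key=lambda h: math.dist(h[0], x))
    return nearest[1]

def sample_valid_candidate(rng, tries=100):
    """Bias the search toward candidates predicted not to crash, rather
    than discarding failed designs or penalising their fitness."""
    for _ in range(tries):
        x = (rng.random(), rng.random())
        if not predicts_crash(x):
            return x
    return x  # give up and return the last sample

rng = random.Random(0)
candidate = sample_valid_candidate(rng)
```

Unlike a penalised fitness, the classifier keeps the surrogate model's landscape undistorted while still spending the evaluation budget on designs likely to return information.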

  4. Model-Based Network Meta-Analysis: A Framework for Evidence Synthesis of Clinical Trial Data.

    PubMed

    Mawdsley, D; Bennetts, M; Dias, S; Boucher, M; Welton, N J

    2016-08-01

    Model-based meta-analysis (MBMA) is increasingly used in drug development to inform decision-making and future trial designs, through the use of complex dose and/or time course models. Network meta-analysis (NMA) is increasingly being used by reimbursement agencies to estimate a set of coherent relative treatment effects for multiple treatments that respect the randomization within the trials. However, NMAs typically either consider different doses completely independently or lump them together, with few examples of models for dose. We propose a framework, model-based network meta-analysis (MBNMA), that combines both approaches, that respects randomization, and allows estimation and prediction for multiple agents and a range of doses, using plausible physiological dose-response models. We illustrate our approach with an example comparing the efficacies of triptans for migraine relief. This uses a binary endpoint, although we note that the model can be easily modified for other outcome types.
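
A "plausible physiological dose-response model" of the kind MBNMA plugs in is the Emax curve. The sketch below predicts a relative effect between two agents at arbitrary doses; the agents and parameters are invented for illustration and are not the triptan estimates:

```python
def emax(dose, e_max, ed50):
    """Emax dose-response: effect rises hyperbolically toward e_max,
    reaching half-maximum at dose ed50."""
    return e_max * dose / (ed50 + dose)

# Invented agent-level parameters (NOT the paper's triptan estimates).
agents = {"agent_A": (2.0, 50.0), "agent_B": (1.6, 20.0)}

def relative_effect(agent_a, dose_a, agent_b, dose_b):
    """Predicted relative treatment effect between any two agent/dose
    combinations, the kind of prediction MBNMA enables once the
    dose-response curves are estimated."""
    return emax(dose_a, *agents[agent_a]) - emax(dose_b, *agents[agent_b])
```

This is what distinguishes MBNMA from lumping or splitting doses: once the curve parameters are estimated within the network, effects at unstudied doses can be interpolated rather than treated as separate treatments.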

  5. A Framework for Quantitative Modeling of Neural Circuits Involved in Sleep-to-Wake Transition

    PubMed Central

    Sorooshyari, Siamak; Huerta, Ramón; de Lecea, Luis

    2015-01-01

    Identifying the neuronal circuits and dynamics of sleep-to-wake transition is essential to understanding brain regulation of behavioral states, including sleep–wake cycles, arousal, and hyperarousal. Recent work by different laboratories has used optogenetics to determine the role of individual neuromodulators in state transitions. The optogenetically driven data do not yet provide a multi-dimensional schematic of the mechanisms underlying changes in vigilance states. This work presents a modeling framework to interpret, assist, and drive research on the sleep-regulatory network. We identify feedback, redundancy, and gating hierarchy as three fundamental aspects of this model. The presented model is expected to expand as additional data on the contribution of each transmitter to a vigilance state becomes available. Incorporation of conductance-based models of neuronal ensembles into this model and existing models of cortical excitability will provide more comprehensive insight into sleep dynamics as well as sleep and arousal-related disorders. PMID:25767461

  6. Pursuing realistic hydrologic model under SUPERFLEX framework in a semi-humid catchment in China

    NASA Astrophysics Data System (ADS)

    Wei, Lingna; Savenije, Hubert H. G.; Gao, Hongkai; Chen, Xi

    2016-04-01

    Model realism is pursued perpetually by hydrologists for flood and drought prediction, integrated water resources management, and decision support for water security. "Physically based" distributed hydrologic models are being developed rapidly, but they also face non-negligible challenges, for instance long computation times and parameter uncertainty. This study tested, step by step, four conceptual hydrologic models under the SUPERFLEX framework in a small semi-humid catchment in the southern Huai River basin of China. The original lumped FLEXL hypothesizes a model structure of four reservoirs representing canopy interception, the unsaturated zone, subsurface flow with fast and slow components, and base flow storage. To account for spatially uneven rainfall, the second model (FLEXD) applies the same parameter set to separate units controlled by different rain gauges. To reveal the effect of topography, the terrain descriptor height above the nearest drainage (HAND), combined with slope, is applied to classify the experimental catchment into two landscapes. The third model (FLEXTOPO) then builds a different model block for each landscape, reflecting its perceived dominant hydrologic process. The fourth, FLEXTOPOD, integrates the parallel framework of FLEXTOPO in the four gauge-controlled units to capture the spatial variability of both rainfall and topography. Through pairwise comparison, our results suggest that: (1) the semi-distributed models (FLEXD and FLEXTOPOD), which take the spatial heterogeneity of precipitation into account, improved model performance with a parsimonious parameter set; and (2) a flexible model architecture that reflects the perceived dominant hydrologic processes can accommodate the local terrain of each landscape. Hence, the modeling choices coincide with catchment behaviour and come closer to "reality". The presented methodology regards the hydrologic model as a tool to test our
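
The HAND-plus-slope landscape classification that drives FLEXTOPO's model blocks can be sketched in a few lines; the thresholds below are illustrative, not the study's calibrated values:

```python
def classify_landscape(hand, slope, hand_max=5.0, slope_max=0.05):
    """Two-landscape split in the spirit of FLEXTOPO: low-lying, flat
    cells near the drainage network versus hillslope."""
    return "wetland" if hand < hand_max and slope < slope_max else "hillslope"

# Each landscape gets its own model block for its dominant process
# (hypothetical block labels).
blocks = {"wetland": "saturation-excess overland flow",
          "hillslope": "subsurface storm flow"}

# Cells as (HAND in m, slope): classify a toy catchment.
cells = [(1.2, 0.01), (42.0, 0.20), (3.0, 0.30)]
labels = [classify_landscape(h, s) for h, s in cells]
```

Note the third cell: close to the drainage but steep, so it is still assigned the hillslope block, which is why HAND alone is not enough and slope enters the classification.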

  7. Integrating Sediment Connectivity into Water Resources Management Through a Graph Theoretic, Stochastic Modeling Framework.

    NASA Astrophysics Data System (ADS)

    Schmitt, R. J. P.; Castelletti, A.; Bizzi, S.

    2014-12-01

    Understanding sediment transport processes at the river basin scale, their temporal spectra, and their spatial patterns is key to identifying and minimizing the morphologic risks associated with channel adjustment processes. This work contributes a stochastic framework for modeling bed-load connectivity based on recent advances in the field (e.g., Bizzi & Lerner, 2013; Czuba & Foufoula-Georgiou, 2014). It presents river managers with novel indicators of reach-scale vulnerability to channel adjustment in large river networks with sparse hydrologic and sediment observations. The framework comprises four steps. First, based on a distributed hydrological model and remotely sensed information, the framework identifies a representative grain size class for each reach. Second, sediment residence time distributions are calculated for each reach in a Monte Carlo approach, applying standard sediment transport equations driven by local hydraulic conditions. Third, a network analysis defines the up- and downstream connectivity for various travel times, resulting in characteristic up/downstream connectivity signatures for each reach; channel vulnerability indicators then quantify the imbalance between up- and downstream connectivity for each travel time domain, representing the process-dependent latency of morphologic response. Last, based on the stochastic core of the model, a sensitivity analysis identifies drivers of change and major sources of uncertainty, in order to target key detrimental processes and to guide effective gathering of additional data. The application, limitations, and integration into a decision-analytic framework are demonstrated for a major part of the Red River basin in northern Vietnam (179,000 km2). Here, a plethora of anthropic alterations, ranging from large reservoir construction to land-use changes, results in major downstream deterioration and calls for deriving concerted sediment management strategies to mitigate current and limit future morphologic alterations.
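
The Monte Carlo residence-time step can be caricatured on a toy network: per realisation, each reach contributes a travel time drawn from an uncertain transport velocity, and the resulting distribution characterises downstream connectivity. The network, units and lognormal parameters below are invented for illustration:

```python
import random

# Hypothetical three-reach chain: reach -> (length in m, downstream reach).
network = {"r1": (2000.0, "r2"), "r2": (3000.0, "r3"), "r3": (1500.0, None)}

def travel_time(start, rng):
    """One Monte Carlo realisation of bed-load travel time (years) from a
    reach to the outlet. The per-reach virtual transport velocity is drawn
    from a lognormal standing in for flow- and grain-size-dependent rates
    computed from standard transport equations."""
    t, reach = 0.0, start
    while reach is not None:
        length, downstream = network[reach]
        velocity = rng.lognormvariate(5.0, 0.8)  # m/yr, illustrative
        t += length / velocity
        reach = downstream
    return t

rng = random.Random(42)
times = sorted(travel_time("r1", rng) for _ in range(2000))
median_time = times[len(times) // 2]
```

Comparing such distributions computed looking upstream and downstream of a reach, per travel-time domain, yields the connectivity imbalance behind the vulnerability indicators.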

  8. A Linear Mixed Model Spline Framework for Analysing Time Course ‘Omics’ Data

    PubMed Central

    Straube, Jasmin; Gorse, Alain-Dominique

    2015-01-01

    Time course ‘omics’ experiments are becoming increasingly important to study system-wide dynamic regulation. Despite their high information content, analysis remains challenging. ‘Omics’ technologies capture quantitative measurements on tens of thousands of molecules. Therefore, in a time course ‘omics’ experiment molecules are measured for multiple subjects over multiple time points. This results in a large, high-dimensional dataset, which requires computationally efficient approaches for statistical analysis. Moreover, methods need to be able to handle missing values and various levels of noise. We present a novel, robust and powerful framework to analyze time course ‘omics’ data that consists of three stages: quality assessment and filtering, profile modelling, and analysis. The first step consists of removing molecules for which expression or abundance is highly variable over time. The second step models each molecular expression profile in a linear mixed model framework which takes into account subject-specific variability. The best model is selected through a serial model selection approach and results in dimension reduction of the time course data. The final step includes two types of analysis of the modelled trajectories, namely, clustering analysis to identify groups of correlated profiles over time, and differential expression analysis to identify profiles which differ over time and/or between treatment groups. Through simulation studies we demonstrate the high sensitivity and specificity of our approach for differential expression analysis. We then illustrate how our framework can bring novel insights on two time course ‘omics’ studies in breast cancer and kidney rejection. The methods are publicly available, implemented in the R CRAN package lmms. PMID:26313144
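
The serial model selection step, starting simple and accepting extra flexibility only when it pays off, can be caricatured as a flat-mean versus straight-line choice on one molecule's profile. The real framework fits linear mixed model splines with subject-specific effects (package lmms); this stdlib sketch does not attempt that:

```python
import statistics

def fit_profile(times, values, improvement=0.5):
    """Serial model selection in miniature for one molecule's time course:
    start from the simplest model (flat mean) and accept a straight-line
    trend only if it cuts the residual sum of squares by the given
    fraction (threshold is illustrative)."""
    tbar = statistics.fmean(times)
    vbar = statistics.fmean(values)
    rss_flat = sum((v - vbar) ** 2 for v in values)
    sxx = sum((t - tbar) ** 2 for t in times)
    slope = sum((t - tbar) * (v - vbar) for t, v in zip(times, values)) / sxx
    intercept = vbar - slope * tbar
    rss_line = sum((v - (slope * t + intercept)) ** 2
                   for t, v in zip(times, values))
    if rss_flat == 0.0 or rss_line > (1.0 - improvement) * rss_flat:
        return ("flat", vbar)
    return ("linear", slope, intercept)
```

Keeping the simplest adequate model per molecule is what gives the dimension reduction of the time course data before clustering and differential expression analysis.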

  9. Multi-object segmentation framework using deformable models for medical imaging analysis.

    PubMed

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  11. Verification of Advances in a Coupled Snow-runoff Modeling Framework for Operational Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2011-12-01

    The National Oceanic and Atmospheric Administration's (NOAA's) River Forecast Centers (RFCs) issue hydrologic forecasts related to flood events, reservoir operations for water supply, streamflow regulation, and recreation on the nation's streams and rivers. The RFCs use the National Weather Service River Forecast System (NWSRFS) for streamflow forecasting, which relies on a coupled snow model (i.e. SNOW17) and rainfall-runoff model (i.e. SAC-SMA) in snow-dominated regions of the US. Errors arise in various steps of the forecasting system from input data, model structure, model parameters, and initial states. The goal of the current study is to undertake verification of potential improvements in the SNOW17-SAC-SMA modeling framework developed for operational streamflow forecasts. We undertake verification for a range of parameter sets (i.e. RFC, DREAM (Differential Evolution Adaptive Metropolis)) as well as a data assimilation (DA) framework developed for the coupled models. Verification is also undertaken for various initial conditions to observe the influence of variability in initial conditions on the forecast. The study basin is the North Fork American River Basin (NFARB) located on the western side of the Sierra Nevada Mountains in northern California. Hindcasts are verified using both deterministic (i.e. Nash-Sutcliffe efficiency, root mean square error, and joint distribution) and probabilistic (i.e. reliability diagram, discrimination diagram, containing ratio, and quantile plots) statistics. Our presentation includes comparison of the performance of different optimized parameters and the DA framework as well as assessment of the impact associated with the initial conditions used for streamflow forecasts for the NFARB.

  12. A Python Plug-in Based Computational Framework for Spatially Distributed Environmental and Earth Sciences Modelling

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.

    2009-12-01

    One of the pioneering landform evolution models, SIBERIA, though developed in the 1980s, is still widely used in the science community and is a key component of engineering software used to assess the long-term stability of man-made landforms such as rehabilitated mine sites and nuclear waste repositories. While SIBERIA is very reliable, computationally fast and well tested (both its underlying science and the computer code), the range of emerging applications has challenged the ability of the author to maintain and extend the underlying computer code. Moreover, the architecture of the SIBERIA code is not well suited to collaborative extension of its capabilities without often triggering forking of the code base. This paper describes a new modelling framework, called TelluSim, designed to supersede SIBERIA (as well as other earth sciences codes by the author). TelluSim is potentially more than simply a new landform evolution model: it is a more general dynamical system modelling framework using time-evolving GIS data as its spatial discretisation. TelluSim is designed as an open modular framework facilitating open-sourcing of the code, while addressing compromises made in the original design of SIBERIA in the 1980s. An important aspect of the design of TelluSim was to minimise the overhead in interfacing the modules with TelluSim, and to minimise any requirement for recoding of existing software, so eliminating a major disadvantage of more complex frameworks. The presentation will discuss in more detail the reasoning behind the design of TelluSim, and experiences of the advantages and disadvantages of using Python relative to other approaches (e.g. Matlab, R). The paper will discuss examples of how TelluSim has facilitated the incorporation and testing of new algorithms and environmental processes, and the support for novel science and data testing methodologies. It will also discuss plans to link TelluSim with other open source

  13. A versatile platform for multilevel modeling of physiological systems: template/instance framework for large-scale modeling and simulation.

    PubMed

    Asai, Yoshiyuki; Abe, Takeshi; Oka, Hideki; Okita, Masao; Okuyama, Tomohiro; Hagihara, Ken-Ichi; Ghosh, Samik; Matsuoka, Yukiko; Kurachi, Yoshihisa; Kitano, Hiroaki

    2013-01-01

    Building multilevel models of physiological systems is a significant and effective method for integrating a huge amount of bio-physiological data and knowledge obtained by earlier experiments and simulations. Since such models tend to be large in size and complicated in structure, appropriate software frameworks for supporting modeling activities are required. A software platform, PhysioDesigner, has been developed, which supports the process of creating multilevel models. Models developed on PhysioDesigner are stored in an XML format called PHML. Every physiological entity in a model is represented as a module, and hence a model constitutes an aggregation of modules. When the number of entities comprising the model is large, it is difficult to manage the entities manually, and some semiautomatic assistive functions are necessary. This article introduces the PhysioDesigner platform, focusing particularly on recently developed features for building large-scale models that utilize a template/instance framework and morphological information.

  14. Short term global health experiences and local partnership models: a framework.

    PubMed

    Loh, Lawrence C; Cherniak, William; Dreifuss, Bradley A; Dacso, Matthew M; Lin, Henry C; Evert, Jessica

    2015-01-01

    Contemporary interest in short-term experiences in global health (STEGH) has led to important questions of ethics, responsibility, and potential harms to receiving communities. In addressing these issues, the role of local engagement through partnerships between external STEGH facilitating organization(s) and internal community organization(s) has been identified as crucial to mitigating potential pitfalls. This perspective piece offers a framework to categorize different models of local engagement in STEGH based on professional experiences and a review of the existing literature. This framework will encourage STEGH stakeholders to consider partnership models in the development and evaluation of new or existing programs. The proposed framework examines the community context in which STEGH may occur, and considers three broad categories: number of visiting external groups conducting STEGH (single/multiple), number of host entities that interact with the STEGH (none/single/multiple), and frequency of STEGH (continuous/intermittent). These factors culminate in a specific model that provides a description of opportunities and challenges presented by each model. Considering different models, single visiting partners, working without a local partner on an intermittent (or even one-time) basis provided the greatest flexibility to the STEGH participants, but represented the least integration locally and subsequently the greatest potential harm for the receiving community. Other models, such as multiple visiting teams continuously working with a single local partner, provided an opportunity for centralization of efforts and local input, but required investment in consensus-building and streamlining of processes across different groups. We conclude that involving host partners in the design, implementation, and evaluation of STEGH requires more effort on the part of visiting STEGH groups and facilitators, but has the greatest potential benefit for meaningful, locally

  15. A framework for weighted fusion of multiple statistical models of shape and appearance.

    PubMed

    Butakoff, Constantine; Frangi, Alejandro F

    2006-11-01

    This paper presents a framework for weighted fusion of several Active Shape and Active Appearance Models. The approach is based on the eigenspace fusion method proposed by Hall et al., which has been extended to fuse more than two weighted eigenspaces using unbiased mean and covariance matrix estimates. To evaluate the performance of fusion, a comparative assessment on segmentation precision as well as facial verification tests are performed using the AR, EQUINOX, and XM2VTS databases. Based on the results, it is concluded that the fusion is useful when the model needs to be updated online or when the original observations are absent. PMID:17063688

  17. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first order sub-basins and is therefore suited for simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of part or all of a first-order sub-basin and has the advantage of reducing computational time/effort while maintaining reasonable accuracy in simulated hydrologic states and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Processing Toolbox is used for simultaneous simulations of cross sections, further reducing computational time. The SMART workflow tasks are: 1) delineation of first order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first order sub-basin, and 5) deriving vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on Richards' equation. However, any hydrologic model can be

  18. The Cummins model: a framework for teaching nursing students for whom English is a second language.

    PubMed

    Abriam-Yago, K; Yoder, M; Kataoka-Yahiro, M

    1999-04-01

    The health care system requires nurses with the language ability and the cultural knowledge to meet the health care needs of ethnic minority immigrants. The recruitment, admission, retention, and graduation of English as a Second Language (ESL) students are essential to provide the workforce to meet the demands of the multicultural community. Yet, ESL students possess language difficulties that affect their academic achievement in nursing programs. The application of the Cummins Model of language proficiency is discussed. The Cummins Model provides a framework for nursing faculty to develop educational support that meets the learning needs of ESL students.

  19. Super-Resolution Using Hidden Markov Model and Bayesian Detection Estimation Framework

    NASA Astrophysics Data System (ADS)

    Humblot, Fabrice; Mohammad-Djafari, Ali

    2006-12-01

    This paper presents a new method for super-resolution (SR) reconstruction of a high-resolution (HR) image from several low-resolution (LR) images. The HR image is assumed to be composed of homogeneous regions. Thus, the a priori distribution of the pixels is modeled by a finite mixture model (FMM) and a Potts Markov model (PMM) for the labels. The whole a priori model is then a hierarchical Markov model. The LR images are assumed to be obtained from the HR image by lowpass filtering, arbitrary translation, decimation, and finally corruption by a random noise. The problem is then put in a Bayesian detection and estimation framework, and appropriate algorithms are developed based on Markov chain Monte Carlo (MCMC) Gibbs sampling. At the end, we have not only an estimate of the HR image but also an estimate of the classification labels which leads to a segmentation result.
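
    The assumed degradation chain (lowpass filter, translation, decimation, noise) is easy to write down; the box-blur kernel, integer shift and decimation factor below are illustrative placeholders, not the paper's settings:

```python
import numpy as np

def observe_lr(hr, shift=1, factor=2, noise_sigma=0.01, rng=None):
    """Generate one low-resolution observation from a high-resolution
    image: lowpass filter -> translate -> decimate -> add noise."""
    rng = rng or np.random.default_rng(0)
    # 3x3 box blur as a placeholder lowpass filter (edge padding)
    pad = np.pad(hr, 1, mode="edge")
    blurred = sum(pad[i:i + hr.shape[0], j:j + hr.shape[1]]
                  for i in range(3) for j in range(3)) / 9.0
    shifted = np.roll(blurred, shift, axis=(0, 1))   # integer translation
    decimated = shifted[::factor, ::factor]          # decimation
    return decimated + rng.normal(0, noise_sigma, decimated.shape)

hr = np.arange(64.0).reshape(8, 8)
lr = observe_lr(hr)
print(lr.shape)   # (4, 4)
```

The Bayesian inversion then estimates the HR image (and its labels) from several such LR observations with different shifts.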

  20. A Petri-Nets Based Unified Modeling Approach for Zachman Framework Cells

    NASA Astrophysics Data System (ADS)

    Ostadzadeh, S. Shervin; Nekoui, Mohammad Ali

    With a trend toward becoming more and more information based, enterprises constantly attempt to surpass the accomplishments of each other by improving their information activities. In this respect, Enterprise Architecture (EA) has proven to serve as a fundamental concept to accomplish this goal. Enterprise architecture clearly provides a thorough outline of the whole enterprise applications and systems with their relationships to enterprise business goals. To establish such an outline, a logical framework needs to be laid upon the entire information system, called an Enterprise Architecture Framework (EAF). Among the various proposed EAFs, the Zachman Framework (ZF) has been widely accepted as a standard scheme for identifying and organizing descriptive representations that have critical roles in enterprise management and development. One of the problems faced in using ZF is the lack of formal and verifiable models for its cells. In this paper, we propose a formal language based on Petri nets in order to obtain verifiable models for all cells in ZF. The presented method helps developers to validate and verify completely integrated business and IT systems, which in turn improves the effectiveness and efficiency of the enterprise itself.
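
    For readers unfamiliar with the formalism, the Petri-net firing rule underlying such verifiable models can be stated in a few lines; the net below is a generic two-transition example, not one of the paper's ZF-cell models:

```python
import numpy as np

# Petri net with places P0..P2 and transitions T0..T1, given as matrices:
# pre[t, p]  = tokens transition t consumes from place p
# post[t, p] = tokens transition t produces in place p
pre  = np.array([[1, 0, 0],
                 [0, 1, 0]])
post = np.array([[0, 1, 0],
                 [0, 0, 1]])

def enabled(marking, t):
    """A transition is enabled iff every input place holds enough tokens."""
    return np.all(marking >= pre[t])

def fire(marking, t):
    """Fire transition t: consume its pre-tokens, produce its post-tokens."""
    assert enabled(marking, t), "transition not enabled"
    return marking - pre[t] + post[t]

m = np.array([1, 0, 0])       # initial marking: one token in P0
m = fire(m, 0)                # T0 moves the token to P1
m = fire(m, 1)                # T1 moves it to P2
print(m)                      # [0 0 1]
```

Verification questions (reachability, deadlock) then become questions about which markings this rule can produce.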

  1. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order of 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  2. Creating "Intelligent" Climate Model Ensemble Averages Using a Process-Based Framework

    NASA Astrophysics Data System (ADS)

    Baker, N. C.; Taylor, P. C.

    2014-12-01

    The CMIP5 archive contains future climate projections from over 50 models provided by dozens of modeling centers from around the world. Individual model projections, however, are subject to biases created by structural model uncertainties. As a result, ensemble averaging of multiple models is often used to add value to model projections: consensus projections have been shown to consistently outperform individual models. Previous reports for the IPCC establish climate change projections based on an equal-weighted average of all model projections. However, certain models reproduce climate processes better than other models. Should models be weighted based on performance? Unequal ensemble averages have previously been constructed using a variety of mean state metrics. What metrics are most relevant for constraining future climate projections? This project develops a framework for systematically testing metrics in models to identify optimal metrics for unequally weighting multi-model ensembles. A unique aspect of this project is the construction and testing of climate process-based model evaluation metrics. A climate process-based metric is defined as a metric based on the relationship between two physically related climate variables—e.g., outgoing longwave radiation and surface temperature. Metrics are constructed using high-quality Earth radiation budget data from NASA's Clouds and Earth's Radiant Energy System (CERES) instrument and surface temperature data sets. It is found that regional values of tested quantities can vary significantly when comparing weighted and unweighted model ensembles. For example, one tested metric weights the ensemble by how well models reproduce the time-series probability distribution of the cloud forcing component of reflected shortwave radiation. The weighted ensemble for this metric indicates lower simulated precipitation (up to 0.7 mm/day) in tropical regions than the unweighted ensemble: since CMIP5 models have been shown to
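
    The weighting step itself reduces to a weighted mean over the model axis; the inverse-error mapping from metric skill to weight used below is one common choice, stated here as an assumption rather than the authors' exact scheme:

```python
import numpy as np

def weighted_ensemble(projections, metric_error):
    """Combine model projections with weights derived from a metric.

    projections : (n_models, ...) array of projected fields
    metric_error: (n_models,) error of each model against observations;
                  smaller error -> larger weight (inverse-error weighting,
                  one of several possible skill-to-weight mappings).
    """
    w = 1.0 / np.asarray(metric_error)
    w = w / w.sum()
    return np.tensordot(w, projections, axes=1)

proj = np.array([[1.0, 2.0],      # model A
                 [3.0, 4.0],      # model B
                 [5.0, 6.0]])     # model C
err = np.array([0.5, 1.0, 2.0])   # model A matches observations best
print(weighted_ensemble(proj, err))   # weighted mean, pulled toward model A
print(proj.mean(axis=0))              # equal-weight mean for comparison
```

Comparing the two printed fields region by region is exactly the weighted-vs-unweighted contrast the abstract reports.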

  3. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" Between Physical Experiments and Virtual Models in Biology

    NASA Astrophysics Data System (ADS)

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-08-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this study, a group of high school students designed computer models of bacterial growth with reference to a simultaneous physical experiment they were conducting, and were able to validate the correctness of their model against the results of their experiment. Our findings suggest that as the students compared their virtual models with physical experiments, they encountered "discrepant events" that contradicted their existing conceptions and elicited a state of cognitive disequilibrium. This experience of conflict encouraged students to further examine their ideas and to seek more accurate explanations of the observed natural phenomena, improving the design of their computer models.

  4. Construction of 3-D geologic framework and textural models for Cuyama Valley groundwater basin, California

    USGS Publications Warehouse

    Sweetkind, Donald S.; Faunt, Claudia C.; Hanson, Randall T.

    2013-01-01

    Groundwater is the sole source of water supply in Cuyama Valley, a rural agricultural area in Santa Barbara County, California, in the southeasternmost part of the Coast Ranges of California. Continued groundwater withdrawals and associated water-resource management concerns have prompted an evaluation of the hydrogeology and water availability for the Cuyama Valley groundwater basin by the U.S. Geological Survey, in cooperation with the Water Agency Division of the Santa Barbara County Department of Public Works. As a part of the overall groundwater evaluation, this report documents the construction of a digital three-dimensional geologic framework model of the groundwater basin suitable for use within a numerical hydrologic-flow model. The report also includes an analysis of the spatial variability of lithology and grain size, which forms the geologic basis for estimating aquifer hydraulic properties. The geologic framework was constructed as a digital representation of the interpreted geometry and thickness of the principal stratigraphic units within the Cuyama Valley groundwater basin, which include younger alluvium, older alluvium, and the Morales Formation, and underlying consolidated bedrock. The framework model was constructed by creating gridded surfaces representing the altitude of the top of each stratigraphic unit from various input data, including lithologic and electric logs from oil and gas wells and water wells, cross sections, and geologic maps. Sediment grain-size data were analyzed in both two and three dimensions to help define textural variations in the Cuyama Valley groundwater basin and identify areas with similar geologic materials that potentially have fairly uniform hydraulic properties. Sediment grain size was used to construct three-dimensional textural models that employed simple interpolation between drill holes and two-dimensional textural models for each stratigraphic unit that incorporated spatial structure of the textural data.

  5. Groundwater modelling as a tool for the European Water Framework Directive (WFD) application: The Llobregat case

    NASA Astrophysics Data System (ADS)

    Vázquez-Suñé, E.; Abarca, E.; Carrera, J.; Capino, B.; Gámez, D.; Pool, M.; Simó, T.; Batlle, F.; Niñerola, J. M.; Ibáñez, X.

    The European Water Framework Directive establishes the basis for Community action in the field of water policy. Water authorities in Catalonia, together with users, are designing a management program to improve groundwater status and to assess the impact of infrastructures and city-planning activities on the aquifers and their associated natural systems. The objective is to describe the role of groundwater modelling in addressing the issues raised by the Water Framework Directive, and its application to the Llobregat Delta, Barcelona, Spain. In this case modelling was used to address the Water Framework Directive in the following: (1) Characterisation of aquifers and the status of groundwater by integration of existing knowledge and new hydrogeological information. Inverse modelling allowed us to reach an accurate description of the paths and mechanisms for the evolution of seawater intrusion. (2) Quantification of groundwater budget (mass balance). This is especially relevant for those terms that are difficult to assess, such as recharge from river infiltration during floods, which we have found to be very important. (3) Evaluation of groundwater-related environmental needs in aquatic ecosystems. The model allows quantifying groundwater input under natural conditions, which can be used as a reference level for stressed conditions. (4) Evaluation of possible impacts of territory planning (Llobregat river course modification, new railway tunnels, airport and docks enlargement, etc.). (5) Definition of management areas. (6) The assessment of possible future scenarios combined with optimization processes to quantify sustainable pumping rates and design measures to control seawater intrusion. The resulting model has been coupled to a user-friendly interface to allow water managers to design and address corrective measures in an agile and effective way.

  6. Modeling Plan-Related Clinical Complications Using Machine Learning Tools in a Multiplan IMRT Framework

    SciTech Connect

    Zhang, Hao H.; D'Souza, Warren D.; Shi, Leyuan; Meyer, Robert R.

    2009-08-01

    Purpose: To predict organ-at-risk (OAR) complications as a function of dose-volume (DV) constraint settings without explicit plan computation in a multiplan intensity-modulated radiotherapy (IMRT) framework. Methods and Materials: Several plans were generated by varying the DV constraints (input features) on the OARs (multiplan framework), and the DV levels achieved by the OARs in the plans (plan properties) were modeled as a function of the imposed DV constraint settings. OAR complications were then predicted for each of the plans by using the imposed DV constraints alone (features) or in combination with modeled DV levels (plan properties) as input to machine learning (ML) algorithms. These ML approaches were used to model two OAR complications after head-and-neck and prostate IMRT: xerostomia, and Grade 2 rectal bleeding. Two-fold cross-validation was used for model verification and mean errors are reported. Results: Errors for modeling the achieved DV values as a function of constraint settings were 0-6%. In the head-and-neck case, the mean absolute prediction error of the saliva flow rate normalized to the pretreatment saliva flow rate was 0.42% with a 95% confidence interval of (0.41-0.43%). In the prostate case, an average prediction accuracy of 97.04% with a 95% confidence interval of (96.67-97.41%) was achieved for Grade 2 rectal bleeding complications. Conclusions: ML can be used for predicting OAR complications during treatment planning allowing for alternative DV constraint settings to be assessed within the planning framework.
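
    Modeling achieved DV levels as a function of imposed constraint settings is, at its core, a regression across plans; the sketch below uses synthetic numbers and a plain linear least-squares fit where the paper used richer machine-learning algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "multiplan" data: each row holds one plan's imposed DV
# constraint settings; the target is the DV level the OAR achieved.
n_plans, n_constraints = 40, 3
X = rng.uniform(20, 60, size=(n_plans, n_constraints))   # constraint settings (Gy)
true_w = np.array([0.5, 0.3, 0.1])
y = X @ true_w + rng.normal(0, 0.5, n_plans)             # achieved DV level

# Fit on the first half of the plans, evaluate on the held-out half
# (a stand-in for the paper's two-fold cross-validation).
half = n_plans // 2
w, *_ = np.linalg.lstsq(X[:half], y[:half], rcond=None)
pred = X[half:] @ w
mae = np.mean(np.abs(pred - y[half:]))
print(f"held-out mean absolute error: {mae:.2f} Gy")
```

In the same spirit, the predicted plan properties can then feed a second model mapping DV levels to complication probabilities.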

  7. Inferring landscape effects on gene flow: a new model selection framework.

    PubMed

    Shirk, A J; Wallin, D O; Cushman, S A; Rice, C G; Warheit, K I

    2010-09-01

    Populations in fragmented landscapes experience reduced gene flow, lose genetic diversity over time and ultimately face greater extinction risk. Improving connectivity in fragmented landscapes is now a major focus of conservation biology. Designing effective wildlife corridors for this purpose, however, requires an accurate understanding of how landscapes shape gene flow. Most landscape resistance models generated to date are subjectively parameterized based on expert opinion or proxy measures of gene flow. While the relatively few studies that use genetic data are more rigorous, the frameworks they employ frequently yield models only weakly related to the observed patterns of genetic isolation. Here, we describe a new framework that uses expert opinion as a starting point. By systematically varying each model parameter, we sought either to validate the assumptions of expert opinion or to identify a peak of support for a new model more highly related to genetic isolation. This approach also accounts for interactions between variables, allows for nonlinear responses and excludes variables that reduce model performance. We demonstrate its utility on a population of mountain goats inhabiting a fragmented landscape in the Cascade Range, Washington. PMID:20723066
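The parameter-scan idea described above can be sketched numerically: start from an expert-opinion resistance parameter, vary it systematically, and keep the value whose modeled cost distances correlate best with genetic distances. Everything below is illustrative (synthetic distances, a single resistance exponent), not the authors' code.

```python
import numpy as np

# Synthetic pairwise data: geographic distances and mean landscape cost
# per population pair, with "genetic distances" generated from a hidden
# true landscape-effect exponent.
rng = np.random.default_rng(1)
base_dist = rng.uniform(1.0, 10.0, size=45)
habitat_cost = rng.uniform(1.0, 3.0, size=45)
true_exp = 2.0
genetic = base_dist * habitat_cost ** true_exp + rng.normal(0.0, 1.0, size=45)

def support(p):
    """Correlation between modeled cost distance and genetic distance."""
    return np.corrcoef(base_dist * habitat_cost ** p, genetic)[0, 1]

# Expert opinion might say p = 1; scan around it for a peak of support.
grid = np.arange(0.0, 4.01, 0.25)
best_p = float(grid[np.argmax([support(p) for p in grid])])
print(best_p)
```

With low noise the peak of support recovers the hidden exponent rather than the expert's starting value, which is exactly the validation-or-revision logic of the framework.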

  8. A Modeling Framework to Incorporate Effects of Infrastructure in Sociohydrological Systems

    NASA Astrophysics Data System (ADS)

    Muneepeerakul, R.

    2014-12-01

    In studying coupled natural-human systems, most modeling efforts focus on humans and the natural resources. In reality, however, humans rarely interact with these resources directly; the relationships between humans and resources are mediated by infrastructures. In sociohydrological systems, these include, for example, dams and irrigation canals. These infrastructures have important characteristics such as threshold behavior and a separate entity/organization tasked with maintaining them. These characteristics influence social dynamics within the system, which in turn determine the state of infrastructure and water usage, thereby exerting feedbacks onto the hydrological processes. Infrastructure is thus a necessary ingredient for modeling the co-evolution of humans and water in sociohydrological systems. A conceptual framework to address this gap has been proposed by Anderies, Janssen, and Ostrom (2004). Here we develop a model to operationalize the framework and report some preliminary results. Simple in its setup, the model highlights the structure of the social dilemmas and how they affect the system's sustainability. The model also offers a platform to explore how the system's sustainability may respond to external shocks from globalization and global climate change.
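The threshold behavior and maintenance feedback described above can be captured in a toy numerical sketch. All parameter values here are invented for illustration and are not taken from the cited framework: infrastructure decays every year, is rebuilt by a maintenance share of the water benefit, and only delivers water while its condition stays above a threshold.

```python
# Toy sketch of infrastructure-mediated dynamics (illustrative only).

def simulate(maintenance_share, years=50):
    infra, delivered = 1.0, 0.0              # condition in [0, 1]; cumulative benefit
    for _ in range(years):
        water = 1.0 if infra >= 0.5 else 0.0  # threshold behavior
        delivered += water * (1.0 - maintenance_share)
        infra = min(1.0, 0.8 * infra + maintenance_share * water)
    return delivered

# Under-maintenance lets the infrastructure fall through the threshold
# and the system collapses; adequate maintenance sustains delivery.
print(simulate(0.05) < simulate(0.2))
```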

  10. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

    USGS Publications Warehouse

    Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

    2014-01-01

    Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.

  11. A second gradient theoretical framework for hierarchical multiscale modeling of materials

    SciTech Connect

    Luscher, Darby J; Bronkhorst, Curt A; Mc Dowell, David L

    2009-01-01

    A theoretical framework for the hierarchical multiscale modeling of the inelastic response of heterogeneous materials has been presented. Within this multiscale framework, the second gradient is used as a nonlocal kinematic link between the response of a material point at the coarse scale and the response of a neighborhood of material points at the fine scale. Kinematic consistency between these scales results in specific requirements for constraints on the fluctuation field. The wryness tensor serves as a second-order measure of strain. The nature of the second-order strain induces anti-symmetry in the first-order stress at the coarse scale. The multiscale internal state variable (ISV) constitutive theory is couched in the coarse-scale intermediate configuration, from which an important new concept in scale transitions emerges, namely scale invariance of dissipation. Finally, a strategy for developing meaningful kinematic ISVs and the proper free energy functions and evolution kinetics is presented.

  12. Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support

    NASA Astrophysics Data System (ADS)

    Djokic, D.; Noman, N.; Kopp, S.

    2015-12-01

    Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open-source Python tools that build on core ArcGIS functionality and use geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to the local hydraulic scale, post-process the hydraulic modeling results, and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end user products. This framework
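The rating-curve step mentioned above, turning a routed flow into a flood depth, can be sketched with a power-law rating. The coefficients below are invented for illustration; real synthetic rating curves would come from channel geometry and ancillary data.

```python
# Minimal sketch of the rating-curve inversion step.

def depth_from_flow(q, a=5.0, b=1.6):
    """Invert Q = a * h**b for stage h (m); q in m3/s. a, b illustrative."""
    return (q / a) ** (1.0 / b)

# Routed RAPID-style flows for three forecast time steps -> depths
# usable for floodplain extent mapping at each step.
flows = [10.0, 40.0, 160.0]
depths = [round(depth_from_flow(q), 2) for q in flows]
print(depths)
```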

  13. A New Framework for Effective and Efficient Global Sensitivity Analysis of Earth and Environmental Systems Models

    NASA Astrophysics Data System (ADS)

    Razavi, Saman; Gupta, Hoshin

    2015-04-01

    Earth and Environmental Systems (EES) models are essential components of research, development, and decision-making in science and engineering disciplines. With continuous advances in understanding and computing power, such models are becoming more complex with increasingly more factors to be specified (model parameters, forcings, boundary conditions, etc.). To facilitate better understanding of the role and importance of different factors in producing the model responses, the procedure known as 'Sensitivity Analysis' (SA) can be very helpful. Despite the availability of a large body of literature on the development and application of various SA approaches, two issues continue to pose major challenges: (1) Ambiguous Definition of Sensitivity - Different SA methods are based in different philosophies and theoretical definitions of sensitivity, and can result in different, even conflicting, assessments of the underlying sensitivities for a given problem, (2) Computational Cost - The cost of carrying out SA can be large, even excessive, for high-dimensional problems and/or computationally intensive models. In this presentation, we propose a new approach to sensitivity analysis that addresses the dual aspects of 'effectiveness' and 'efficiency'. By effective, we mean achieving an assessment that is both meaningful and clearly reflective of the objective of the analysis (the first challenge above), while by efficiency we mean achieving statistically robust results with minimal computational cost (the second challenge above). Based on this approach, we develop a 'global' sensitivity analysis framework that efficiently generates a newly-defined set of sensitivity indices that characterize a range of important properties of metric 'response surfaces' encountered when performing SA on EES models. Further, we show how this framework embraces, and is consistent with, a spectrum of different concepts regarding 'sensitivity', and that commonly-used SA approaches (e.g., Sobol
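The notion of a sensitivity index can be made concrete with a simple variance-based estimate, a stand-in for the framework's own indices, which the abstract does not fully specify: bin each factor, take the variance of the conditional mean of the response, and normalize by the total variance. The toy model below is invented for illustration.

```python
import numpy as np

# Toy response where factor x1 dominates; 10,000 Monte Carlo samples.
rng = np.random.default_rng(2)
x1 = rng.uniform(0.0, 1.0, 10000)
x2 = rng.uniform(0.0, 1.0, 10000)
y = 4.0 * x1 + 0.5 * x2

def first_order(x, y, bins=20):
    """Crude first-order sensitivity: Var of conditional mean / Var(y)."""
    idx = np.digitize(x, np.linspace(0.0, 1.0, bins + 1)[1:-1])
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

s1, s2 = first_order(x1, y), first_order(x2, y)
print(s1 > s2)
```

Note that the computational-cost challenge raised in the abstract is visible even here: a reliable estimate needs many model runs per factor, which is exactly what becomes prohibitive for expensive EES models.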

  14. Land-Atmosphere Coupling in the Multi-Scale Modelling Framework

    NASA Astrophysics Data System (ADS)

    Kraus, P. M.; Denning, S.

    2015-12-01

    The Multi-Scale Modeling Framework (MMF), in which cloud-resolving models (CRMs) are embedded within general circulation model (GCM) gridcells to serve as the model's cloud parameterization, has offered a number of benefits to GCM simulations. The coupling of these cloud-resolving models directly to land surface model instances, rather than passing averaged atmospheric variables to a single instance of a land surface model, is the logical next step in model development and has recently been accomplished. This new configuration offers conspicuous improvements to estimates of precipitation and canopy through-fall, but overall the model exhibits warm surface temperature biases and low productivity. This work presents modifications to a land-surface model that take advantage of the new multi-scale modeling framework and accommodate the change in spatial scale from a typical GCM range of ~200 km to the CRM grid scale of 4 km. A parameterization is introduced to apportion modeled surface radiation into direct-beam and diffuse components. The diffuse component is then distributed among the land-surface model instances within each GCM cell domain. This substantially reduces the number of excessively low light values provided to the land-surface model when cloudy conditions are modeled in the CRM, associated with its 1-D radiation scheme. The small spatial scale of the CRM, ~4 km, as compared with the typical ~200 km GCM scale, provides much more realistic estimates of precipitation intensity; this permits the elimination of a model parameterization of canopy through-fall. However, runoff at such scales can no longer be considered as an immediate flow to the ocean. Allowing sub-surface water flow between land-surface instances within the GCM domain affords better realism and also reduces temperature and productivity biases. The MMF affords a number of opportunities to land-surface modelers, providing both the advantages of direct simulation at the 4 km scale and a much reduced
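The direct/diffuse apportioning step can be sketched with a simple cloud-fraction rule. The 0.2 clear-sky floor and 0.7 slope below are illustrative placeholders, not the parameterization used in the MMF work.

```python
# Hedged sketch of partitioning surface shortwave into direct-beam and
# diffuse components as a function of cloud fraction.

def partition_shortwave(total_sw, cloud_frac):
    """Return (direct, diffuse) in W/m2; cloudier skies are more diffuse."""
    diffuse = total_sw * (0.2 + 0.7 * cloud_frac)
    return total_sw - diffuse, diffuse

direct, diffuse = partition_shortwave(600.0, 0.5)
print(round(direct), round(diffuse))
```

The diffuse component is what would then be spread among the land-surface instances within a GCM cell, smoothing out the CRM's 1-D radiation extremes.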

  15. Exact matrix treatment of an osmotic ensemble model of adsorption and pressure induced structural transitions in metal organic frameworks.

    PubMed

    Dunne, Lawrence J; Manos, George

    2016-03-14

    Here we present an exactly treated, quasi-one-dimensional statistical mechanical osmotic ensemble model of pressure- and adsorption-induced breathing structural transformations of metal-organic frameworks (MOFs). The treatment uses a transfer matrix method. The model successfully reproduces the gas- and pressure-induced structural changes which are observed experimentally in MOFs. The model treatment presented here is a significant step towards analytical statistical mechanical treatments of flexible metal-organic frameworks.
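The transfer-matrix method itself can be illustrated on the textbook one-dimensional lattice gas, which shares the quasi-one-dimensional spirit of the paper but is not the authors' MOF osmotic-ensemble model: each site is empty or occupied, with fugacity z and a nearest-neighbour interaction weight w between occupied sites.

```python
import numpy as np

# Transfer matrix for a 1D lattice gas (illustrative values of z, w).
z, w = 1.5, 2.0
T = np.array([[1.0,      z ** 0.5],
              [z ** 0.5, z * w   ]])

# The grand partition function of an N-site ring is Tr(T^N), i.e. the
# sum of the transfer-matrix eigenvalues raised to the N-th power.
N = 10
Z = np.sum(np.linalg.eigvalsh(T) ** N)
print(Z > 0)
```

In the thermodynamic limit only the largest eigenvalue matters, which is what makes the exact analytical treatment tractable.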

  16. A statistical framework for the validation of a population exposure model based on personal exposure data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations and infiltration of outdoor air-pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and personal exposure data should therefore become more and more accessible. In view of planned cohort field-campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.
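The assimilation idea can be sketched with a conjugate normal-normal update, standing in for the paper's full Bayesian hierarchical model: the county-scale exposure model supplies a prior for an individual, and a personal-sensor reading updates it. All numbers below are illustrative.

```python
# Conjugate normal-normal Bayesian update sketch.

def posterior(prior_mean, prior_var, obs, obs_var):
    """Return (posterior mean, posterior variance) of a normal-normal update."""
    k = prior_var / (prior_var + obs_var)
    return (prior_mean + k * (obs - prior_mean),
            prior_var * obs_var / (prior_var + obs_var))

# Population model: 30 ug/m3 with sd 5; personal sensor: 42 ug/m3 with sd 3.
mean, var = posterior(30.0, 25.0, 42.0, 9.0)
print(round(mean, 2), round(var, 2))
```

The posterior mean sits between the population estimate and the personal measurement, weighted by their precisions, and the posterior variance is smaller than either, which is the sense in which individual data "update the probability estimate".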

  17. How much cryosphere model complexity is just right? Exploration using the conceptual cryosphere hydrology framework

    NASA Astrophysics Data System (ADS)

    Mosier, Thomas M.; Hill, David F.; Sharp, Kendra V.

    2016-09-01

    Making meaningful projections of the impacts that possible future climates would have on water resources in mountain regions requires understanding how cryosphere hydrology model performance changes under altered climate conditions and when the model is applied to ungaged catchments. Further, if we are to develop better models, we must understand which specific process representations limit model performance. This article presents a modeling tool, named the Conceptual Cryosphere Hydrology Framework (CCHF), that enables implementing and evaluating a wide range of cryosphere modeling hypotheses. The CCHF represents cryosphere hydrology systems using a set of coupled process modules that allows easily interchanging individual module representations and includes analysis tools to evaluate model outputs. CCHF version 1 (Mosier, 2016) implements model formulations that require only precipitation and temperature as climate inputs - for example, variations on simple degree-index (SDI) or enhanced temperature index (ETI) formulations - because these model structures are often applied in data-sparse mountain regions, and perform relatively well over short periods, but their calibration is known to change based on climate and geography. Using CCHF, we implement seven existing and novel models, including one existing SDI model, two existing ETI models, and four novel models that utilize a combination of existing and novel module representations. The novel module representations include a heat transfer formulation with net longwave radiation and a snowpack internal energy formulation that uses an approximation of the cold content. We assess the models for the Gulkana and Wolverine glaciated watersheds in Alaska, which have markedly different climates and contain long-term US Geological Survey benchmark glaciers. Overall we find that the best-performing models are those that are more physically consistent and representative, but no single model performs best for all of our model
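The simplest model class named above, the simple degree-index (SDI) formulation, is compact enough to sketch directly: melt scales linearly with positive daily mean temperature. The degree-day factor below is an illustrative value, not one calibrated in CCHF.

```python
# Minimal simple-degree-index (SDI) melt model sketch.

def sdi_melt(temps_c, ddf=4.0):
    """Daily melt (mm w.e.) from daily mean air temperature (deg C)."""
    return [ddf * max(t, 0.0) for t in temps_c]

week = [-5.0, -1.0, 0.5, 2.0, 4.0, 3.0, -2.0]
melt = sdi_melt(week)
print(sum(melt))   # 38.0
```

The single calibrated factor `ddf` is exactly the parameter whose value is known to drift with climate and geography, which is why the article probes more physically consistent alternatives.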

  18. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    ERIC Educational Resources Information Center

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical…

  19. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  20. The Behavioral Ecological Model as a Framework for School-Based Anti-Bullying Health Promotion Interventions

    ERIC Educational Resources Information Center

    Dresler-Hawke, Emma; Whitehead, Dean

    2009-01-01

    This article presents a conceptual strategy which uses the Behavioral Ecological Model (BEM) as a health promotion framework to guide school-based bullying awareness programs and subsequent anti-bullying strategies for school nursing practice. Anti-bullying frameworks and tools are scarce despite the extent of the problem of bullying. This article…

  1. Revisions to the PARCC Model Content Frameworks for Mathematics and ELA/Literacy Based on Public Feedback

    ERIC Educational Resources Information Center

    Partnership for Assessment of Readiness for College and Careers (NJ1), 2011

    2011-01-01

    The PARCC Model Content Frameworks for Mathematics and ELA/Literacy have been developed through a state-led process in collaboration with members of the Common Core State Standards (CCSS) writing teams. The frameworks were reviewed by the public between August 3-31, 2011. Nearly 1,000 responses were collected, and respondents included K-12…

  2. Genetic Algorithm Based Framework for Automation of Stochastic Modeling of Multi-Season Streamflows

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2009-05-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for multi-season streamflow generation in hydrology are: (i) parametric models, which hypothesize the form of the periodic dependence structure and the distributional form a priori (examples are PAR and PARMA), and disaggregation models that aim to preserve the correlation structure at the periodic level and the aggregated annual level; (ii) nonparametric models (examples are bootstrap/kernel based methods), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws (k-nearest neighbor (k-NN), matched block bootstrap (MABB)), and nonparametric disaggregation models; (iii) hybrid models, which blend both parametric and nonparametric models advantageously to model streamflows effectively. Despite the many developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and critical drought characteristics has posed a persistent challenge to the stochastic modeler. This is partly because the stochastic streamflow model parameters are usually estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares (LS) estimation) and subsequently the efficacy of the models is validated based on the accuracy of prediction of the estimates of the water-use characteristics, which requires a large number of trial simulations and inspection of many plots and tables. Even then, accurate prediction of the storage and critical drought characteristics may not be ensured. In this study a multi-objective optimization framework is proposed to find the optimal hybrid model (blend of a simple parametric model, PAR(1) model and matched block
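The parametric component named above, a PAR(1) model, can be sketched as follows. All seasonal parameters are invented for illustration: each season carries its own mean, standard deviation and lag-1 coefficient, applied to a standardized autoregressive state.

```python
import numpy as np

# Illustrative seasonal parameters for a 4-season PAR(1) generator.
mu  = np.array([100.0, 60.0, 30.0, 80.0])   # seasonal means
sd  = np.array([ 20.0, 15.0,  8.0, 18.0])   # seasonal standard deviations
phi = np.array([  0.5,  0.6,  0.4,  0.3])   # seasonal lag-1 coefficients

def generate_par1(n_years, rng):
    flows, z = [], 0.0                       # z: standardized AR state
    for _ in range(n_years):
        for s in range(4):
            z = phi[s] * z + np.sqrt(1.0 - phi[s] ** 2) * rng.normal()
            flows.append(mu[s] + sd[s] * z)
    return np.array(flows)

synthetic = generate_par1(200, np.random.default_rng(3))
print(synthetic.shape)
```

Generated series like this preserve the periodic mean, variance and lag-1 correlation by construction; the storage and drought characteristics the abstract worries about are not guaranteed, which is what motivates the multi-objective calibration.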

  3. Threat driven modeling framework using petri nets for e-learning system.

    PubMed

    Khamparia, Aditya; Pandey, Babita

    2016-01-01

    Vulnerabilities at various levels are the main cause of security risks in e-learning systems. This paper presents a modified threat-driven modeling framework that identifies, after risk assessment, which threats require mitigation and how to mitigate them. Aspect-oriented stochastic Petri nets are used to model those threat mitigations. The paper includes security metrics based on vulnerabilities present in the e-learning system. The Common Vulnerability Scoring System, designed to provide a normalized method for rating vulnerabilities, is used as the basis for the metric definitions and calculations. A case study is also presented which shows the need and feasibility of using aspect-oriented stochastic Petri net models for threat modeling, improving the reliability, consistency and robustness of the e-learning system. PMID:27119050
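The Petri-net machinery behind the paper can be illustrated with a minimal, deterministic firing rule (unlike the paper's aspect-oriented stochastic nets, and with invented place and transition names): a transition fires when every one of its input places holds a token.

```python
# Generic Petri-net firing sketch with hypothetical security places.
places = {"attack_detected": 1, "mitigation_available": 1, "threat_mitigated": 0}
transitions = [{"in": ["attack_detected", "mitigation_available"],
                "out": ["threat_mitigated"]}]

def fire_enabled(places, transitions):
    for t in transitions:
        if all(places[p] > 0 for p in t["in"]):
            for p in t["in"]:
                places[p] -= 1       # consume input tokens
            for p in t["out"]:
                places[p] += 1       # produce output tokens
    return places

print(fire_enabled(places, transitions)["threat_mitigated"])   # 1
```

Stochastic Petri nets extend this by attaching firing-rate distributions to transitions, which is what lets the paper reason quantitatively about threat-mitigation timing.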

  4. How are organisational climate models and patient satisfaction related? A competing value framework approach.

    PubMed

    Ancarani, Alessandro; Di Mauro, Carmela; Giammanco, Maria Daniela

    2009-12-01

    Patient satisfaction has become an important indicator of process quality inside hospitals. Even so, the improvement of patient satisfaction cannot simply follow from the implementation of new incentives schemes and organisational arrangements; it also depends on hospitals' cultures and climates. This paper studies the impact of alternative models of organisational climate in hospital wards on patient satisfaction. Data gathered from seven public hospitals in Italy are used to explore this relationship. The theoretical approach adopted is the Competing Value Framework which classifies organisations according to their inward or outward focus and according to the importance assigned to control vs. flexibility. Results show that both a model stressing openness, change and innovation and a model emphasising cohesion and workers' morale are positively related to patient satisfaction, while a model based on managerial control is negatively associated with patient satisfaction.

  5. Feminist Framework Plus: Knitting Feminist Theories of Rape Etiology Into a Comprehensive Model.

    PubMed

    McPhail, Beverly A

    2016-07-01

    The radical-liberal feminist perspective on rape posits that the assault is motivated by power and control rather than sexual gratification and is a violent rather than a sexual act. However, rape is a complex act. Relying on only one early strand of feminist thought to explain the etiology of rape limits feminists' understanding of rape and the practice based upon the theory. The history of the adoption of the "power, not sex" theory is presented and the model critiqued. A more integrated model is developed and presented, the Feminist Framework Plus, which knits together five feminist theories into a comprehensive model that better explains the depth and breadth of the etiology of rape. Empirical evidence that supports each theory is detailed as well as the implications of the model on service provision, education, and advocacy. PMID:26018209

  9. A deep learning framework for modeling structural features of RNA-binding protein targets

    PubMed Central

    Zhang, Sai; Zhou, Jingtian; Hu, Hailin; Gong, Haipeng; Chen, Ligong; Cheng, Chao; Zeng, Jianyang

    2016-01-01

    RNA-binding proteins (RBPs) play important roles in the post-transcriptional control of RNAs. Identifying RBP binding sites and characterizing RBP binding preferences are key steps toward understanding the basic mechanisms of the post-transcriptional gene regulation. Though numerous computational methods have been developed for modeling RBP binding preferences, discovering a complete structural representation of the RBP targets by integrating their available structural features in all three dimensions is still a challenging task. In this paper, we develop a general and flexible deep learning framework for modeling structural binding preferences and predicting binding sites of RBPs, which takes (predicted) RNA tertiary structural information into account for the first time. Our framework constructs a unified representation that characterizes the structural specificities of RBP targets in all three dimensions, which can be further used to predict novel candidate binding sites and discover potential binding motifs. Through testing on the real CLIP-seq datasets, we have demonstrated that our deep learning framework can automatically extract effective hidden structural features from the encoded raw sequence and structural profiles, and predict accurate RBP binding sites. In addition, we have conducted the first study to show that integrating the additional RNA tertiary structural features can improve the model performance in predicting RBP binding sites, especially for the polypyrimidine tract-binding protein (PTB), which also provides new evidence to support the view that RBPs may have specific tertiary structural binding preferences. In particular, the tests on the internal ribosome entry site (IRES) segments yield satisfactory results with experimental support from the literature and further demonstrate the necessity of incorporating RNA tertiary structural information into the prediction model. The source code of our approach can be found in https

  10. A Multi-model Data Assimilation Framework Via Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Xue, L.; Zhang, D.

    2013-12-01

    The ensemble Kalman filter (EnKF) is a widely-used data assimilation method that has the capacity to sequentially update system parameters and states as new observations become available. One noticeable feature of EnKF is that it not only can provide optimal updates of model parameters and state variables, but also can give the uncertainty associated with them in each assimilation step. The natural system is open and complex, rendering it prone to multiple interpretations and mathematical descriptions. Bayesian model averaging (BMA) is an effective method to account for the uncertainty stemming from the model itself. In this paper, EnKF is embedded into the BMA framework, where individual posterior probability distributions of state vectors after each assimilation step are linearly integrated together through posterior model weights. A two-dimensional illustrative example is employed to demonstrate the proposed multi-model data assimilation approach via EnKF. Results show that statistical bias and uncertainty underestimation can occur when the data assimilation process relies on a single postulated model. The posterior model weight can adjust itself dynamically in time according to its consistency with observations. The performances of log conductivity estimation and head prediction are compared to the standard EnKF method based on the postulated single model and the proposed multi-model EnKF method. Comparisons show that multi-model EnKF performs better in terms of log score and coverage when sufficient observations have been assimilated.
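A single EnKF analysis step can be sketched in the standard stochastic (perturbed-observation) form; the multi-model/BMA weighting that is the paper's contribution is not shown, and the state dimensions and numbers below are synthetic.

```python
import numpy as np

# Forecast ensemble: 100 members, 2 state variables.
rng = np.random.default_rng(4)
n_ens = 100
ens = rng.normal([1.0, 5.0], [1.0, 2.0], size=(n_ens, 2))

H = np.array([[1.0, 0.0]])        # we observe only the first state variable
obs, obs_var = 2.0, 0.25

# Kalman gain from the sample covariance of the ensemble.
P = np.cov(ens.T)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_var)

# Perturbed-observation update of each member.
perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=(n_ens, 1))
ens_a = ens + (perturbed - ens @ H.T) @ K.T
print(round(float(ens_a[:, 0].mean()), 2))
```

The analysis ensemble mean moves toward the observation and its spread shrinks, which is the uncertainty quantification the abstract refers to; in the multi-model setting, each postulated model would carry its own such ensemble, weighted by BMA posterior model probabilities.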

  11. A Two-Part Mixed-Effects Modeling Framework For Analyzing Whole-Brain Network Data

    PubMed Central

    Simpson, Sean L.; Laurienti, Paul J.

    2015-01-01

    Whole-brain network analyses remain the vanguard in neuroimaging research, coming to prominence within the last decade. Network science approaches have facilitated these analyses and allowed examining the brain as an integrated system. However, statistical methods for modeling and comparing groups of networks have lagged behind. Fusing multivariate statistical approaches with network science presents the best path to develop these methods. Toward this end, we propose a two-part mixed-effects modeling framework that allows modeling both the probability of a connection (presence/absence of an edge) and the strength of a connection if it exists. Models within this framework enable quantifying the relationship between an outcome (e.g., disease status) and connectivity patterns in the brain while reducing spurious correlations through inclusion of confounding covariates. They also enable prediction about an outcome based on connectivity structure and vice versa, simulating networks to gain a better understanding of normal ranges of topological variability, and thresholding networks leveraging group information. Thus, they provide a comprehensive approach to studying system level brain properties to further our understanding of normal and abnormal brain function. PMID:25796135

  12. A two-part mixed-effects modeling framework for analyzing whole-brain network data.

    PubMed

    Simpson, Sean L; Laurienti, Paul J

    2015-06-01

    Whole-brain network analyses remain the vanguard in neuroimaging research, coming to prominence within the last decade. Network science approaches have facilitated these analyses and allowed examining the brain as an integrated system. However, statistical methods for modeling and comparing groups of networks have lagged behind. Fusing multivariate statistical approaches with network science presents the best path to develop these methods. Toward this end, we propose a two-part mixed-effects modeling framework that allows modeling both the probability of a connection (presence/absence of an edge) and the strength of a connection if it exists. Models within this framework enable quantifying the relationship between an outcome (e.g., disease status) and connectivity patterns in the brain while reducing spurious correlations through inclusion of confounding covariates. They also enable prediction about an outcome based on connectivity structure and vice versa, simulating networks to gain a better understanding of normal ranges of topological variability, and thresholding networks leveraging group information. Thus, they provide a comprehensive approach to studying system level brain properties to further our understanding of normal and abnormal brain function. PMID:25796135
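    The "two-part" idea above, a Bernoulli component for edge presence and a continuous component for edge strength, can be illustrated with a minimal likelihood sketch. This omits the mixed (random) effects and covariates of the actual framework; the data and parameter values are made up:

```python
import math

def two_part_loglik(edges, p, mu, sigma):
    """Two-part log-likelihood: each potential connection is present with
    probability p (part 1: Bernoulli); if present, its strength is modeled
    as Normal(mu, sigma) (part 2). `edges` lists strengths, with None
    marking an absent edge."""
    ll = 0.0
    for w in edges:
        if w is None:
            ll += math.log(1.0 - p)
        else:
            ll += math.log(p)
            ll += (-0.5 * math.log(2.0 * math.pi * sigma ** 2)
                   - (w - mu) ** 2 / (2.0 * sigma ** 2))
    return ll

# Three present edges (with strengths) and two absent edges
edges = [0.8, None, 1.1, 0.9, None]
# Grid search over p: the Gaussian part is constant in p, so the maximizer
# is the empirical presence probability 3/5 = 0.6
best_p = max((k / 10 for k in range(1, 10)),
             key=lambda p: two_part_loglik(edges, p, 1.0, 0.2))
```

    In the full framework both parts are regression models with subject-level random effects, so presence probability and strength can each depend on an outcome such as disease status.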

  13. Modeling Framework and Validation of a Smart Grid and Demand Response System for Wind Power Integration

    SciTech Connect

    Broeer, Torsten; Fuller, Jason C.; Tuffner, Francis K.; Chassin, David P.; Djilali, Ned

    2014-01-31

    Electricity generation from wind power and other renewable energy sources is increasing, and their variability introduces new challenges to the power system. The emergence of smart grid technologies in recent years has seen a paradigm shift in redefining the electrical system of the future, in which controlled response of the demand side is used to balance fluctuations and intermittencies from the generation side. This paper presents a modeling framework for an integrated electricity system where loads become an additional resource. The agent-based model represents a smart grid power system integrating generators, transmission, distribution, loads and market. The model incorporates generator and load controllers, allowing suppliers and demanders to bid into a Real-Time Pricing (RTP) electricity market. The modeling framework is applied to represent a physical demonstration project conducted on the Olympic Peninsula, Washington, USA, and validation simulations are performed using actual dynamic data. Wind power is then introduced into the power generation mix illustrating the potential of demand response to mitigate the impact of wind power variability, primarily through thermostatically controlled loads. The results also indicate that effective implementation of Demand Response (DR) to assist integration of variable renewable energy resources requires a diversity of loads to ensure functionality of the overall system.

  14. Towards Controlling the Glycoform: A Model Framework Linking Extracellular Metabolites to Antibody Glycosylation

    PubMed Central

    Jedrzejewski, Philip M.; del Val, Ioscani Jimenez; Constantinou, Antony; Dell, Anne; Haslam, Stuart M.; Polizzi, Karen M.; Kontoravdi, Cleo

    2014-01-01

    Glycoproteins represent the largest group of the growing number of biologically-derived medicines. The associated glycan structures and their distribution are known to have a large impact on pharmacokinetics. A modelling framework was developed to provide a link from the extracellular environment and its effect on intracellular metabolites to the distribution of glycans on the constant region of an antibody product. The main focus of this work is the mechanistic in silico reconstruction of the nucleotide sugar donor (NSD) metabolic network by means of 34 species mass balances and the saturation kinetics rates of the 60 metabolic reactions involved. NSDs are the co-substrates of the glycosylation process in the Golgi apparatus and their simulated dynamic intracellular concentration profiles were linked to an existing model describing the distribution of N-linked glycan structures of the antibody constant region. The modelling framework also describes the growth dynamics of the cell population by means of modified Monod kinetics. Simulation results match well to experimental data from a murine hybridoma cell line. The result is a modelling platform which is able to describe the product glycoform based on extracellular conditions. It represents a first step towards the in silico prediction of the glycoform of a biotherapeutic and provides a platform for the optimisation of bioprocess conditions with respect to product quality. PMID:24637934
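    The growth dynamics mentioned above follow modified Monod kinetics. A minimal forward-Euler sketch of plain Monod growth with substrate limitation is shown below; the parameter values are illustrative and not taken from the murine hybridoma experiments:

```python
def monod_growth(X0, S0, mu_max, Ks, Y, dt, steps):
    """Forward-Euler integration of Monod growth kinetics:
        dX/dt = mu_max * S / (Ks + S) * X,    dS/dt = -(1/Y) * dX/dt
    X: viable cell (biomass) density, S: limiting substrate, Y: yield."""
    X, S = X0, S0
    for _ in range(steps):
        mu = mu_max * S / (Ks + S)   # specific growth rate, saturating in S
        dX = mu * X * dt
        X += dX
        S = max(S - dX / Y, 0.0)     # substrate consumed in proportion to growth
    return X, S

# Illustrative run: growth slows and stops as the substrate is exhausted
X_final, S_final = monod_growth(X0=0.1, S0=10.0, mu_max=0.5, Ks=1.0,
                                Y=0.5, dt=0.1, steps=200)
```

    The paper's framework couples such growth equations to the 34 NSD mass balances, so that extracellular conditions propagate through intracellular metabolite pools to the glycoform.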

  15. Linked Hydrologic-Hydrodynamic Model Framework to Forecast Impacts of Rivers on Beach Water Quality

    NASA Astrophysics Data System (ADS)

    Anderson, E. J.; Fry, L. M.; Kramer, E.; Ritzenthaler, A.

    2014-12-01

    The goal of NOAA's beach quality forecasting program is to use a multi-faceted approach to aid in detection and prediction of bacteria in recreational waters. In particular, our focus has been on the connection between tributary loads and bacteria concentrations at nearby beaches. While there is a clear link between stormwater runoff and beach water quality, quantifying the contribution of river loadings to nearshore bacterial concentrations is complicated due to multiple processes that drive bacterial concentrations in rivers as well as those processes affecting the fate and transport of bacteria upon exiting the rivers. In order to forecast potential impacts of rivers on beach water quality, we developed a linked hydrologic-hydrodynamic water quality framework that simulates accumulation and washoff of bacteria from the landscape, and then predicts the fate and transport of washed off bacteria from the watershed to the coastal zone. The framework includes a watershed model (IHACRES) to predict fecal indicator bacteria (FIB) loadings to the coastal environment (accumulation, wash-off, die-off) as a function of effective rainfall. These loadings are input into a coastal hydrodynamic model (FVCOM), including a bacteria transport model (Lagrangian particle), to simulate 3D bacteria transport within the coastal environment. This modeling system provides predictive tools to assist local managers in decision-making to reduce human health threats.
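    The accumulation/wash-off/die-off logic can be sketched as a toy bucket model driven by rainfall. The functional forms and coefficients below are illustrative simplifications, not the IHACRES formulation:

```python
import math

def simulate_fib_load(rainfall, accum_rate, washoff_coef, decay_rate, dt=1.0):
    """Toy accumulation/wash-off/die-off model for fecal indicator bacteria
    (FIB) stored on the landscape. Per step: constant buildup, first-order
    die-off, then a rainfall-dependent fraction is washed off to the stream.
    Returns the washed-off load at each step."""
    store, loads = 0.0, []
    for r in rainfall:
        store += accum_rate * dt                  # buildup between storms
        store *= math.exp(-decay_rate * dt)       # first-order die-off
        frac = 1.0 - math.exp(-washoff_coef * r)  # wash-off fraction from rain r
        washed = store * frac
        store -= washed
        loads.append(washed)
    return loads

# Two dry days, a storm, then another dry day: the storm flushes the store
loads = simulate_fib_load([0.0, 0.0, 10.0, 0.0],
                          accum_rate=100.0, washoff_coef=0.2, decay_rate=0.1)
```

    In the linked framework, a load series like this would be passed as the river boundary condition to the FVCOM/Lagrangian-particle transport model.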

  16. Developing a utility decision framework to evaluate predictive models in breast cancer risk estimation

    PubMed Central

    Wu, Yirong; Abbey, Craig K.; Chen, Xianqiao; Liu, Jie; Page, David C.; Alagoz, Oguzhan; Peissig, Peggy; Onitilo, Adedayo A.; Burnside, Elizabeth S.

    2015-01-01

    Abstract. Combining imaging and genetic information to predict disease presence and progression is being codified into an emerging discipline called “radiogenomics.” Optimal evaluation methodologies for radiogenomics have not been well established. We aim to develop a decision framework based on utility analysis to assess predictive models for breast cancer diagnosis. We garnered Gail risk factors, single nucleotide polymorphisms (SNPs), and mammographic features from a retrospective case-control study. We constructed three logistic regression models built on different sets of predictive features: (1) Gail, (2) Gail + Mammo, and (3) Gail + Mammo + SNP. We then generated receiver operating characteristic (ROC) curves for the three models. After assigning utility values to each category of outcomes (true negatives, false positives, false negatives, and true positives), we sought the optimal operating points on the ROC curves that achieve the maximum expected utility of breast cancer diagnosis. We performed McNemar’s test based on the threshold levels at the optimal operating points, and found that SNPs and mammographic features play a significant role in breast cancer risk estimation. Our study, comprising utility analysis and McNemar’s test, provides a decision framework for evaluating predictive models in breast cancer risk estimation. PMID:26835489
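    Choosing the ROC operating point that maximizes expected utility reduces to weighting the four outcome rates by their utilities and the disease prevalence. The sketch below uses made-up ROC points, prevalence, and utility values, not the study's:

```python
def best_operating_point(thresholds, tpr, fpr, prevalence, U):
    """Return the ROC operating point (threshold) maximizing expected
    utility, given utilities U for the four diagnostic outcomes."""
    def expected_utility(t, f):
        p = prevalence
        return (p * t * U["TP"] + p * (1.0 - t) * U["FN"]
                + (1.0 - p) * f * U["FP"] + (1.0 - p) * (1.0 - f) * U["TN"])
    i = max(range(len(thresholds)),
            key=lambda k: expected_utility(tpr[k], fpr[k]))
    return thresholds[i], expected_utility(tpr[i], fpr[i])

# Made-up 3-point ROC curve, 10% prevalence, and outcome utilities
U = {"TP": 100.0, "FN": -100.0, "FP": -20.0, "TN": 0.0}
thr, value = best_operating_point(thresholds=[0.0, 0.5, 1.0],
                                  tpr=[1.0, 0.8, 0.0],
                                  fpr=[1.0, 0.2, 0.0],
                                  prevalence=0.1, U=U)
# The middle operating point wins: high sensitivity without flagging everyone
```

    The thresholds found this way are what McNemar's test would then compare across the competing predictive models.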

  17. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian Elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift, and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) export of results to file. Modularity is achieved through well defined interfaces between components, and use of a consistent vocabulary (CF conventions) for naming of variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) to a particular file format. Instead "reader modules" can be written/used to obtain data directly from any original source, including files or through web based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotation and scaling of vectors, and model output. We will show a few example applications of OpenDrift for predicting the paths of drifters, oil spills, and search-and-rescue objects.

  18. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

    Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
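    The capillary inverse proportionality between radius and matric pressure invoked above follows from the Young-Laplace relation. A quick numerical sketch, assuming water at roughly 20 °C and zero contact angle:

```python
import math

def matric_head_from_radius(r, sigma=0.0728, theta=0.0, rho=1000.0, g=9.81):
    """Young-Laplace capillary relation: |h| = 2*sigma*cos(theta) / (rho*g*r).
    r: pore radius in m; sigma: surface tension (N/m); result in m of water."""
    return 2.0 * sigma * math.cos(theta) / (rho * g * r)

h1 = matric_head_from_radius(1e-6)     # ~14.8 m of suction for a 1-micron pore
h2 = matric_head_from_radius(0.5e-6)   # halving the radius doubles the head
```

    In the framework, a geometric ψ-R assumption like this, combined with a pore-geometry rule, is what turns a particle-size distribution into a water-retention curve.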

  19. ON JOINT DETERMINISTIC GRID MODELING AND SUB-GRID VARIABILITY CONCEPTUAL FRAMEWORK FOR MODEL EVALUATION

    EPA Science Inventory

    The general situation, (but exemplified in urban areas), where a significant degree of sub-grid variability (SGV) exists in grid models poses problems when comparing gridbased air quality modeling results with observations. Typically, grid models ignore or parameterize processes ...

  20. Framework for modeling urban restoration resilience time in the aftermath of an extreme event

    USGS Publications Warehouse

    Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor

    2015-01-01

    The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.

  1. Framework for e-learning assessment in dental education: a global model for the future.

    PubMed

    Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J

    2013-05-01

    The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. Using haptic-based simulation learning, tooth preparation exercises assessing enamel and dentine were compared with equivalent exercises on plastic teeth in manikins. Equivalence of student performance between haptic and traditional preparation methods was established, thus demonstrating the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.

  2. Atomic charges for modeling metal–organic frameworks: Why and how

    SciTech Connect

    Hamad, Said; Balestra, Salvador R.G.; Bueno-Perez, Rocio; Calero, Sofia; Ruiz-Salvador, A. Rabdel

    2015-03-15

    Atomic partial charges are parameters of key importance in the simulation of Metal–Organic Frameworks (MOFs), since Coulombic interactions decrease with distance more slowly than van der Waals interactions. Despite their relevance, there is no method to unambiguously assign charges to each atom, since atomic charges are not quantum observables. There are several methods that allow the calculation of atomic charges, most of them starting from the electronic wavefunction or the electronic density of the system, as obtained with quantum mechanics calculations. In this work, we describe the most common methods employed to calculate atomic charges in MOFs. In order to show the influence that even small variations of structure have on atomic charges, we present the results that we obtained for DMOF-1. We also discuss the effect that small variations of atomic charges have on the predicted structural properties of IRMOF-1. - Graphical abstract: We review the different methods with which to calculate atomic partial charges that can be used in force field-based calculations. We also present two examples that illustrate the influence of the geometry on the calculated charges and the influence of the charges on structural properties. - Highlights: • The choice of atomic charges is crucial in modeling adsorption and diffusion in MOFs. • Methods for calculating atomic charges in MOFs are reviewed. • We discuss the influence of the framework geometry on the calculated charges. • We discuss the influence of the framework charges on the structural properties.
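    The opening claim, that Coulombic interactions decay more slowly with distance than van der Waals interactions, is easy to quantify: the electrostatic pair energy falls off as 1/r while the attractive dispersion term falls off as 1/r^6. A minimal sketch with unit prefactors:

```python
def coulomb_energy(r, q1=1.0, q2=-1.0):
    """Coulombic pair energy, proportional to q1*q2/r (prefactor absorbed)."""
    return q1 * q2 / r

def dispersion_energy(r, c6=1.0):
    """Attractive van der Waals (dispersion) term, proportional to -C6/r**6."""
    return -c6 / r ** 6

# Doubling the separation halves the Coulomb energy but reduces the
# dispersion term by a factor of 64, so electrostatics dominate at long
# range; this is why the charge assignment matters so much in MOF simulations.
ratio_coulomb = coulomb_energy(2.0) / coulomb_energy(1.0)           # 0.5
ratio_dispersion = dispersion_energy(2.0) / dispersion_energy(1.0)  # 1/64
```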

  3. Tools and Algorithms to Link Horizontal Hydrologic and Vertical Hydrodynamic Models and Provide a Stochastic Modeling Framework

    NASA Astrophysics Data System (ADS)

    Salah, Ahmad M.; Nelson, E. James; Williams, Gustavious P.

    2010-04-01

    We present algorithms and tools we developed to automatically link an overland flow model to a hydrodynamic water quality model with different spatial and temporal discretizations. These tools run the linked models and provide a stochastic simulation framework. We also briefly present the tools and algorithms we developed to facilitate and analyze stochastic simulations of the linked models. We demonstrate the algorithms by linking the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) model for overland flow with the CE-QUAL-W2 model for water quality and reservoir hydrodynamics. GSSHA uses a two-dimensional horizontal grid while CE-QUAL-W2 uses a two-dimensional vertical grid. We implemented the algorithms and tools in the Watershed Modeling System (WMS), which allows modelers to easily create and use models. The algorithms are general and could be used with other models. Our tools create and analyze stochastic simulations to help understand uncertainty in the model application. While a number of examples of linked models exist, the ability to perform automatic, unassisted linking is a step forward and provides the framework to easily implement stochastic modeling studies.
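    One concrete sub-task in linking models with different temporal discretizations is resampling one model's output series onto the other's time steps. The helper below is a hypothetical illustration of that step, not WMS or GSSHA code:

```python
def resample_series(src_times, src_values, dst_times):
    """Linearly interpolate a time series defined at one model's output
    times onto another model's time steps. Hypothetical helper illustrating
    the temporal re-discretization needed when linking two models; endpoint
    values are held constant outside the source range."""
    out = []
    for t in dst_times:
        if t <= src_times[0]:
            out.append(src_values[0])
        elif t >= src_times[-1]:
            out.append(src_values[-1])
        else:
            for i in range(len(src_times) - 1):
                if src_times[i] <= t <= src_times[i + 1]:
                    w = (t - src_times[i]) / (src_times[i + 1] - src_times[i])
                    out.append((1.0 - w) * src_values[i] + w * src_values[i + 1])
                    break
    return out

# Coarse overland-flow output resampled onto finer hydrodynamic time steps
flows = resample_series([0.0, 1.0, 2.0], [0.0, 4.0, 2.0], [0.0, 0.5, 1.5, 2.5])
```

    Spatial linking (horizontal grid cells mapped onto vertical-grid segments) follows the same pattern with area-weighted rather than time-weighted averaging.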

  4. A Human Sensor Network Framework in Support of Near Real Time Situational Geophysical Modeling

    NASA Astrophysics Data System (ADS)

    Aulov, O.; Price, A.; Smith, J. A.; Halem, M.

    2013-12-01

    The area of Disaster Management is well established among Federal Agencies such as FEMA, EPA, NOAA and NASA. These agencies have well formulated frameworks for response and mitigation based on near real time satellite and conventional observing networks for assimilation into geophysical models. Forecasts from these models are used to communicate with emergency responders and the general public. More recently, agencies have started using social media to broadcast warnings and alerts to potentially affected communities. In this presentation, we demonstrate the added benefits of mining and assimilating the vast amounts of social media data available from heterogeneous hand held devices and social networks into established operational geophysical modeling frameworks as they apply to the five cornerstones of disaster management - Prevention, Mitigation, Preparedness, Response and Recovery. Often, in situations of extreme events, social media provide the earliest notification of adverse extreme events. However, various forms of social media data also can provide useful geolocated and time stamped in situ observations, complementary to directly sensed conventional observations. We use the concept of a Human Sensor Network, where one views social media users as carrying field-deployed "sensors" whose posts are the remotely "sensed" instrument measurements. These measurements can act as "station data" providing the resolution and coverage needed for extreme-event-specific modeling and validation. Here, we explore the use of social media through a Human Sensor Network (HSN) approach as another data input source for assimilation into geophysical models. Employing the HSN paradigm can provide useful feedback in near real time, but presents software challenges for rapid access, quality filtering, and transformation of massive social media data into formats consistent with the operational models.
As a use case scenario, we demonstrate the value of HSN for disaster management

  5. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    NASA Technical Reports Server (NTRS)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best-practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition for M&S implementation into the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement one another's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts is used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  6. Elementary Metabolite Units (EMU): a novel framework for modeling isotopic distributions

    PubMed Central

    Antoniewicz, Maciek R.; Kelleher, Joanne K.; Stephanopoulos, Gregory

    2007-01-01

    Metabolic Flux Analysis (MFA) has emerged as a tool of great significance for metabolic engineering and mammalian physiology. An important limitation of MFA, as carried out via stable isotope labeling and GC/MS and NMR measurements, is the large number of isotopomer or cumomer equations that need to be solved, especially when multiple isotopic tracers are used for the labeling of the system. This restriction reduces the ability of MFA to fully utilize the power of multiple isotopic tracers in elucidating the physiology of realistic situations comprising complex bioreaction networks. Here, we present a novel framework for the modeling of isotopic labeling systems that significantly reduces the number of system variables without any loss of information. The elementary metabolite unit (EMU) framework is based on a highly efficient decomposition method that identifies the minimum amount of information needed to simulate isotopic labeling within a reaction network using the knowledge of atomic transitions occurring in the network reactions. The functional units generated by the decomposition algorithm, called elementary metabolite units, form the new basis for generating system equations that describe the relationship between fluxes and stable isotope measurements. Isotopomer abundances simulated using the EMU framework are identical to those obtained using the isotopomer and cumomer methods, but require significantly less computation time. For a typical 13C-labeling system the total number of equations that needs to be solved is reduced by one order-of-magnitude (100s EMUs vs. 1000s isotopomers). As such, the EMU framework is most efficient for the analysis of labeling by multiple isotopic tracers. For example, analysis of the gluconeogenesis pathway with 2H, 13C, and 18O tracers requires only 354 EMUs, compared to more than 2 million isotopomers. PMID:17088092
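    The size reduction quoted above (hundreds of EMUs versus thousands of isotopomers) comes from the combinatorics: a metabolite with n labelable carbons has 2^n isotopomers, while an EMU of size k contributes only k + 1 mass-isotopomer variables. The EMU sizes chosen below are arbitrary for illustration; the actual decomposition algorithm determines which EMUs are required:

```python
def isotopomer_count(n_atoms, n_states=2):
    """Positional isotopomers for n labelable atoms (2 states: 12C/13C)."""
    return n_states ** n_atoms

def emu_variable_count(emu_sizes):
    """An EMU of size k carries k + 1 mass-isotopomer fractions (M+0..M+k);
    the system size is the sum over the EMUs the decomposition retains."""
    return sum(k + 1 for k in emu_sizes)

n_iso = isotopomer_count(6)            # a 6-carbon metabolite: 2**6 = 64
n_emu = emu_variable_count([1, 2, 3])  # three small EMUs: 2 + 3 + 4 = 9
```

    With multiple tracers (2H, 13C, 18O) the per-atom state count grows, so the isotopomer space explodes while the EMU representation stays tractable.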

  7. Quantitative Hydrogeological Framework Interpretations from Modeling Helicopter Electromagnetic Survey Data, Nebraska Panhandle

    NASA Astrophysics Data System (ADS)

    Abraham, J. D.; Ball, L. B.; Bedrosian, P. A.; Cannia, J. C.; Deszcz-Pan, M.; Minsley, B. J.; Peterson, S. M.; Smith, B. D.

    2009-12-01

    The need for allocation and management of water resources within the state of Nebraska has created a demand for innovative approaches to data collection for development of hydrogeologic frameworks to be used for 2D and 3D groundwater models. In 2008, the USGS in cooperation with the North Platte Natural Resources District, the South Platte Natural Resources District, and the University of Nebraska Conservation and Survey Division began using frequency domain helicopter electromagnetic (HEM) surveys to map selected sections of the Nebraska Panhandle. The surveys took place in selected sections of the North Platte River valley, Lodgepole Creek, and portions of the adjacent tablelands. The objective of the surveys is to map the aquifers of the area to improve understanding of the groundwater-surface water relationships and develop better hydrogeologic frameworks used in making more accurate 3D groundwater models of the area. For the HEM method to have an impact in a groundwater model at the basin scale, hydrostratigraphic units need to have detectable physical property (electrical resistivity) contrasts. When these contrasts exist within the study area and they are detectable from an airborne platform, large areas can be surveyed to rapidly generate 2D and 3D maps and models of 3D hydrogeologic features. To make the geophysical data useful to multidimensional groundwater models, numerical inversion is necessary to produce a depth-dependent physical property data set reflecting hydrogeologic features. These maps and depth images of electrical resistivity are not in themselves useful to the hydrogeologist; they need to be turned into maps and depth images of the hydrostratigraphic units and hydrogeologic features. Through a process of numerical imaging, inversion, sensitivity analysis, geological ground truthing (boreholes), and geological interpretation, hydrogeologic features are characterized. Resistivity depth sections produced from this process are used to pick

  8. Distribution-enhanced homogenization framework and model for heterogeneous elasto-plastic problems

    NASA Astrophysics Data System (ADS)

    Alleman, Coleman; Luscher, D. J.; Bronkhorst, Curt; Ghosh, Somnath

    2015-12-01

    Multi-scale computational models offer tractable means to simulate sufficiently large spatial domains comprised of heterogeneous materials by resolving material behavior at different scales and communicating across these scales. Within the framework of computational multi-scale analyses, hierarchical models enable unidirectional transfer of information from lower to higher scales, usually in the form of effective material properties. Determining explicit forms for the macroscale constitutive relations for complex microstructures and nonlinear processes generally requires numerical homogenization of the microscopic response. Conventional low-order homogenization uses results of simulations of representative microstructural domains to construct appropriate expressions for effective macroscale constitutive parameters written as a function of the microstructural characterization. This paper proposes an alternative novel approach, introduced as the distribution-enhanced homogenization framework or DEHF, in which the macroscale constitutive relations are formulated in a series expansion based on the microscale constitutive relations and moments of arbitrary order of the microscale field variables. The framework does not make any a priori assumption on the macroscale constitutive behavior being represented by a homogeneous effective medium theory. Instead, the evolution of macroscale variables is governed by the moments of microscale distributions of evolving field variables. This approach demonstrates excellent accuracy in representing the microscale fields through their distributions. An approximate characterization of the microscale heterogeneity is accounted for explicitly in the macroscale constitutive behavior. Increasing the order of this approximation results in increased fidelity of the macroscale approximation of the microscale constitutive behavior. 
By including higher-order moments of the microscale fields in the macroscale problem, micromechanical analyses do

  9. A Computational Framework for 3D Mechanical Modeling of Plant Morphogenesis with Cellular Resolution

    PubMed Central

    Gilles, Benjamin; Hamant, Olivier; Boudaoud, Arezki; Traas, Jan; Godin, Christophe

    2015-01-01

    The link between genetic regulation and the definition of form and size during morphogenesis remains largely an open question in both plant and animal biology. This is partially due to the complexity of the process, involving extensive molecular networks, multiple feedbacks between different scales of organization and physical forces operating at multiple levels. Here we present a conceptual and modeling framework aimed at generating an integrated understanding of morphogenesis in plants. This framework is based on the biophysical properties of plant cells, which are under high internal turgor pressure, and are prevented from bursting because of the presence of a rigid cell wall. To control cell growth, the underlying molecular networks must interfere locally with the elastic and/or plastic extensibility of this cell wall. We present a model in the form of a three dimensional (3D) virtual tissue, where growth depends on the local modulation of wall mechanical properties and turgor pressure. The model shows how forces generated by turgor-pressure can act both cell autonomously and non-cell autonomously to drive growth in different directions. We use simulations to explore lateral organ formation at the shoot apical meristem. Although different scenarios lead to similar shape changes, they are not equivalent and lead to different, testable predictions regarding the mechanical and geometrical properties of the growing lateral organs. Using flower development as an example, we further show how a limited number of gene activities can explain the complex shape changes that accompany organ outgrowth. PMID:25569615

  10. A computational framework for 3D mechanical modeling of plant morphogenesis with cellular resolution.

    PubMed

    Boudon, Frédéric; Chopard, Jérôme; Ali, Olivier; Gilles, Benjamin; Hamant, Olivier; Boudaoud, Arezki; Traas, Jan; Godin, Christophe

    2015-01-01

    The link between genetic regulation and the definition of form and size during morphogenesis remains largely an open question in both plant and animal biology. This is partially due to the complexity of the process, involving extensive molecular networks, multiple feedbacks between different scales of organization and physical forces operating at multiple levels. Here we present a conceptual and modeling framework aimed at generating an integrated understanding of morphogenesis in plants. This framework is based on the biophysical properties of plant cells, which are under high internal turgor pressure, and are prevented from bursting because of the presence of a rigid cell wall. To control cell growth, the underlying molecular networks must interfere locally with the elastic and/or plastic extensibility of this cell wall. We present a model in the form of a three-dimensional (3D) virtual tissue, where growth depends on the local modulation of wall mechanical properties and turgor pressure. The model shows how forces generated by turgor pressure can act both cell autonomously and non-cell autonomously to drive growth in different directions. We use simulations to explore lateral organ formation at the shoot apical meristem. Although different scenarios lead to similar shape changes, they are not equivalent and lead to different, testable predictions regarding the mechanical and geometrical properties of the growing lateral organs. Using flower development as an example, we further show how a limited number of gene activities can explain the complex shape changes that accompany organ outgrowth.

  11. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    PubMed

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates the heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework.
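
    A multistage Markov decision problem of the kind described can be solved by finite-horizon backward induction; the sketch below is illustrative only, and the states, actions, transition probabilities, and revenues are hypothetical, not taken from the paper.

```python
import numpy as np

def backward_induction(P, R, horizon):
    """Finite-horizon value iteration:
    V_t(s) = max_a [ R[a, s] + sum_s' P[a, s, s'] * V_{t+1}(s') ]."""
    V = np.zeros(P.shape[1])
    policy = []
    for _ in range(horizon):
        Q = R + P @ V                  # Q[a, s]: action value at this stage
        policy.append(Q.argmax(axis=0))
        V = Q.max(axis=0)
    return V, policy[::-1]             # start-state values, per-stage policies

# hypothetical 3-state negotiation (low / medium / high agreement likelihood)
# with two actions: 0 = concede, 1 = hold firm
P = np.array([
    [[0.9, 0.1, 0.0], [0.3, 0.6, 0.1], [0.0, 0.2, 0.8]],
    [[0.5, 0.4, 0.1], [0.1, 0.5, 0.4], [0.0, 0.1, 0.9]],
])
R = np.array([[1.0, 2.0, 3.0],         # expected revenue per (action, state)
              [0.0, 1.0, 5.0]])
V, policy = backward_induction(P, R, horizon=5)
```

    Each stage decision maximizes expected downstream revenue given the current negotiation state, which is the role the paper assigns to its heuristic stagewise decisions.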

  12. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    PubMed Central

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to the misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the prerequest context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing the negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates the heuristic decision based on the past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision, using the stochastic decision tree scenario, can maximize the revenue among the participants in the cloud service negotiation framework. PMID:26543899

  13. Framework for determining airport daily departure and arrival delay thresholds: statistical modelling approach.

    PubMed

    Wesonga, Ronald; Nabugoomu, Fabian

    2016-01-01

    The study derives a framework for assessing airport efficiency through evaluating optimal arrival and departure delay thresholds. Assumptions of airport efficiency measurements, though based upon minimum numeric values such as 15 min of turnaround time, cannot be extrapolated to determine proportions of delay-days of an airport. This study explored the concept of delay threshold to determine the proportion of delay-days as an expansion of the theory of delay and our previous work. A data-driven statistical modelling approach was applied to a limited set of determinants of daily delay at an airport. For the purpose of testing the efficacy of the threshold levels, operational data for Entebbe International Airport were used as a case study. Findings show differences in the proportions of delay at departure (μ = 0.499; 95 % CI = 0.023) and arrival (μ = 0.363; 95 % CI = 0.022). A multivariate logistic model confirmed an optimal daily departure and arrival delay threshold of 60 % for the airport given the four probable thresholds {50, 60, 70, 80}. The decision for the threshold value was based on the number of significant determinants, the goodness of fit statistics based on the Wald test and the area under the receiver operating curves. These findings propose a modelling framework that generates information relevant to Air Traffic Management in the planning and measurement of airport operational efficiency. PMID:27441145
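
    The threshold-selection step (label each day a delay-day when its delay proportion exceeds a candidate threshold, then keep the threshold whose labels a score discriminates best by area under the ROC curve) can be sketched as below. The data and score are synthetic, and the Wald-test part of the paper's criterion is omitted.

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    if len(pos) == 0 or len(neg) == 0:
        return 0.5                      # degenerate labeling: no discrimination
    gt = (pos[:, None] > neg[None, :]).mean()
    eq = (pos[:, None] == neg[None, :]).mean()
    return gt + 0.5 * eq

def best_delay_threshold(delay_prop, score, thresholds=(0.5, 0.6, 0.7, 0.8)):
    """For each candidate threshold t, label days with delay proportion > t
    as delay-days and score the labeling by AUC; return the best t."""
    aucs = {t: roc_auc(score, (delay_prop > t).astype(int)) for t in thresholds}
    return max(aucs, key=aucs.get), aucs
```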

  16. Molecular separations with breathing metal-organic frameworks: modelling packed bed adsorbers.

    PubMed

    Van Assche, Tom R C; Baron, Gino V; Denayer, Joeri F M

    2016-03-14

    Various metal-organic framework (MOF) adsorbents show peculiar adsorption behaviour as they can adopt different crystal phases, each phase with its own adsorption characteristics. Besides external stimuli such as temperature or light, different species of guest adsorbate can trigger a transition (breathing) of the host structure at a different pressure. Such phase transitions also occur during dynamic separations on a packed bed of adsorbent, where the concentrations of the adsorbates vary throughout axial column distance and time. This work presents a general strategy to model the adsorption behaviour of such phase-changing adsorbents during column separations and focuses on remarkable model predictions for pure components and binary mixtures in diluted and non-diluted conditions. During binary breakthrough experiments, the behaviour of flexible adsorbents is quite complex. A succession of complete or even partial phase transformations (resulting in phase coexistence) can occur during the adsorption process. A variety of unusual breakthrough profiles is observed for diluted binary mixtures. Simulations reveal at least five types of breakthrough profiles to emerge. The occurrence of these cases can be rationalized by the hodograph technique, combined with the phase diagram of the adsorbent. The remarkable experimental breakthrough profiles observed for ortho-xylene/ethylbenzene (diluted) and CO2/CH4 (non-diluted) separation on the flexible MIL-53 framework can be rationalized by application of the proposed model strategy. PMID:26885972
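
    A toy version of a "breathing" isotherm consistent with the description above: below a guest-dependent transition pressure the host is modelled in one phase, above it in the other, each with its own Langmuir parameters. All numbers are illustrative and are not fitted MIL-53 values.

```python
def langmuir(p, q_sat, b):
    """Single-site Langmuir isotherm: loading at pressure p."""
    return q_sat * b * p / (1.0 + b * p)

def breathing_isotherm(p, p_trans, narrow=(2.0, 5.0), large=(8.0, 1.0)):
    """Stepped isotherm of a phase-changing host: narrow-pore Langmuir
    parameters below the transition pressure, large-pore above."""
    q_np, b_np = narrow
    q_lp, b_lp = large
    return langmuir(p, q_np, b_np) if p < p_trans else langmuir(p, q_lp, b_lp)
```

    The resulting step in loading at the transition pressure is what produces the unusual breakthrough profiles the abstract describes when such an isotherm is embedded in a column model.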

  17. The health belief model as a conceptual framework for explaining contraceptive compliance.

    PubMed

    Katatsky, M E

    1977-01-01

    The problem of compliance in family planning is discussed in relation to the lack of theoretical and conceptual clarity in research, which has produced contradictory and often inconsistent findings. Current research has contributed little to explaining the phenomenon of compliance and to directing further research. The Health Belief Model, which has been demonstrated to have application in the areas of preventive health behavior and compliance with medical regimens, is offered as a potentially useful conceptual framework for family planning research. The generalization of the Health Belief Model to family planning behavior is seen as an outgrowth of the theoretical strength of the Model and the similarities between family planning and other health behavior.

  18. From terrestrial to aquatic fluxes: Integrating stream dynamics within a dynamic global vegetation modeling framework

    NASA Astrophysics Data System (ADS)

    Hoy, Jerad; Poulter, Benjamin; Emmett, Kristen; Cross, Molly; Al-Chokhachy, Robert; Maneta, Marco

    2016-04-01

    Integrated terrestrial ecosystem models simulate the dynamics and feedbacks between climate, vegetation, disturbance, and hydrology and are used to better understand biogeography and biogeochemical cycles. Extending dynamic vegetation models to the aquatic interface requires coupling surface and sub-surface runoff to catchment routing schemes and has the potential to enhance how researchers and managers investigate how changes in the environment might impact the availability of water resources for human and natural systems. In an effort towards creating such a coupled model, we developed a catchment-based hydrologic routing and stream temperature model to pair with LPJ-GUESS, a dynamic global vegetation model. LPJ-GUESS simulates detailed stand-level vegetation dynamics such as growth, carbon allocation, and mortality, as well as various physical and hydrologic processes such as canopy interception and through-fall, and can be applied at small spatial scales, i.e., 1 km. We demonstrate how the coupled model can be used to investigate the effects of transient vegetation dynamics and CO2 on seasonal and annual stream discharge and temperature regimes. As a direct management application, we extend the modeling framework to predict habitat suitability for fish habitat within the Greater Yellowstone Ecosystem, a 200,000 km2 region that provides critical habitat for a range of aquatic species. The model is used to evaluate, quantitatively, the effects of management practices aimed to enhance hydrologic resilience to climate change, and benefits for water storage and fish habitat in the coming century.
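
    The coupling step alone (catchment routing plus stream temperature, not LPJ-GUESS itself) can be sketched with a single linear reservoir that routes runoff while stream temperature relaxes toward air temperature; the parameters k and tau are invented for illustration.

```python
import numpy as np

def route_and_temperature(runoff, air_temp, k=5.0, tau=3.0, dt=1.0):
    """Single linear reservoir (dS/dt = R - Q, with Q = S/k) plus first-order
    relaxation of stream temperature toward air temperature."""
    S, T = 0.0, air_temp[0]
    Q, Tw = [], []
    for R, Ta in zip(runoff, air_temp):
        S += dt * (R - S / k)          # reservoir storage update
        T += dt * (Ta - T) / tau       # thermal relaxation
        Q.append(S / k)
        Tw.append(T)
    return np.array(Q), np.array(Tw)
```

    In a coupled configuration the runoff series would come from the vegetation model's water balance, so transient vegetation change propagates into discharge and thermal regimes, which is the linkage the abstract describes.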

  19. Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration

    NASA Technical Reports Server (NTRS)

    Groce, Alex; Joshi, Rajeev

    2008-01-01

    Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
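
    The core idea, one test harness with pluggable nondeterminism, can be sketched in Python; the paper's actual framework generates SPIN/PROMELA models and C random testers, so everything below is an invented analogue of that single-harness design.

```python
import itertools
import random

def random_runs(harness, n, seed=0):
    """Explore like a random tester: each run resolves every choice at random."""
    rng = random.Random(seed)
    return [harness(lambda opts: rng.choice(opts)) for _ in range(n)]

def exhaustive_runs(harness, domains):
    """Explore like a model checker: enumerate every combination of choices.
    `domains` lists, in call order, the option set for each choose() call."""
    results = []
    for combo in itertools.product(*domains):
        it = iter(combo)
        results.append(harness(lambda opts: next(it)))  # combo fixes each choice
    return results

# a harness expresses nondeterminism through a single choose() callback
harness = lambda choose: choose([1, 2]) + choose([10, 20])
```

    Because both strategies drive the same harness over the same input ranges, their coverage is directly comparable, which is the point of the common framework.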

  20. An implementation framework for wastewater treatment models requiring a minimum programming expertise.

    PubMed

    Rodríguez, J; Premier, G C; Dinsdale, R; Guwy, A J

    2009-01-01

    Mathematical modelling in environmental biotechnology has traditionally been a difficult resource to access for researchers and students without programming expertise. The great degree of flexibility required from model implementation platforms to be suitable for research applications restricts their use to expert programmers. More user-friendly software packages, however, do not normally incorporate the necessary flexibility for most research applications. This work presents a methodology based on Excel and Matlab-Simulink for both flexible and accessible implementation of mathematical models by researchers with and without programming expertise. The models are almost fully defined in an Excel file in which the names and values of the state variables and parameters are easily created. This information is automatically processed in Matlab to create the model structure and almost immediate model simulation, after only a minimum Matlab code definition, is possible. The framework proposed also provides researchers with programming expertise a highly flexible and modifiable platform on which to base more complex model implementations. The method takes advantage of structural generalities in most mathematical models of environmental bioprocesses while enabling the integration of advanced elements (e.g. heuristic functions, correlations). The methodology has already been successfully used in a number of research studies.
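
    The approach (model defined as data, naming states, parameters, and rate expressions, with a generic routine turning it into a simulatable system) can be sketched in plain Python as an analogue of the paper's Excel-plus-Matlab/Simulink workflow; the Monod-style example model below is invented.

```python
# the model is data, not code: this dict plays the role of the Excel sheet
MODEL = {
    "states": {"S": 10.0, "X": 0.5},             # substrate, biomass
    "params": {"mu_max": 0.3, "Ks": 2.0, "Y": 0.4},
    "rates": {                                   # d(state)/dt by name
        "S": "-(mu_max * S / (Ks + S)) * X / Y",
        "X": "(mu_max * S / (Ks + S)) * X",
    },
}

def make_rhs(model):
    """Build the ODE right-hand side generically from the declarative spec."""
    names = list(model["states"])
    def rhs(y):
        env = dict(zip(names, y))
        env.update(model["params"])
        return [eval(model["rates"][n], {}, env) for n in names]
    return rhs

def euler(model, dt=0.01, steps=1000):
    """Fixed-step forward-Euler simulation of the declared model."""
    y = [model["states"][n] for n in model["states"]]
    rhs = make_rhs(model)
    for _ in range(steps):
        y = [yi + dt * di for yi, di in zip(y, rhs(y))]
    return dict(zip(model["states"], y))
```

    Adding a state or parameter means editing the data structure, not the solver, which is the accessibility property the paper aims for.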

  2. An integrated Modelling framework to monitor and predict trends of agricultural management (iMSoil)

    NASA Astrophysics Data System (ADS)

    Keller, Armin; Della Peruta, Raneiro; Schaepman, Michael; Gomez, Marta; Mann, Stefan; Schulin, Rainer

    2014-05-01

    Agricultural systems lay at the interface between natural ecosystems and the anthroposphere. Various drivers induce pressures on the agricultural systems, leading to changes in farming practice. The limitation of available land and the socio-economic drivers are likely to result in further intensification of agricultural land management, with implications on fertilization practices, soil and pest management, as well as crop and livestock production. In order to steer the development into desired directions, tools are required by which the effects of these pressures on agricultural management and resulting impacts on soil functioning can be detected as early as possible, future scenarios predicted and suitable management options and policies defined. In this context, the use of integrated models can play a major role in providing long-term predictions of soil quality and assessing the sustainability of agricultural soil management. Significant progress has been made in this field over the last decades. Some of these integrated modelling frameworks include biophysical parameters, but often the inherent characteristics and detailed processes of the soil system have been very simplified. The development of such tools has been hampered in the past by a lack of spatially explicit soil and land management information at regional scale. The iMSoil project, funded by the Swiss National Science Foundation in the national research programme NRP68 "soil as a resource" (www.nrp68.ch) aims at developing and implementing an integrated modeling framework (IMF) which can overcome the limitations mentioned above, by combining socio-economic, agricultural land management, and biophysical models, in order to predict the long-term impacts of different socio-economic scenarios on the soil quality. In our presentation we briefly outline the approach that is based on an interdisciplinary modular framework that builds on already existing monitoring tools and model components that are

  3. Development of a comprehensive air quality modeling framework for a coastal urban airshed in south Texas

    NASA Astrophysics Data System (ADS)

    Farooqui, Mohmmed Zuber

    Tropospheric ozone is one of the major air pollution problems affecting urban areas of the United States as well as other countries in the world. Analysis of surface observed ozone levels in south and central Texas revealed several days exceeding the 8-hour average ozone National Ambient Air Quality Standards (NAAQS) over the past decade. Two major high ozone episodes were identified during September of 1999 and 2002. A photochemical modeling framework for the high ozone episodes in 1999 and 2002 was developed for the Corpus Christi urban airshed. The photochemical model was evaluated as per U.S. Environmental Protection Agency (EPA) recommended statistical methods and performed within the limits set by the EPA. An emission impact assessment of various sources within the urban airshed was conducted using the modeling framework. It was noted that by nudging MM5 with surface observed meteorological parameters and sea-surface temperature, the coastal meteorological predictions improved. Consequently, refined meteorology helped the photochemical model to better predict peak ozone levels in urban airsheds along the coastal margins of Texas including in Corpus Christi. The emissions assessment analysis revealed that the Austin and San Antonio areas were significantly affected by on-road mobile emissions from light-duty gasoline and heavy-duty diesel vehicles. The urban areas of San Antonio, Austin, and Victoria were estimated to be NOx sensitive. Victoria was heavily influenced by point sources in the region while Corpus Christi was influenced by both point and non-road mobile sources and was identified to be sensitive to VOC emissions. A rise in atmospheric temperature due to climate change would potentially increase ozone exceedances and peak ozone levels within the study region, a major concern for air quality planners.
This study noted that any future increase in ambient temperature would result in a significant increase in the urban and regional
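
    The EPA-recommended statistics mentioned above typically include mean normalized bias and mean normalized gross error over prediction-observation pairs above a concentration cutoff; the sketch below assumes those standard definitions, with an illustrative cutoff value.

```python
import numpy as np

def mnb_mnge(pred, obs, cutoff=40.0):
    """Mean normalized bias and mean normalized gross error; pairs with
    observations below the cutoff (in ppb) are excluded, as is common
    practice for hourly ozone evaluation."""
    m = obs >= cutoff
    rel = (pred[m] - obs[m]) / obs[m]
    return rel.mean(), np.abs(rel).mean()
```

    A model that systematically over-predicts gives a positive bias, under-prediction a negative one, while the gross error penalizes both directions; acceptance limits on these two numbers are what "performed within the limits set by the EPA" refers to.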

  4. The TimeGeo modeling framework for urban motility without travel surveys.

    PubMed

    Jiang, Shan; Yang, Yingxiang; Gupta, Siddharth; Veneziano, Daniele; Athavale, Shounak; González, Marta C

    2016-09-13

    Well-established fine-scale urban mobility models today depend on detailed but cumbersome and expensive travel surveys for their calibration. Not much is known, however, about the set of mechanisms needed to generate complete mobility profiles if only using passive datasets with mostly sparse traces of individuals. In this study, we present a mechanistic modeling framework (TimeGeo) that effectively generates urban mobility patterns with resolution of 10 min and hundreds of meters. It ties together the inference of home and work activity locations from data, with the modeling of flexible activities (e.g., other) in space and time. The temporal choices are captured by only three features: the weekly home-based tour number, the dwell rate, and the burst rate. These combined generate for each individual: (i) stay duration of activities, (ii) number of visited locations per day, and (iii) daily mobility networks. These parameters capture how an individual deviates from the circadian rhythm of the population, and generate the wide spectrum of empirically observed mobility behaviors. The spatial choices of visited locations are modeled by a rank-based exploration and preferential return (r-EPR) mechanism that incorporates space in the EPR model. Finally, we show that a hierarchical multiplicative cascade method can measure the interaction between land use and generation of trips. In this way, urban structure is directly related to the observed distance of travels. This framework allows us to fully embrace the massive amount of individual data generated by information and communication technologies (ICTs) worldwide to comprehensively model urban mobility without travel surveys.
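
    A stripped-down sketch of the exploration-and-preferential-return mechanism described above (without the rank-based spatial placement of new locations, which is noted in a comment); the rho and gamma values are illustrative, not TimeGeo's fitted parameters.

```python
import random

def epr_trajectory(steps, rho=0.6, gamma=0.2, seed=1):
    """With probability rho * S**-gamma (S = locations seen so far) the
    individual explores a new location; otherwise it returns to a known
    location with probability proportional to past visit counts. The
    rank-based (r-EPR) choice of where new locations lie is omitted."""
    rng = random.Random(seed)
    visits = {0: 1}                   # location id -> visit count (home first)
    traj = [0]
    next_id = 1
    for _ in range(steps):
        if rng.random() < rho * len(visits) ** (-gamma):
            loc, next_id = next_id, next_id + 1        # explore a new place
        else:
            locs = list(visits)
            loc = rng.choices(locs, weights=[visits[l] for l in locs])[0]
        visits[loc] = visits.get(loc, 0) + 1
        traj.append(loc)
    return traj, visits
```

    Exploration slows as the location set grows, so the number of distinct places increases sublinearly while a few locations (home, work) accumulate most visits, matching the empirical regularities the framework builds on.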

  6. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  7. Hierarchical mark-recapture models: a framework for inference about demographic processes

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2004-01-01

    The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities n (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly
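
    The hierarchical idea, yearly survival rates treated as draws governed by hyperparameters rather than as free fixed parameters, can be sketched with a logit-normal simulation and a moment-based shrinkage estimator; this is an empirical-Bayes stand-in for the paper's fully Bayesian analysis, and all numbers are invented.

```python
import numpy as np

def simulate_survival(n_years=20, n_marked=200, mu=1.0, sigma=0.5, seed=3):
    """Yearly survival probabilities are logit-normal draws governed by the
    hyperparameters (mu, sigma); data are binomial survivor counts."""
    rng = np.random.default_rng(seed)
    logits = rng.normal(mu, sigma, n_years)
    p = 1.0 / (1.0 + np.exp(-logits))
    return p, rng.binomial(n_marked, p)

def shrink(survivors, n_marked):
    """Shrink each raw yearly rate toward the overall mean, weighting by a
    moment estimate of between-year (process) variance: the 'signal plus
    noise' decomposition of the abstract."""
    raw = survivors / n_marked
    grand = raw.mean()
    within = (raw * (1 - raw) / n_marked).mean()   # binomial sampling noise
    between = max(raw.var() - within, 1e-9)        # process variance (signal)
    w = between / (between + within)
    return grand + w * (raw - grand)
```

    Because the shrinkage weight lies between zero and one, every yearly estimate moves toward the grand mean by an amount reflecting how much of the raw spread is sampling noise rather than real between-year variation.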

  8. Hierarchical mark-recapture models: a framework for inference about demographic processes

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2004-01-01

    The development of sophisticated mark-recapture models over the last four decades has provided fundamental tools for the study of wildlife populations, allowing reliable inference about population sizes and demographic rates based on clearly formulated models for the sampling processes. Mark-recapture models are now routinely described by large numbers of parameters. These large models provide the next challenge to wildlife modelers: the extraction of signal from noise in large collections of parameters. Pattern among parameters can be described by strong, deterministic relations (as in ultrastructural models) but is more flexibly and credibly modeled using weaker, stochastic relations. Trend in survival rates is not likely to be manifest by a sequence of values falling precisely on a given parametric curve; rather, if we could somehow know the true values, we might anticipate a regression relation between parameters and explanatory variables, in which true value equals signal plus noise. Hierarchical models provide a useful framework for inference about collections of related parameters. Instead of regarding parameters as fixed but unknown quantities, we regard them as realizations of stochastic processes governed by hyperparameters. Inference about demographic processes is based on investigation of these hyperparameters. We advocate the Bayesian paradigm as a natural, mathematically and scientifically sound basis for inference about hierarchical models. We describe analysis of capture-recapture data from an open population based on hierarchical extensions of the Cormack-Jolly-Seber model. In addition to recaptures of marked animals, we model first captures of animals and losses on capture, and are thus able to estimate survival probabilities φ (i.e., the complement of death or permanent emigration) and per capita growth rates f (i.e., the sum of recruitment and immigration rates). Covariation in these rates, a feature of demographic interest, is explicitly
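The Cormack-Jolly-Seber core underlying the hierarchical models described above can be sketched as follows. This is an illustrative Python sketch, not the authors' code: it gives the log-likelihood of a single capture history under constant survival φ and detection p, whereas the hierarchical extension would treat these parameters as draws governed by hyperparameters.

```python
import numpy as np

def cjs_loglik(history, phi, p):
    """Log-likelihood of one capture history (1 = seen) under the
    Cormack-Jolly-Seber model with constant survival phi and detection p.
    Conditions on first capture; the 'never seen again' tail uses the
    standard chi recursion."""
    history = np.asarray(history)
    first = int(np.argmax(history))                          # first capture
    last = len(history) - 1 - int(np.argmax(history[::-1]))  # last capture
    loglik = 0.0
    # between first and last capture the animal is known to be alive
    for t in range(first + 1, last + 1):
        loglik += np.log(phi)                            # survived t-1 -> t
        loglik += np.log(p if history[t] else 1.0 - p)   # seen or missed
    # chi: probability of never being seen after the last capture
    chi = 1.0
    for _ in range(len(history) - 1 - last):
        chi = 1.0 - phi + phi * (1.0 - p) * chi
    return loglik + np.log(chi)
```

For example, the history [1, 0, 1] contributes φ(1-p)·φp, and [1, 0] contributes (1-φ) + φ(1-p).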

  9. Evaluation and improvement of the cloud resolving model component of the multi-scale modeling framework

    SciTech Connect

    Xu, Kuan-Man; Cheng, Anning

    2009-10-01

    An improved version of the Colorado State University (CSU) System for Atmospheric Modeling (SAM) cloud-resolving model (CRM) with the advanced third-order turbulence closure (IPHOC) was developed, implemented, and tested.

  10. Developing a medication communication framework across continuums of care using the Circle of Care Modeling approach

    PubMed Central

    2013-01-01

    Background Medication errors are a common type of preventable errors in health care causing unnecessary patient harm, hospitalization, and even fatality. Improving communication between providers and between providers and patients is a key aspect of decreasing medication errors and improving patient safety. Medication management requires extensive collaboration and communication across roles and care settings, which can reduce (or contribute to) medication-related errors. Medication management involves key recurrent activities (determine need, prescribe, dispense, administer, and monitor/evaluate) with information communicated within and between each. Despite its importance, there is a lack of conceptual models that explore medication communication specifically across roles and settings. This research seeks to address that gap. Methods The Circle of Care Modeling (CCM) approach was used to build a model of medication communication activities across the circle of care. CCM positions the patient in the centre of his or her own healthcare system; providers and other roles are then modeled around the patient as a web of relationships. Recurrent medication communication activities were mapped to the medication management framework. The research occurred in three iterations, to test and revise the model: Iteration 1 consisted of a literature review and internal team discussion, Iteration 2 consisted of interviews, observation, and a discussion group at a Community Health Centre, and Iteration 3 consisted of interviews and a discussion group in the larger community. Results Each iteration provided further detail to the Circle of Care medication communication model. Specific medication communication activities were mapped along each communication pathway between roles and to the medication management framework. 
We could not map all medication communication activities to the medication management framework; we added Coordinate as a separate and distinct recurrent activity.

  11. A statistical learning framework for groundwater nitrate models of the Central Valley, California, USA

    USGS Publications Warehouse

    Nolan, Bernard T.; Fienen, Michael N.; Lorenz, David L.

    2015-01-01

    We used a statistical learning framework to evaluate the ability of three machine-learning methods to predict nitrate concentration in shallow groundwater of the Central Valley, California: boosted regression trees (BRT), artificial neural networks (ANN), and Bayesian networks (BN). Machine learning methods can learn complex patterns in the data but because of overfitting may not generalize well to new data. The statistical learning framework involves cross-validation (CV) training and testing data and a separate hold-out data set for model evaluation, with the goal of optimizing predictive performance by controlling for model overfit. The order of prediction performance according to both CV testing R2 and that for the hold-out data set was BRT > BN > ANN. For each method we identified two models based on CV testing results: that with maximum testing R2 and a version with R2 within one standard error of the maximum (the 1SE model). The former yielded CV training R2 values of 0.94–1.0. Cross-validation testing R2 values indicate predictive performance, and these were 0.22–0.39 for the maximum R2 models and 0.19–0.36 for the 1SE models. Evaluation with hold-out data suggested that the 1SE BRT and ANN models predicted better for an independent data set compared with the maximum R2 versions, which is relevant to extrapolation by mapping. Scatterplots of predicted vs. observed hold-out data obtained for final models helped identify prediction bias, which was fairly pronounced for ANN and BN. Lastly, the models were compared with multiple linear regression (MLR) and a previous random forest regression (RFR) model. Whereas BRT results were comparable to RFR, MLR had low hold-out R2 (0.07) and explained less than half the variation in the training data. Spatial patterns of predictions by the final, 1SE BRT model agreed reasonably well with previously observed patterns of nitrate occurrence in groundwater of the Central Valley.

  12. A statistical learning framework for groundwater nitrate models of the Central Valley, California, USA

    NASA Astrophysics Data System (ADS)

    Nolan, Bernard T.; Fienen, Michael N.; Lorenz, David L.

    2015-12-01

    We used a statistical learning framework to evaluate the ability of three machine-learning methods to predict nitrate concentration in shallow groundwater of the Central Valley, California: boosted regression trees (BRT), artificial neural networks (ANN), and Bayesian networks (BN). Machine learning methods can learn complex patterns in the data but because of overfitting may not generalize well to new data. The statistical learning framework involves cross-validation (CV) training and testing data and a separate hold-out data set for model evaluation, with the goal of optimizing predictive performance by controlling for model overfit. The order of prediction performance according to both CV testing R2 and that for the hold-out data set was BRT > BN > ANN. For each method we identified two models based on CV testing results: that with maximum testing R2 and a version with R2 within one standard error of the maximum (the 1SE model). The former yielded CV training R2 values of 0.94-1.0. Cross-validation testing R2 values indicate predictive performance, and these were 0.22-0.39 for the maximum R2 models and 0.19-0.36 for the 1SE models. Evaluation with hold-out data suggested that the 1SE BRT and ANN models predicted better for an independent data set compared with the maximum R2 versions, which is relevant to extrapolation by mapping. Scatterplots of predicted vs. observed hold-out data obtained for final models helped identify prediction bias, which was fairly pronounced for ANN and BN. Lastly, the models were compared with multiple linear regression (MLR) and a previous random forest regression (RFR) model. Whereas BRT results were comparable to RFR, MLR had low hold-out R2 (0.07) and explained less than half the variation in the training data. Spatial patterns of predictions by the final, 1SE BRT model agreed reasonably well with previously observed patterns of nitrate occurrence in groundwater of the Central Valley.
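The one-standard-error ("1SE") model choice described above can be sketched as follows. This is a minimal Python sketch with hypothetical per-fold scores; using the lowest-scoring eligible model as a stand-in for "least complex" is an assumption of the sketch, not the paper's procedure.

```python
import numpy as np

def one_se_model(cv_scores):
    """Pick the '1SE' model from per-fold CV test R^2 scores.

    cv_scores: dict mapping model name -> list of per-fold R^2 values.
    Returns (best, one_se): the model with maximum mean R^2, and the
    lowest-scoring model whose mean R^2 still lies within one standard
    error of that maximum (a proxy for the simplest acceptable model)."""
    means = {m: np.mean(s) for m, s in cv_scores.items()}
    sems = {m: np.std(s, ddof=1) / np.sqrt(len(s)) for m, s in cv_scores.items()}
    best = max(means, key=means.get)
    threshold = means[best] - sems[best]          # one SE below the maximum
    eligible = [m for m in means if means[m] >= threshold]
    one_se = min(eligible, key=means.get)
    return best, one_se
```

The 1SE choice trades a little in-sample fit for a model that is less likely to be overfit, which matches the hold-out behaviour reported above.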

  13. An open and extensible framework for spatially explicit land use change modelling: the lulcc R package

    NASA Astrophysics Data System (ADS)

    Moulds, S.; Buytaert, W.; Mijic, A.

    2015-10-01

    We present the lulcc software package, an object-oriented framework for land use change modelling written in the R programming language. The contribution of the work is to resolve the following limitations associated with the current land use change modelling paradigm: (1) the source code for model implementations is frequently unavailable, severely compromising the reproducibility of scientific results and making it impossible for members of the community to improve or adapt models for their own purposes; (2) ensemble experiments to capture model structural uncertainty are difficult because of fundamental differences between implementations of alternative models; and (3) additional software is required because existing applications frequently perform only the spatial allocation of change. The package includes a stochastic ordered allocation procedure as well as an implementation of the CLUE-S algorithm. We demonstrate its functionality by simulating land use change at the Plum Island Ecosystems site, using a data set included with the package. It is envisaged that lulcc will enable future model development and comparison within an open environment.
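lulcc itself is an R package; purely to illustrate the idea behind a stochastic ordered allocation procedure (cells visited in a random order, each taking its most suitable class with unmet demand), a hypothetical Python sketch might look like:

```python
import numpy as np

def ordered_allocation(suitability, demand, rng=None):
    """Toy stochastic ordered allocation of land-use classes to cells.

    suitability: (n_cells, n_classes) scores; demand: cells required per
    class. Cells are visited in a random order and each takes its most
    suitable class that still has unmet demand."""
    rng = rng or np.random.default_rng(0)
    n_cells, _ = suitability.shape
    remaining = np.array(demand, dtype=int).copy()
    out = np.full(n_cells, -1)
    for i in rng.permutation(n_cells):         # stochastic visiting order
        for k in np.argsort(-suitability[i]):  # best class first
            if remaining[k] > 0:
                out[i] = k
                remaining[k] -= 1
                break
    return out
```

When total demand equals the number of cells, every cell is assigned and the class totals match the demand exactly, whatever visiting order is drawn.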

  14. Combined deterministic-stochastic framework for modeling the agglomeration of colloidal particles

    NASA Astrophysics Data System (ADS)

    Mortuza, S. M.; Kariyawasam, Lahiru K.; Banerjee, Soumik

    2015-07-01

    We present a multiscale model, based on molecular dynamics (MD) and kinetic Monte Carlo (kMC), to study the aggregation-driven growth of colloidal particles. Coarse-grained molecular dynamics (CGMD) simulations are employed to detect key agglomeration events and calculate the corresponding rate constants. The kMC simulations employ these rate constants in a stochastic framework to track the growth of the agglomerates over longer time scales and length scales. One of the hallmarks of the model is a unique methodology to detect and characterize agglomeration events. The model accounts for individual cluster-scale effects such as change in size due to aggregation as well as local molecular-scale effects such as changes in the number of neighbors of each molecule in a colloidal cluster. Such definition of agglomeration events allows us to grow the cluster to sizes that are inaccessible to molecular simulations as well as track the shape of the growing cluster. A well-studied system, comprising fullerenes in NaCl electrolyte solution, was simulated to validate the model. Under the simulated conditions, the agglomeration process evolves from a diffusion-limited cluster aggregation (DLCA) regime, through a transitional percolating-cluster stage, and finally to a gelation regime. Overall, the data from the multiscale numerical model show good agreement with existing theory of colloidal particle growth. Although in the present study we validated our model by specifically simulating fullerene agglomeration in electrolyte solution, the model is versatile and can be applied to a wide range of colloidal systems.
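The stochastic half of such a framework rests on Gillespie-type event selection: pick the next agglomeration event with probability proportional to its rate constant and advance time by an exponential waiting time. A minimal sketch with hypothetical rate constants:

```python
import math
import random

def kmc_step(rates, rng=None):
    """One kinetic Monte Carlo (Gillespie) step.

    rates: rate constants for the currently possible events (e.g.
    aggregation of particular cluster pairs). Returns the index of the
    chosen event and the elapsed waiting time."""
    rng = rng or random.Random(0)
    total = sum(rates)
    r = rng.random() * total
    cum = 0.0
    for event, k in enumerate(rates):
        cum += k
        if r < cum:
            break
    dt = -math.log(1.0 - rng.random()) / total  # Exp(total) waiting time
    return event, dt
```

In a full simulation the chosen event updates the cluster population, the rate list is refreshed (here, from the CGMD-derived constants), and the loop repeats.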

  15. Takagi-Sugeno fuzzy models in the framework of orthonormal basis functions.

    PubMed

    Machado, Jeremias B; Campello, Ricardo J G B; Amaral, Wagner Caradori

    2013-06-01

    An approach to obtain Takagi-Sugeno (TS) fuzzy models of nonlinear dynamic systems using the framework of orthonormal basis functions (OBFs) is presented in this paper. This approach is based on an architecture in which local linear models with ladder-structured generalized OBFs (GOBFs) constitute the fuzzy rule consequents and the outputs of the corresponding GOBF filters are input variables for the rule antecedents. The resulting GOBF-TS model is characterized by having only real-valued parameters that do not depend on any user specification about particular types of functions to be used in the orthonormal basis. The fuzzy rules of the model are initially obtained by means of a well-known technique based on fuzzy clustering and least squares. Those rules are then simplified, and the model parameters (GOBF poles, GOBF expansion coefficients, and fuzzy membership functions) are subsequently adjusted by using a nonlinear optimization algorithm. The exact gradients of an error functional with respect to the parameters to be optimized are computed analytically. Those gradients provide exact search directions for the optimization process, which relies solely on input-output data measured from the system to be modeled. An example is presented to illustrate the performance of this approach in the modeling of a complex nonlinear dynamic system. PMID:23096073

  16. The Unified Plant Growth Model (UPGM): software framework overview and model application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Since the Environmental Policy Integrated Climate (EPIC) model was developed in 1989, the EPIC plant growth component has been incorporated into other erosion and crop management models (e.g., WEPS, WEPP, SWAT, ALMANAC, and APEX) and modified to meet model developer research objectives. This has re...

  17. Fish dispersal in fragmented landscapes: a modeling framework for quantifying the permeability of structural barriers.

    PubMed

    Pépino, Marc; Rodríguez, Marco A; Magnan, Pierre

    2012-07-01

    Dispersal is a key determinant of the spatial distribution and abundance of populations, but human-made fragmentation can create barriers that hinder dispersal and reduce population viability. This study presents a modeling framework based on dispersal kernels (modified Laplace distributions) that describe stream fish dispersal in the presence of obstacles to passage. We used mark-recapture trials to quantify summer dispersal of brook trout (Salvelinus fontinalis) in four streams crossed by a highway. The analysis identified population heterogeneity in dispersal behavior, as revealed by the presence of a dominant sedentary component (48-72% of all individuals) characterized by short mean dispersal distance (<10 m), and a secondary mobile component characterized by longer mean dispersal distance (56-1086 m). We did not detect evidence of barrier effects on dispersal through highway crossings. Simulation of various plausible scenarios indicated that detectability of barrier effects was strongly dependent on features of sampling design, such as spatial configuration of the sampling area, barrier extent, and sample size. The proposed modeling framework extends conventional dispersal kernels by incorporating structural barriers. A major strength of the approach is that ecological process (dispersal model) and sampling design (observation model) are incorporated simultaneously into the analysis. This feature can facilitate the use of prior knowledge to improve sampling efficiency of mark-recapture trials in movement studies. Model-based estimation of barrier permeability and its associated uncertainty provides a rigorous approach for quantifying the effect of barriers on stream fish dispersal and assessing population dynamics of stream fish in fragmented landscapes.
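The two-component kernel structure reported above (a dominant sedentary group with short mean distance, plus a mobile group) can be sketched as a mixture of Laplace densities. The parameters below are illustrative, not the fitted values, and the sketch omits the barrier-permeability modification that is the paper's main extension.

```python
import numpy as np

def dispersal_kernel(x, p_sedentary, scale_s, scale_m):
    """Two-component dispersal kernel: mixture of Laplace densities with
    a sedentary component (short mean distance) and a mobile component.
    For a Laplace density with scale b, the mean absolute distance is b."""
    laplace = lambda d, b: np.exp(-np.abs(d) / b) / (2.0 * b)
    return (p_sedentary * laplace(x, scale_s)
            + (1.0 - p_sedentary) * laplace(x, scale_m))
```

At x = 0 the density is p/(2b_s) + (1-p)/(2b_m), so the sedentary component dominates the peak while the mobile component supplies the long tails.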

  18. Ecological conceptual models: a framework and case study on ecosystem management for South Florida sustainability

    USGS Publications Warehouse

    Gentile, J.H.; Harwell, M.A.; Cropper, W.; Harwell, C. C.; DeAngelis, Donald L.; Davis, S.; Ogden, J.C.; Lirman, D.

    2001-01-01

    The Everglades and South Florida ecosystems are the focus of national and international attention because of their current degraded and threatened state. Ecological risk assessment, sustainability and ecosystem and adaptive management principles and processes are being used nationally as a decision and policy framework for a variety of types of ecological assessments. The intent of this study is to demonstrate the application of these paradigms and principles at a regional scale. The effects-directed assessment approach used in this study consists of a retrospective, eco-epidemiological phase to determine the causes for the current conditions and a prospective predictive risk-based assessment using scenario analysis to evaluate future options. Embedded in these assessment phases is a process that begins with the identification of goals and societal preferences which are used to develop an integrated suite of risk-based and policy relevant conceptual models. Conceptual models are used to illustrate the linkages among management (societal) actions, environmental stressors, and societal/ecological effects, and provide the basis for developing and testing causal hypotheses. These models, developed for a variety of landscape units and their drivers, stressors, and endpoints, are used to formulate hypotheses to explain the current conditions. They are also used as the basis for structuring management scenarios and analyses to project the temporal and spatial magnitude of risk reduction and system recovery. Within the context of recovery, the conceptual models are used in the initial development of performance criteria for those stressors that are determined to be most important in shaping the landscape, and to guide the use of numerical models used to develop quantitative performance criteria in the scenario analysis. The results will be discussed within an ecosystem and adaptive management framework that provides the foundation for decision making.

  19. Selective 4D modelling framework for spatial-temporal land information management system

    NASA Astrophysics Data System (ADS)

    Doulamis, Anastasios; Soile, Sofia; Doulamis, Nikolaos; Chrisouli, Christina; Grammalidis, Nikos; Dimitropoulos, Kosmas; Manesis, Charalambos; Potsiou, Chryssy; Ioannidis, Charalabos

    2015-06-01

    This paper introduces a predictive (selective) 4D modelling framework in which only the spatial 3D differences are modelled at forthcoming time instances, while regions with no significant spatial-temporal alterations remain intact. To accomplish this, spatial-temporal analysis is first applied between 3D digital models captured at different time instances, yielding dynamic change history maps. Change history maps indicate the spatial probability that a region will need further 3D modelling at forthcoming instances. They thus support predictive assessment, that is, localizing surfaces within the objects where a high-accuracy reconstruction process needs to be activated at forthcoming time instances. The proposed 4D Land Information Management System (LIMS) is implemented using open interoperable standards based on the CityGML framework. CityGML allows the description of semantic metadata and of the rights attached to land resources. Visualization is also supported to allow easy manipulation, interaction and representation of the 4D LIMS digital parcels and the respective semantic information. The open-source 3DCityDB, incorporating a PostgreSQL geo-database, is used to manage and manipulate the 3D data and their semantics. The approach is applied to detect change through time in a 3D block of plots in an urban area of Athens, Greece. Starting with an accurate 3D model of the buildings in 1983, a change history map is created using automated dense image matching on aerial photos from 2010. Meshes are created for both time instances, and the changes are detected through their comparison.
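At its simplest, a change history map reduces to flagging cells whose surface changed beyond a tolerance between epochs. A toy sketch with hypothetical height grids and threshold (the paper's maps are probabilistic, which this does not capture):

```python
import numpy as np

def change_history_map(dsm_t0, dsm_t1, threshold=1.0):
    """Toy change map between two height grids (e.g. 1983 vs 2010
    surface models): cells whose elevation changed by more than
    `threshold` metres are flagged as candidates for fresh
    high-accuracy 3D modelling."""
    return np.abs(np.asarray(dsm_t1) - np.asarray(dsm_t0)) > threshold
```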

  20. A Framework for the Abstraction of Mesoscale Modeling for Weather Simulation

    NASA Astrophysics Data System (ADS)

    Limpasuvan, V.; Ujcich, B. E.

    2009-12-01

    Widely disseminated weather forecast results (e.g. from various national centers and private companies) are useful for typical users in gauging future atmospheric disturbances. However, these canonical forecasts may not adequately meet the needs of end-users in the various scientific fields since a predetermined model, as structured by the model administrator, produces these forecasts. To perform his/her own successful forecasts, a user faces a steep learning curve involving the collection of initial condition data (e.g. radar, satellite, and reanalyses) and operation of a suitable model (and associated software/computing). In this project, we develop an intermediate (prototypical) software framework and a web-based front-end interface that allow for the abstraction of an advanced weather model upon which the end-user can perform customizable forecasts and analyses. Having such an accessible front-end interface for a weather model can benefit educational programs at the secondary school and undergraduate level, scientific research in fields like fluid dynamics and meteorology, and the general public. In all cases, our project allows the user to generate a localized domain of choice, run the desired forecast on a remote high-performance computer cluster, and visually see the results. For instance, an undergraduate science curriculum could incorporate the resulting weather forecast performed under this project in laboratory exercises. Scientific researchers and graduate students would be able to readily adjust key prognostic variables in the simulation within this project’s framework. The general public within the contiguous United States could also run a simplified version of the project’s software with adjustments in forecast clarity (spatial resolution) and region size (domain). Special cases of general interest, in which a detailed forecast may be required, would be over areas of possible strong weather activity.

  1. A translational research framework for enhanced validity of mouse models of psychopathological states in depression.

    PubMed

    Pryce, Christopher R; Seifritz, Erich

    2011-04-01

    Depression presents as a disorder of feelings and thoughts that debilitate daily functioning and can be life threatening. Increased understanding of these specific emotional-cognitive pathological states and their underlying pathophysiologies and neuropathologies is fundamental to an increased understanding of the disorder and, therefore, to development of much-needed improved therapies. Despite this, there is a current lack of emphasis on development and application of translational (i.e. valid) neuropsychological measures in depression research. The appropriate strategy is neuropsychological research translated, bi-directionally, between epidemiological and clinical human research and in vivo - ex vivo preclinical research conducted, primarily, with mice. This paper presents a translational framework to stimulate and inform such research, in four inter-dependent sections. (1) A depression systems-model describes the pathway between human environment-gene (E-G) epidemiology, pathophysiology, psycho- and neuropathology, symptoms, and diagnosis. This model indicates that G→emotional-cognitive endophenotypes and E-G/endophenotype→emotional-cognitive state markers are central to experimental and translational depression research. (2) Human neuropsychological tests with (potential) translational value for the quantitative study of these endophenotypes and state markers are presented. (3) The analogous rodent behavioural tests are presented and their translational validity in terms of providing analogue emotional-cognitive endophenotypes and state markers are discussed. (4) The need for aetiological validity of mouse models in terms of G→endophenotypes and E-G→state markers is presented. We conclude that the informed application of the proposed neuropsychological translational framework will yield mouse models of high face, construct and aetiological validity with respect to emotional-cognitive dysfunction in depression. These models, together with the available

  2. Integrative Analysis of Metabolomics and Transcriptomics Data: A Unified Model Framework to Identify Underlying System Pathways

    PubMed Central

    Brink-Jensen, Kasper; Bak, Søren; Jørgensen, Kirsten; Ekstrøm, Claus Thorn

    2013-01-01

    The abundance of high-dimensional measurements in the form of gene expression and mass spectroscopy calls for models to elucidate the underlying biological system. For widely studied organisms like yeast, it is possible to incorporate prior knowledge from a variety of databases, an approach used in several recent studies. However if such information is not available for a particular organism these methods fall short. In this paper we propose a statistical method that is applicable to a dataset consisting of Liquid Chromatography-Mass Spectroscopy (LC-MS) and gene expression (DNA microarray) measurements from the same samples, to identify genes controlling the production of metabolites. Due to the high dimensionality of both LC-MS and DNA microarray data, dimension reduction and variable selection are key elements of the analysis. Our proposed approach starts by identifying the basis functions (“building blocks”) that constitute the output from a mass spectrometry experiment. Subsequently, the weights of these basis functions are related to the observations from the corresponding gene expression data in order to identify which genes are associated with specific patterns seen in the metabolite data. The modeling framework is extremely flexible as well as computationally fast and can accommodate treatment effects and other variables related to the experimental design. We demonstrate that within the proposed framework, genes regulating the production of specific metabolites can be identified correctly unless the variation in the noise is more than twice that of the signal. PMID:24086255
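The two-stage idea (express each sample's LC-MS output as weights on shared basis functions, then relate those weights to gene expression) can be sketched with synthetic matrices. The function name, dimensions, and data below are illustrative; the paper's basis-identification and variable-selection steps are replaced here by plain least squares.

```python
import numpy as np

def metabolite_gene_association(spectra, basis, expression):
    """Two-stage sketch: (1) least-squares weights of each sample's
    spectrum on shared basis functions ('building blocks'); (2) regress
    those weights on gene expression. Returns an (n_genes, n_basis)
    coefficient matrix whose large entries suggest gene-metabolite links.

    spectra: (n_samples, n_mz); basis: (n_basis, n_mz);
    expression: (n_samples, n_genes)."""
    # stage 1: spectra ~= weights @ basis
    weights, *_ = np.linalg.lstsq(basis.T, spectra.T, rcond=None)
    weights = weights.T                        # (n_samples, n_basis)
    # stage 2: weights ~= expression @ coef
    coef, *_ = np.linalg.lstsq(expression, weights, rcond=None)
    return coef                                # (n_genes, n_basis)
```

When the spectra lie exactly in the span of the basis and the expression matrix has full column rank, the true coefficient matrix is recovered exactly.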

  3. A New Framework to Compare Mass-Flux Schemes Within the AROME Numerical Weather Prediction Model

    NASA Astrophysics Data System (ADS)

    Riette, Sébastien; Lac, Christine

    2016-08-01

    In the Application of Research to Operations at Mesoscale (AROME) numerical weather forecast model used in operations at Météo-France, five mass-flux schemes are available to parametrize shallow convection at kilometre resolution. All but one are based on the eddy-diffusivity-mass-flux approach, and differ in entrainment/detrainment, the updraft vertical velocity equation and the closure assumption. The fifth is based on a more classical mass-flux approach. Screen-level scores obtained with these schemes show few discrepancies and are not sufficient to highlight behaviour differences. Here, we describe and use a new experimental framework, able to compare and discriminate among different schemes. For a year, daily forecast experiments were conducted over small domains centred on the five French metropolitan radio-sounding locations. Cloud base, planetary boundary-layer height and normalized vertical profiles of specific humidity, potential temperature, wind speed and cloud condensate were compared with observations, and with each other. The framework allowed the behaviour of the different schemes in and above the boundary layer to be characterized. In particular, the impact of the entrainment/detrainment formulation, closure assumption and cloud scheme were clearly visible. Differences mainly concerned the transport intensity thus allowing schemes to be separated into two groups, with stronger or weaker updrafts. In the AROME model (with all interactions and the possible existence of compensating errors), evaluation diagnostics gave the advantage to the first group.

  4. FACET: A simulation software framework for modeling complex societal processes and interactions

    SciTech Connect

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  5. A new hybrid framework to efficiently model lines of sight to gravitational lenses

    NASA Astrophysics Data System (ADS)

    McCully, Curtis; Keeton, Charles R.; Wong, Kenneth C.; Zabludoff, Ann I.

    2014-10-01

    In strong gravitational lens systems, the light bending is usually dominated by one main galaxy, but may be affected by other mass along the line of sight (LOS). Shear and convergence can be used to approximate the contributions from less significant perturbers (e.g. those that are projected far from the lens or have a small mass), but higher order effects need to be included for objects that are closer or more massive. We develop a framework for multiplane lensing that can handle an arbitrary combination of tidal planes treated with shear and convergence and planes treated exactly (i.e. including higher order terms). This framework addresses all of the traditional lensing observables including image positions, fluxes, and time delays to facilitate lens modelling that includes the non-linear effects due to mass along the LOS. It balances accuracy (accounting for higher order terms when necessary) with efficiency (compressing all other LOS effects into a set of matrices that can be calculated up front and cached for lens modelling). We identify a generalized multiplane mass sheet degeneracy, in which the effective shear and convergence are sums over the lensing planes with specific, redshift-dependent weighting factors.
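The shear-plus-convergence treatment of a weak perturber amounts to a linear deflection map, which a tidal plane applies in place of a full deflection calculation. A minimal illustrative sketch (not the paper's multiplane machinery):

```python
import numpy as np

def tidal_deflection(x, kappa, gamma1, gamma2):
    """Deflection at position x = (x1, x2) from a plane treated only
    through convergence kappa and shear (gamma1, gamma2):
    alpha = Gamma @ x, the linear (tidal) approximation used for
    perturbers projected far from the main lens."""
    gamma_matrix = np.array([[kappa + gamma1, gamma2],
                             [gamma2, kappa - gamma1]])
    return gamma_matrix @ np.asarray(x, dtype=float)
```

The hybrid framework composes such tidal planes with exactly-treated planes; here only the tidal piece is shown.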

  6. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  7. A robust nonparametric framework for reconstruction of stochastic differential equation models

    NASA Astrophysics Data System (ADS)

    Rajabzadeh, Yalda; Rezaie, Amir Hossein; Amindavar, Hamidreza

    2016-05-01

    In this paper, we employ a nonparametric framework to robustly estimate the functional forms of drift and diffusion terms from discrete stationary time series. The proposed method significantly improves the accuracy of the parameter estimation. In this framework, drift and diffusion coefficients are modeled through orthogonal Legendre polynomials. We employ the least squares regression approach along with the Euler-Maruyama approximation method to learn the coefficients of the stochastic model. Next, a numerical discrete construction of the mean squared prediction error (MSPE) is established to select the order of the Legendre polynomials in the drift and diffusion terms. We show numerically that the new method is robust against variation in sample size and sampling rate. The performance of our method in comparison with the kernel-based regression (KBR) method is demonstrated through simulation and real data. For the real dataset, we test our method by discriminating healthy electroencephalogram (EEG) signals from epileptic ones. We also demonstrate the efficiency of the method through prediction on financial data. In both simulation and real data, our algorithm outperforms the KBR method.
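The estimation pipeline described above can be sketched end to end: simulate a process by Euler-Maruyama, form pointwise drift and squared-diffusion observations from the increments, and least-squares fit each in a Legendre basis. This is a hedged illustration of the general technique, not the authors' algorithm; the Ornstein-Uhlenbeck test process, the basis order, and all parameter values are assumptions:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck process via Euler-Maruyama:
# dX = -theta*X dt + sigma dW, so the true drift is -x and diffusion^2 is sigma^2.
dt, n = 1e-3, 200_000
theta, sigma = 1.0, 0.5
x = np.empty(n)
x[0] = 0.0
for i in range(n - 1):
    x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Per-step increments give noisy pointwise estimates of the drift and diffusion:
# E[dx | x]/dt -> drift(x) and E[dx^2 | x]/dt -> diffusion(x)^2 as dt -> 0.
dx = np.diff(x)
drift_obs = dx / dt
diff2_obs = dx**2 / dt

# Least-squares fit of both functions in a Legendre basis on the rescaled state.
xs = x[:-1]
u = 2.0 * (xs - xs.min()) / (xs.max() - xs.min()) - 1.0   # map state to [-1, 1]
order = 3
V = legendre.legvander(u, order)          # design matrix of Legendre polynomials
c_drift, *_ = np.linalg.lstsq(V, drift_obs, rcond=None)
c_diff2, *_ = np.linalg.lstsq(V, diff2_obs, rcond=None)

# Evaluate the fitted drift at x = 0.2; the true value is -theta*0.2 = -0.2.
u0 = 2.0 * (0.2 - xs.min()) / (xs.max() - xs.min()) - 1.0
drift_hat = legendre.legval(u0, c_drift)
diff2_hat = legendre.legval(u0, c_diff2)
print(drift_hat, diff2_hat)
```

The abstract's MSPE criterion would then be used to pick `order` by minimizing out-of-sample prediction error rather than fixing it by hand as done here.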

  8. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink(R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
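The abstract attributes the distributed controller's tracking difference entirely to quantization in the feedback measurements. The toy loop below illustrates that effect (it is not the C-MAPSS40k model): a PI controller regulates a first-order plant, once with ideal feedback and once through a uniformly quantized sensor, standing in for the ADC in a smart transducer. The plant, gains, and 0.05-unit resolution are all assumptions:

```python
import numpy as np

def quantize(x, lsb):
    """Mid-tread uniform quantizer, modeling the ADC in a smart sensor."""
    return lsb * np.round(x / lsb)

def run_loop(setpoint=1.0, lsb=0.0, kp=0.8, ki=0.5, dt=0.01, steps=2000):
    """First-order plant dy/dt = -y + u under PI control.
    When lsb > 0, the controller sees only a quantized measurement of y."""
    y, integ = 0.0, 0.0
    out = np.empty(steps)
    for k in range(steps):
        meas = quantize(y, lsb) if lsb > 0 else y
        err = setpoint - meas
        integ += err * dt
        u = kp * err + ki * integ
        y += (-y + u) * dt          # forward-Euler plant update
        out[k] = y
    return out

ideal = run_loop(lsb=0.0)
quant = run_loop(lsb=0.05)          # hypothetical 0.05-unit sensor resolution
print(np.max(np.abs(ideal - quant)))
```

The two responses track the setpoint nearly identically; the residual difference is bounded on the order of the quantizer resolution, mirroring the paper's finding that only quantization in the feedback path separated the centralized and distributed controllers.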

  10. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    NASA Technical Reports Server (NTRS)

    Zinnecker, Alicia Mae; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (40,000 pound force thrust) (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink (R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback